You can’t pirate their models, and even if they leaked, running them would need an expensive machine.
There are lots of open source models. They can get close but are limited by your hardware.
If you want something close to GPT, there's the Falcon 40B model. You'll need more than 24 GB of VRAM, or heavy CPU offloading with something like 64–128 GB of system RAM.
With 24 GB of VRAM you can run a 30B model, and so on…
For reference, GPT-3 is around 175B parameters, so that's A100 NVLink territory.
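If anyone wants to actually try the CPU-offload route, here's roughly what it looks like with Hugging Face transformers + bitsandbytes. This is an untested sketch, not a recipe: the model ID, 4-bit settings, and offload folder are just what I'd reach for first, and you'd tune them to your hardware.

```python
# Rough sketch: load Falcon 40B in 4-bit and let accelerate spill
# whatever doesn't fit in VRAM over to CPU RAM (and disk as a last resort).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "tiiuae/falcon-40b-instruct"  # base tiiuae/falcon-40b also works

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # roughly quarter the memory of fp16
    bnb_4bit_compute_dtype=torch.bfloat16,  # compute in bf16 for speed/stability
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",         # fill the GPU first, offload the rest to CPU RAM
    offload_folder="offload",  # spill to disk if even CPU RAM runs out
)

prompt = "Explain why large language models need so much VRAM."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Expect it to be slow once layers start landing on the CPU; offloading trades speed for being able to run the thing at all.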