eu8@lemmy.world to Selfhosted@lemmy.world • Guide: Self-hosting open source GPT chat with no GPU using GPT4All (English)
1 year ago

Take my answer with a grain of salt, but I’m pretty sure that if you have a GPU you can just run the same models, and they should run more efficiently for you. The only real difference is that a GPU also lets you run some of the larger models.
Reddit’s database was pretty poorly designed. It was built to be very flexible so the team could make changes easily early on, but that flexibility made it highly inefficient. I don’t know if it’s still like that, but the source code for the old website is public, and you can see how inefficient it is.