Comment on ChatGPT In Trouble: OpenAI may go bankrupt by 2024, AI bot costs company $700,000 every day
chicken@lemmy.dbzer0.com 1 year ago
Makes sense. In that case I guess your next best option is probably to buy or rent hardware to run the local models that are suitable for chat RP.
Widowmaker_Best_Girl@lemmy.world 1 year ago
I have definitely been considering it. My current hardware gives me about an 80-second delay when I run an LLM locally.
chicken@lemmy.dbzer0.com 1 year ago
Same, at least for anything but the tiny models that will fit in my limited VRAM. Hoping a GPU that's actually good for this, and isn't $15k and built for servers, comes out in the next few years.
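For anyone curious what "fitting in limited VRAM" looks like in practice, here's a minimal sketch using llama-cpp-python with a quantized GGUF model and partial GPU offload; the model path and layer count are placeholders you'd tune to your own card, not a recommendation.

```python
# Sketch: run a small quantized chat model locally, offloading only as many
# layers to the GPU as the VRAM can hold; the rest runs on CPU.
from llama_cpp import Llama

llm = Llama(
    model_path="models/tiny-chat-model.Q4_K_M.gguf",  # hypothetical quantized model file
    n_gpu_layers=20,  # offload a subset of layers; lower this if VRAM runs out
    n_ctx=2048,       # context window; larger values use more memory
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Stay in character and greet the party."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```

With a setup like this, generation speed depends mostly on how many layers actually fit on the GPU, which is why tiny models feel fine and anything bigger crawls.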