Comment on ChatGPT In Trouble: OpenAI may go bankrupt by 2024, AI bot costs company $700,000 every day
Widowmaker_Best_Girl@lemmy.world 1 year ago
I have definitely been considering it. My current hardware gives me about an 80 second delay when I run an LLM locally.
chicken@lemmy.dbzer0.com 1 year ago
Same, at least for anything but the tiny ones that will fit in my limited VRAM. Hoping a GPU that's actually good for this will come out in the next few years, one that isn't $15k and made for servers.
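For anyone wondering what "fits in VRAM" means in practice, here's a rough back-of-the-envelope sketch. The numbers are illustrative assumptions, not measurements, and real usage also needs room for the KV cache, activations, and framework overhead:

```python
# Rough estimate of the VRAM needed just to hold an LLM's weights
# at a given quantization level. Illustrative only; actual usage
# also includes KV cache, activations, and runtime overhead.

def weights_vram_gb(n_params_billion: float, bits_per_weight: int) -> float:
    """Approximate GiB needed for the model weights alone."""
    bytes_total = n_params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

# Example: a hypothetical 7B-parameter model at common quant levels
for bits in (16, 8, 4):
    print(f"7B @ {bits}-bit: ~{weights_vram_gb(7, bits):.1f} GiB")
```

By this estimate a 7B model at 4-bit quantization squeezes into a consumer card, while larger models at full precision are what push people toward those $15k server GPUs.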