Comment on Your AI Girlfriend Is a Data-Harvesting Horror Show

DarkThoughts@kbin.social 4 months ago

I tried oobabooga and it basically always crashes when I try to generate anything, no matter which model I try. But honestly, as far as I can tell all the good models require absurd amounts of VRAM, far more than consumer cards have, so you'd need at least a small GPU server farm to host them locally yourself with any reliability. Unless, of course, you're fine with practically nonexistent context sizes.
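To put rough numbers on that, here's a quick back-of-envelope sketch in Python. The parameter count, layer count, and head dimensions are made-up illustrative values (loosely in the ballpark of a 70B-class model), not specs from any particular release, and the formulas ignore activation memory and framework overhead:

```python
# Rough back-of-envelope VRAM estimate for hosting a model locally.
# All model dimensions below are illustrative assumptions, not real specs.

def weights_gib(params_billions: float, bits_per_param: int) -> float:
    """Memory for the model weights alone."""
    return params_billions * 1e9 * bits_per_param / 8 / 2**30

def kv_cache_gib(layers: int, kv_heads: int, head_dim: int,
                 context_len: int, bytes_per_value: int = 2) -> float:
    """Memory for the key/value cache at a given context length (batch size 1)."""
    # factor of 2 covers both keys and values
    return 2 * layers * kv_heads * head_dim * context_len * bytes_per_value / 2**30

# Weights for a hypothetical 70B-parameter model at different quantization levels
for bits in (16, 8, 4):
    print(f"{bits}-bit weights: ~{weights_gib(70, bits):.0f} GiB")

# KV cache grows linearly with context length, on top of the weights
for ctx in (2_048, 8_192, 32_768):
    gib = kv_cache_gib(layers=80, kv_heads=8, head_dim=128, context_len=ctx)
    print(f"context {ctx}: ~{gib:.1f} GiB of KV cache")
```

Even with aggressive 4-bit quantization, a model in that class comes out around 30+ GiB for the weights before you've allocated any context, which is already past a typical 24 GB consumer card; longer context then adds KV cache on top of that.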
