Comment on "Vampire-chan Can't Suck Properly" Anime Adaptation Announced with New Teaser Visual
SatouKazuma@ani.social 20 hours agoName…absolutely does not check out. Saliently enough, have you managed to try DeepSeek, or even get it set up locally?
MrLLM@ani.social 20 hours ago
Uhh, oh, fair enough (゚∀゚)
Yeah, I’ve successfully run the cut-down version of deepseek-r1 through Ollama. The model itself is the 7B variant (I’m VRAM-limited to 8GB). I used an M1 Mac Mini to run it; in terms of performance, it’s fast, and the quality of the generated content is okay.
Depending on your hardware and OS, you may or may not be able to run an LLM locally with reasonable speed. You might want to check the GPU support for Ollama. You don’t strictly need a GPU, since it can run on the CPU, but it’ll certainly be slower.
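For anyone wanting to try this, a minimal sketch of the workflow described above, assuming Ollama is already installed and its daemon is running (the exact model tag may differ depending on what’s available in the Ollama library):

```shell
# Pull the distilled 7B DeepSeek-R1 model (roughly fits in 8GB of VRAM)
ollama pull deepseek-r1:7b

# Start an interactive chat session with the model
ollama run deepseek-r1:7b

# Or send a single prompt non-interactively
ollama run deepseek-r1:7b "Summarize what a vampire anime teaser visual is."
```

On machines without a supported GPU, Ollama falls back to CPU inference automatically; it works, just noticeably slower.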
SatouKazuma@ani.social 20 hours ago
I have a very beefy PC, so I don’t think VRAM or any other hardware will really be the limitation, thankfully. Thanks for the links!