Comment on "Vampire-chan Can't Suck Properly" Anime Adaptation Announced with New Teaser Visual
asudox@lemmy.asudox.dev 1 day ago
She might not be able to suck blood, but definitely something else ( ͡° ͜ʖ ͡°)
SatouKazuma@ani.social 23 hours ago
No. Just…no.
MrLLM@ani.social 23 hours ago
FBI here, as long as OP is a teenager, I see no problem.
Elevator7009sAlt@ani.social 13 hours ago
I do think a lot of people online tend to forget that teenagers use the internet when people express attraction. I distinctly remember reading a comment on Reddit where the commenter had expressed attraction to JoJo Siwa online as a child, got called a pedophile, and it made their same-age self worry that their attraction was pedophilia. (Being a child at the time, they didn’t realize that attraction to a child isn’t bad if you’re also a child around the same age…) So thanks for remembering.
DragonTypeWyvern@midwest.social 10 hours ago
I’m 14 and what is this???
SatouKazuma@ani.social 23 hours ago
Name…absolutely does not check out. Saliently enough, have you managed to try DeepSeek, or even get it set up locally?
MrLLM@ani.social 23 hours ago
Uhh, oh, fair enough (゚∀゚)
Yeah, I’ve successfully run the cut-down version of deepseek-r1 through Ollama. The model itself is the 7b one (I’m VRAM-limited to 8GB). I ran it on an M1 Mac Mini; performance-wise it’s fast, and the quality of the generated content is okay.
Depending on your hardware and OS, you may or may not be able to run an LLM locally at reasonable speed. You might want to check the GPU support for Ollama. You don’t need a GPU, as it can run on the CPU, but it’ll certainly be slower.
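If you want a rough feel for whether a model fits in your VRAM before pulling it (e.g. with `ollama run deepseek-r1:7b`), here’s a back-of-the-envelope sketch. It only counts the quantized weights, not context/KV-cache or runtime overhead, and assumes Ollama’s usual default of ~4-bit quantization:

```python
# Rough VRAM estimate for a quantized LLM's weights.
# This is a sketch: real memory use also includes the KV cache,
# activations, and runtime overhead, so leave headroom.

def model_vram_gib(params_billions: float, bits_per_weight: int) -> float:
    """Approximate weight memory in GiB for a quantized model."""
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 2**30

# A 7B model at 4-bit quantization (e.g. the deepseek-r1 7b distill):
weights = model_vram_gib(7, 4)
print(round(weights, 2))  # ~3.26 GiB of weights -> fits in an 8GB budget
```

So the 7b model’s weights alone are roughly 3.3 GiB, which is why it’s a comfortable fit on an 8GB machine, while the 14b variant starts getting tight.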