Comment on Microsoft Bans Employees From Using DeepSeek App
ilmagico@lemmy.world 1 day ago
I literally run deepseek r1 on my laptop via ollama, and many other models; nothing gets sent to anybody. Granted, it’s the smaller 7B parameter model, but still plenty good.
Microsoft could easily host the full model on their infrastructure if they needed it.
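For anyone wanting to reproduce that setup, the basic ollama invocation is roughly as follows (the `deepseek-r1:7b` model tag is an assumption matching the 7B distil mentioned above; pick a tag that fits your hardware):

```shell
# Pull the distilled 7B DeepSeek-R1 weights once, then chat with it locally.
# Everything runs on your own machine -- no prompts or data leave it.
ollama pull deepseek-r1:7b
ollama run deepseek-r1:7b
```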
brucethemoose@lemmy.world 1 day ago
True, though there’s a big output difference between the 7B distil (or even 32B/70B) and the full model.
And Microsoft does host R1 already, heh. Again, this headline is a big nothingburger.
Also (random aside here), you should consider switching from ollama. They’re making some FOSS unfriendly moves, and depending on your hardware, better backends could host 14B models at longer context, and similar or better speeds.
Sabata11792@ani.social 1 day ago
What other backends are good?
brucethemoose@lemmy.world 1 day ago
Completely depends on your laptop hardware, but generally: