Comment on Uses for local AI?
umami_wasbi@lemmy.ml 3 months ago
IMO LLMs are OK for getting a head start on searching. Like when you’ve got a vague idea of something but don’t know the exact keywords. LLMs can help, and you can use the output in whatever search engine you like. This saves a lot of time tinkering with the right keywords.
dwindling7373@feddit.it 3 months ago
Sure, or you could send an email to the leading international institution on the matter to get a very accurate answer!
Is it the most reasonable course of action? No. Is it more reasonable than wasting a gazillion watts so you can maybe get some better keywords to then paste into a search engine? Yes.
kitnaht@lemmy.world 3 months ago
Once the model is trained, the electricity it uses is trivial. LLMs can run on a local GPU. So you’re completely wrong.
dwindling7373@feddit.it 3 months ago
No I’m not. Other questions?
kitnaht@lemmy.world 3 months ago
Those were statements.