Comment on Uses for local AI?
dwindling7373@feddit.it 3 months ago
There are a huge number of vastly better solutions to get that…
Aquila@sh.itjust.works 3 months ago
Such as…?
dwindling7373@feddit.it 3 months ago
A privacy respecting search engine.
AustralianSimon@lemmy.world 3 months ago
DuckDuckGo or SearX
umami_wasbi@lemmy.ml 3 months ago
IMO LLMs are OK for getting a head start on searching. Like when you’ve got a vague idea of something but don’t know the exact keywords, an LLM can help, and you can use its output in whatever search engine you like. This saves a lot of time tinkering with the right keywords.
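(A minimal sketch of that workflow, assuming a local model served through Ollama’s HTTP API; the model name, prompt, and example idea are illustrative, not anything from the thread.)

```python
import requests

# Ask a locally hosted model (Ollama's /api/generate endpoint) to turn a vague
# idea into concrete keywords you can then paste into DuckDuckGo or SearX.
vague_idea = "that attack where someone reads a CRT monitor's contents from across the room"

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",  # any locally installed model works here
        "prompt": f"Give 5 short search-engine keyword phrases for: {vague_idea}",
        "stream": False,
    },
    timeout=120,
)
print(resp.json()["response"])
```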
dwindling7373@feddit.it 3 months ago
Sure, or you could send an email to the leading international institution on the matter to get a very accurate answer!
Is it the most reasonable course of action? No. Is it more reasonable than wasting a gazillion watts so you can maybe get some better keywords to then paste into a search engine? Yes.
kitnaht@lemmy.world 3 months ago
Once the model is trained, the electricity that it uses is trivial. LLMs can run on a local GPU. So you’re completely wrong.
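(For what it’s worth, a minimal sketch of running a quantized model on a local GPU with llama-cpp-python; the model path and settings are assumptions, not something kitnaht specified.)

```python
from llama_cpp import Llama

# Load a quantized GGUF model and offload every layer to the local GPU
# (n_gpu_layers=-1). The model path is illustrative.
llm = Llama(
    model_path="./models/llama-3-8b-instruct.Q4_K_M.gguf",
    n_gpu_layers=-1,
    verbose=False,
)

out = llm("List 5 search keywords for: privacy-respecting search engines", max_tokens=64)
print(out["choices"][0]["text"])
```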
dwindling7373@feddit.it 3 months ago
No, I’m not. Other questions?