Comment on Uses for local AI?
dwindling7373@feddit.it 1 month ago
Sure, or you could send an email to the leading international institution on the matter to get a very accurate answer!
Is it the most reasonable course of action? No. Is it more reasonable than wasting a gazillion watts so you can maybe get some better keywords to then paste into a search engine? Yes.
kitnaht@lemmy.world 1 month ago
Once the model is trained, the electricity that it uses is trivial. LLMs can run on a local GPU. So you’re completely wrong.
dwindling7373@feddit.it 1 month ago
No I’m not. Other questions?
kitnaht@lemmy.world 1 month ago
Those were statements.
dwindling7373@feddit.it 1 month ago
Notwithstanding that running an LLM is still more expensive than a search engine, any reasoning about running an LLM must include the training and, most of all, the incentive you give as a consumer for further training.
It’s like arguing that cooking a steak has negligible environmental impact. The point is the whole industry that exists to provide you the steak in the first place.