Comment on Mozilla lays off 60 people, wants to build AI into Firefox

DarkThoughts@kbin.social 8 months ago
If they're local, they'd be basically useless due to a lack of computing power, and a search-engine chatbot would also lack indexing, so I doubt it. It would also have to be so polished that it wouldn't require further user knowledge or input, and that's just not a thing with any local LLM I've come across. Mozilla can gladly prove me wrong, though. I certainly wouldn't mind if they can make the whole process of running local LLMs easier and more viable in general.

pennomi@lemmy.world 8 months ago
The requirements to run good local LLMs have really been shrinking this past year… I have a lot of faith that there is a generally useful yet tiny AI tool within the grasp of Mozilla.
KairuByte@lemmy.dbzer0.com 8 months ago
I can understand your thinking, but it could be as simple as giving the user the option to outsource the computation to a secure something or other, if their machine can’t handle it.
And yeah, the requirements are still quite high, but they're coming down fairly steadily, so I wouldn't be surprised if average hardware could manage it in the long term.
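To give the "shrinking requirements" point some numbers: quantization reduces the memory needed just to hold a model's weights, which is the main reason smaller machines can increasingly run local LLMs. Here's a back-of-envelope sketch (the function name and figures are illustrative, and it ignores activation and KV-cache overhead):

```python
def estimate_model_memory_gb(n_params: float, bits_per_weight: int) -> float:
    """Rough RAM/VRAM needed to hold just the model weights,
    ignoring activation and KV-cache overhead."""
    return n_params * bits_per_weight / 8 / 1e9  # bits -> bytes -> GB

# A 7B-parameter model at common precision/quantization levels:
for bits in (16, 8, 4):
    print(f"{bits}-bit: ~{estimate_model_memory_gb(7e9, bits):.1f} GB")
# 16-bit needs ~14 GB, while 4-bit quantization drops that to ~3.5 GB,
# which is within reach of ordinary consumer GPUs and laptops.
```

This is why a 7B model that was out of reach at full precision a year ago can now fit on mid-range consumer hardware once quantized.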