Comment on Mozilla lays off 60 people, wants to build AI into Firefox
kakes@sh.itjust.works 8 months ago
In their defense, Mozilla did create the easiest way to run and integrate an LLM locally, so if anyone could do it, I imagine it would be them.
DarkThoughts@kbin.social 8 months ago
Yes, but what would a local model do for you in this case? Chatbots in browsers are typically used as an alternative, more contextualized search engine. For that, you need proper access to an index of search results. Most people also won't have enough computing power to run a complex chatbot or make use of larger context sizes.
kakes@sh.itjust.works 8 months ago
Pennomi wrote a whole list of potential ideas. And honestly, while I agree that local LLMs on typical hardware are underpowered for most tasks, it's possible they would offer it as an option for those who can run it.
People are getting all upset over this announcement without even knowing what the plan actually is, like the word "AI" is making them foam at the mouth or something. I'm just saying we should reserve judgement until we have an idea of what's happening.
DarkThoughts@kbin.social 8 months ago
And I replied to that comment, without any mouth foaming.
kakes@sh.itjust.works 8 months ago
Yes, and then you asked for ideas, which were in that comment that you replied to.