Comment on Mozilla roll out first AI features in Firefox Nightly
captainjaneway@lemmy.world 4 months ago
I think it makes sense. I like ChatGPT and I appreciate having easy access to it. What I really wish is the option to use local models instead. I realize most people don’t have machines that can run inference quickly enough, but for those that do…
Blisterexe@lemmy.zip 4 months ago
From the post, it seems like they’ll add support for self-hosted models before the feature leaves beta.
maxinstuff@lemmy.world 4 months ago
Seconding this. Why not allow people to run Llama 3 or other open-source models?
ahal@lemmy.ca 4 months ago
From the post: