Google quietly released an app that lets you download and run AI models locally
Submitted 2 months ago by Pro@programming.dev to technology@lemmy.world
https://github.com/google-ai-edge/gallery
Comments
toastmeister@lemmy.ca 2 months ago
Duck.ai doesn’t data mine, and has o3-mini, which I have found to be very good. It’s got some extra functionality, like lines to break up text.
stardust@lemmy.ca 2 months ago
Yeah, Duck is all I’ve bothered with since it came out, since you don’t even need to log in to use it.
Kuma@lemmy.world 2 months ago
Nice! I saw Mozilla also added an AI chat in the browser recently (not in the phone version, as far as I’ve seen, though).
It is too bad duck.ai only runs the small models. GPT-4o-mini is not very good; it can be very inaccurate and very inconsistent :( I would like to see 4.1-mini instead: faster, better, and it has function calling, so it can do web searches, for example. o3 can’t do that, so it only knows what it knew up to 2023.
But thanks for the information; I’ll be looking out for when 4.1 is added!
Libra@lemmy.ml 2 months ago
I’ve been using duck.ai recently myself and quite like it. My only complaint is that chats have a length limit, so if you’re working on complex projects you can run into those limits pretty quickly. I use it for worldbuilding for a novel I’m working on, and I have to use ChatGPT for thematic stuff because it has a better memory, but otherwise it’s great for quick/small things.
rirus@feddit.org 2 months ago
Alibaba also provides an open-source app; it even has support for their multimodal voice-chat model Qwen2.5-Omni: github.com/alibaba/MNN
Wazowski@lemmy.world 2 months ago
Excellent, I will be sure not to use this, like all Google shit.
bizzle@lemmy.world 2 months ago
In a few years you won’t be able to anyway
Kolanaki@pawb.social 2 months ago
I’m just reaching the endgame faster, then.
rickyrigatoni@lemm.ee 2 months ago
All the time I spent trying to get rid of Gemini, just to now download this. Am I stupid?
JustARegularNerd@lemmy.dbzer0.com 2 months ago
I wouldn’t think so - it depends on your priorities.
The open-source and offline nature of this, without the pretense of “Hey, we’re gonna use every query you give us as a data point to shove more products in your face,” seems very appealing compared to Gemini. There’s also the fact that Gemini is constantly shoved in our faces and preinstalled, whereas this is a completely optional download.
NGC2346@sh.itjust.works 2 months ago
Enclave on iOS does the trick for the rare times I need a local LLM.
th3dogcow@lemmy.world 2 months ago
Didn’t know about this. Checking it out now, thanks!
moonlight6205@lemm.ee 2 months ago
Is the chat uncensored?
ofcourse@lemmy.ml 2 months ago
Censoring is model-dependent, so you can select one of the models without guardrails.
RizzoTheSmall@lemm.ee 2 months ago
Have you never heard of Ollama or Docker Model Runner?
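For anyone curious, a minimal sketch of each (assuming Ollama is installed, and Docker Desktop with the Model Runner feature enabled; the model names are just illustrative):

```
# Ollama: pull and chat with a model locally
ollama run llama3.2

# Docker Model Runner: pull and run a model from Docker Hub's ai/ namespace
docker model pull ai/smollm2
docker model run ai/smollm2 "Why is the sky blue?"
```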
fmstrat@lemmy.nowsci.com 2 months ago
Android and iOS.
cupcakezealot@lemmy.blahaj.zone 2 months ago
god i can’t wait for the ai bubble to pop
Allero@lemmy.today 2 months ago
There is already GPT4All.
Convenient graphical interface, any model you like, fully local, easy to opt in or out of data collection, and no fuss to install - it’s just a Linux/Windows/macOS app.
For Linux folks, it is also available as a Flatpak for your convenience.
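A quick sketch of the Flatpak route (assuming the Flathub app ID below is still current):

```
flatpak install flathub io.gpt4all.gpt4all
flatpak run io.gpt4all.gpt4all
```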
Squizzy@lemmy.world 2 months ago
So it doesn’t require internet access at all? I would only use these on a disconnected part of my network.
Allero@lemmy.today 2 months ago
Yes, it works perfectly well without internet. I’ve tried it both on a physically disconnected PC and on a laptop in airplane mode.
KeenFlame@feddit.nu 2 months ago
I wonder what this has over its competitors; I hesitate to think they released it just for fun, though.
Obelix@feddit.org 2 months ago
Google hosting their shit on Microsoft’s servers, telling you to sideload, and not using their own software distribution method for their own OS is kind of crazy if you think about it.
AmbiguousProps@lemmy.today 2 months ago
Why would I use this over Ollama?
Greg@lemmy.ca 2 months ago
Ollama can’t run on Android
AmbiguousProps@lemmy.today 2 months ago
That’s fair, but I think I’d rather self-host an Ollama server and connect to it with an Android client in that case. Much better performance.
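A rough sketch of that setup, assuming Ollama is already installed on the server (the LAN IP and model name are placeholders):

```
# On the server: bind to all interfaces so LAN devices can reach it
# (Ollama listens on port 11434 by default)
OLLAMA_HOST=0.0.0.0 ollama serve

# From the phone (or any client): hit the REST API over the LAN
curl http://192.168.1.50:11434/api/generate \
  -d '{"model": "llama3.2", "prompt": "Hello", "stream": false}'
```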
Euphoma@lemmy.ml 2 months ago
You can use it in Termux.
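Something like this, assuming the ollama package is available in your Termux repo (if not, it can be built from source):

```
pkg update
pkg install ollama
ollama serve &        # start the local server in the background
ollama run llama3.2   # pull a small model and chat with it
```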
Diplomjodler3@lemmy.world 2 months ago
Is there any useful model you can run on a phone?
gens@programming.dev 2 months ago
llama.cpp (which Ollama runs on) can, and many phone chat apps can use it.
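For example, a rough sketch of building llama.cpp in Termux and serving a model that chat apps can talk to (the model file and port are illustrative; llama-server exposes an OpenAI-compatible API):

```
pkg install git cmake clang make
git clone https://github.com/ggml-org/llama.cpp
cmake -S llama.cpp -B build && cmake --build build

# serve any downloaded GGUF model over HTTP
./build/bin/llama-server -m ~/models/qwen2.5-1.5b-instruct-q4_k_m.gguf \
  --host 0.0.0.0 --port 8080
# chat clients can then target http://<device-ip>:8080/v1
```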
pirat@lemmy.world 2 months ago
Try PocketPal instead