God, I wish. I would just love local voice control to turn my lights and such on and off… but noooooooooooo
art@lemmy.world 1 year ago
We need to move AI from the cloud to our own hardware running in our homes. Free, open source, privacy focused hardware. It’ll eventually be very affordable.
pyldriver@lemmy.world 1 year ago
Otkaz@lemmy.world 1 year ago
pyldriver@lemmy.world 1 year ago
I have Home Assistant, but have not heard anything good about Rhasspy. I just want to control lights, play music, and set timers. That being said, I run Home Assistant right now and can control it with Alexa and Siri, but… I would like local only.
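For what it's worth, you can get local-only control going today without voice: Home Assistant exposes a local REST API that needs nothing but a long-lived access token. A minimal sketch in Python (the token and entity ID below are placeholders, and this assumes the default homeassistant.local:8123 address):

```python
# Minimal sketch: toggle a light through Home Assistant's local REST API.
# Assumes HA is reachable at homeassistant.local:8123 and that you've made
# a long-lived access token under Profile -> Security. Token and entity_id
# are placeholders.
import requests

HA_URL = "http://homeassistant.local:8123"
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"  # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def set_light(entity_id: str, on: bool) -> None:
    """Call light.turn_on / light.turn_off locally, no cloud involved."""
    service = "turn_on" if on else "turn_off"
    resp = requests.post(
        f"{HA_URL}/api/services/light/{service}",
        headers=HEADERS,
        json={"entity_id": entity_id},
        timeout=5,
    )
    resp.raise_for_status()

set_light("light.living_room", True)  # hypothetical entity ID
```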
Kolanaki@yiffit.net 1 year ago
I have that using WiZ lights and IFTTT.
AA5B@lemmy.world 1 year ago
But you want a local base station, else there's no local control. Local-only protocols like Z-Wave, Zigbee, Thread, and Bluetooth require a base station, but that base station is exactly what gives you a local-only way of controlling things.
Matter promises that a base station may no longer be necessary for smart devices to control each other, but it is rolling out very slowly.
Kolanaki@yiffit.net 1 year ago
The base stations are what use the cloud/AI shit. The setup I have doesn't even require an Internet connection. Why in the hell would I want a base station that costs money, is controlled by Amazon or Google, and requires an Internet connection for my local shit?
I don’t want a piece of hardware that does nothing but act like a fucking middleman for no good reason.
a1studmuffin@aussie.zone 1 year ago
It's the Year of the Voice for Home Assistant. Given their current trajectory, I'm hopeful they'll have a pretty darn good replacement for the most common use cases of Google Home/Alexa/Siri in another year: setting timers, shopping list management, music streaming, doorbell/intercom management. If you're on the fence about a Nabu Casa subscription, pull the trigger, as it helps them stay independent and not get bought out or destroyed by commercial interests.
AA5B@lemmy.world 1 year ago
Thumbs up for Nabu Casa and Home Assistant!
I haven't yet played with the local voice stuff but have been following it with interest. Actually, now that Raspberry Pis are starting to become available again, I'm on the fence between buying a few more vs. finding something with a little more power, specifically for voice processing.
foggenbooty@lemmy.world 1 year ago
Get something with a little more power. Pis are priced outside the range where they make sense these days. You can get an Intel N100 system on AliExpress/Amazon pretty cheap now, and I've got mine running Proxmox hosting all kinds of stuff.
captain_aggravated@sh.itjust.works 1 year ago
I do wonder how much of those voice assistants could run on-device. Most of what I use Bixby for (I know. I KNOW.) is setting timers. I think simple things like that can run entirely on the phone; it's got a shocking amount of processing power in it.
AA5B@lemmy.world 1 year ago
While you may have points against Apple and how effective Siri may be, with the latest generation of products even the Watch has enough processing power to do voice processing on device. No ads. No cloud services.
whofearsthenight@lemm.ee 1 year ago
Pretty much. If you want a voice assistant right now, Siri is probably the best in terms of privacy. I bought a bunch of Echos early, then they got a little shitty, but I was in, and now I just want them out of my house except for one thing: music. Spotify integration makes for easy multi-room audio in a way that doesn't really work as well on the other platform I'll consider (Apple/Siri), and basically adds Sonos-like functionality for a tiny fraction of the price. The Siri balls and AirPlay are just not as good, and of course, don't work as well with Spotify.
But Alexa is so fucking annoying that at this point I mostly just carry my phone (iPhone) and talk to that, even though it's a little less convenient, because I'm really goddamned tired of hearing "by the way…"
Soundhole@lemm.ee 1 year ago
That's already here. Anyone can run AI chatbots similar to, but not as intelligent as, ChatGPT or Bard.
llama.cpp and koboldcpp allow anyone to run models locally, even with only a CPU if there's no dedicated graphics card available. And there are numerous open source models available.
Hell, you can even run llama.cpp on Android phones.
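For anyone curious what "running a model locally" actually looks like, here's a minimal sketch using the llama-cpp-python bindings (pip install llama-cpp-python). The model path is a placeholder; point it at any quantized GGUF model you've downloaded:

```python
# Minimal sketch: CPU-only inference with llama-cpp-python.
# The model path is a placeholder for a quantized GGUF model
# downloaded separately (e.g. from Hugging Face).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/7b-q4.gguf",  # placeholder path
    n_ctx=2048,    # context window size
    n_threads=4,   # CPU threads; no GPU required
)

out = llm(
    "Q: Why run a language model locally? A:",
    max_tokens=64,
    stop=["Q:"],
)
print(out["choices"][0]["text"].strip())
```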
This has all taken place in just the last year or so. In five to ten years, imo, AI will be everywhere and may even replace the need for mobile Internet connections in terms of looking up information.
Zetta@mander.xyz 1 year ago
Yes, and you can run a language model like Pygmalion AI locally with koboldcpp and have a naughty AI chat as well. Or non-sexual roleplay.
Soundhole@lemm.ee 1 year ago
Absolutely and there are many, many models that have iterated on and surpassed Pygmalion as well as loads of uncensored models specifically tuned for erotic chat. Steamy role play is one of the driving forces behind the rapid development of the technology on lower powered, local machines.
Chreutz@lemmy.world 1 year ago
Never underestimate human ingenuity
When they’re horny
das@lemellem.dasonic.xyz 1 year ago
And where would one look for these sexy sexy AI models, so I can avoid them, of course…
Zetta@mander.xyz 1 year ago
Which models do you think beat Pygmalion for erotic roleplay? Curious for research haha
MaxHardwood@lemmy.ca 1 year ago
GPT4All is a neat way to run an AI chat bot on your local hardware.
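A minimal sketch with its Python bindings (pip install gpt4all); the model name is a placeholder, and GPT4All will download the weights on first use if they aren't already on disk:

```python
# Minimal sketch using the gpt4all Python bindings.
# The model name is a placeholder; weights are fetched on first use.
from gpt4all import GPT4All

model = GPT4All("some-local-model.gguf")  # placeholder model name
with model.chat_session():
    reply = model.generate("Name three uses for a local chatbot.", max_tokens=100)
    print(reply)
```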
Soundhole@lemm.ee 1 year ago
Thanks for this, I haven’t tried GPT4All.
Oobabooga is also very popular and relatively easy to run, but it’s not my first choice, personally.
teuast@lemmy.ca 1 year ago
it does have a very funny name though
teuast@lemmy.ca 1 year ago
You’re probably right, but I kinda hope you’re wrong.
Soundhole@lemm.ee 1 year ago
Why?
teuast@lemmy.ca 1 year ago
Call it paranoia if you want. Mainly I don’t have faith in our economic system to deploy the technology in a way that doesn’t eviscerate the working class.
scarabic@lemmy.world 1 year ago
Don’t these models require rather a lot of storage?
Soundhole@lemm.ee 1 year ago
13B quantized models, generally the most popular for home computers with dedicated GPUs, are between 6 and 10 gigs each. 7B models are between 3 and 6. So, no, not really?
It is relative, so I guess if you're comparing that to an Atari 2600 cartridge then, yeah, it's hella huge. But you can store multiple models for the same space cost as a single modern video game.
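The back-of-the-envelope math checks out, ignoring small overheads like the tokenizer and quantization block scales:

```python
# Rough size of a quantized model in GB: parameters (billions) x bits per weight / 8.
def model_size_gb(params_billions: float, bits_per_weight: float) -> float:
    return params_billions * bits_per_weight / 8

print(model_size_gb(13, 4))  # ~6.5 GB: a 13B model at 4 bits per weight
print(model_size_gb(13, 6))  # ~9.8 GB: same model at a higher-quality 6-bit quant
print(model_size_gb(7, 4))   # ~3.5 GB: a 7B model at 4 bits
```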
scarabic@lemmy.world 1 year ago
Yeah, that's not a lot. I mean… the average consumer probably has 10 GB free on their boot volume.
It is a lot to download if we're talking about ordinary consumers. Not unheard of, though: some games on Steam are 50 GB+.
So okay, storage is not prohibitive.
art@lemmy.world 1 year ago
Storage is getting cheaper every day, and the models are getting smaller while holding the same amount of data.
scarabic@lemmy.world 1 year ago
I’m just curious - do you know what kind of storage is required?