Avoiding AI is hard – but our freedom to opt out must be protected
Submitted 23 hours ago by Pro@programming.dev to technology@lemmy.world
https://theconversation.com/avoiding-ai-is-hard-but-our-freedom-to-opt-out-must-be-protected-255873
Comments
KeenFlame@feddit.nu 5 hours ago
Ah yes, the “freedom” the USA has spread all over its own country and other nations… Of course we must protect that freedom, which is of course a freedom for people to avoid getting owned by giant corporations. We must also protect the freedom of giant corporations to not give us AI if they want to. I don’t disagree, I just think people are more important.
underline960@sh.itjust.works 13 hours ago
I doubt we’ll ever be offered a real opt-out option.
Instead I’m encouraged by the development of poison pills for the AI that are non-consensually harvesting human art (Glaze and Nightshade) and music (HarmonyCloak).
Zenith@lemm.ee 9 hours ago
I’ve deleted pretty much all social media; I’m down to only Lemmy. I only use my home PC for gaming, like Civ or Cities: Skylines, or for search engines for things like travel plans. I’m trying to be as offline as possible because I don’t believe there’s any other way to opt out, and I don’t believe there ever will be. Opting out of the internet is practically impossible, and AI will get to that point as well.
T156@lemmy.world 10 hours ago
Remind me in 3 days.
Although poison pills are only so effective, since it’s a cat-and-mouse game: they only really work against a specific version of a model, and other models work around them.
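To give a rough sense of why that is: these tools generally optimize a tiny perturbation against one particular model’s gradients, so a different or retrained model may barely notice it. A toy sketch follows (this is not how Glaze or Nightshade actually work; the model choice, step size, and loss are assumptions made purely for illustration):

```python
# Toy FGSM-style perturbation against ONE fixed feature extractor.
# Not Glaze/Nightshade's actual method; the model choice, epsilon,
# and loss here are illustrative assumptions.
import torch
import torchvision.models as models

extractor = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

def poison(image: torch.Tensor, target_feat: torch.Tensor, eps: float = 0.03) -> torch.Tensor:
    """Nudge `image` so this extractor's output drifts toward `target_feat`.

    The gradient is taken against THIS model's weights, which is exactly why
    a different or retrained model may be largely unaffected (the
    cat-and-mouse problem described above).
    """
    image = image.detach().clone().requires_grad_(True)
    loss = torch.nn.functional.mse_loss(extractor(image), target_feat)
    loss.backward()
    # One signed-gradient step, clamped so the change stays barely visible.
    return (image - eps * image.grad.sign()).clamp(0, 1).detach()
```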
Loduz_247@lemmy.world 11 hours ago
But do Glaze, Nightshade, and HarmonyCloak really work to prevent that information from being used? They may be effective at first, but then people will find ways around those barriers, the software will have to be updated, and in the end the side with the most money will win.
underline960@sh.itjust.works 10 hours ago
It’s not guaranteed.
AI is a venture capital money pit, and they are struggling to monetize before the hype dies out.
If the poison pills work as intended, investors will stop investing in “creative” AI when the new models stop getting better (and sometimes get worse) because they’re running out of clean content to steal.
DarthObi@feddit.org 16 hours ago
You don’t need AI. There are enough porn sites with real humans.
ICastFist@programming.dev 13 hours ago
And lots of hentai for stuff that is humanly impossible
muntedcrocodile@lemm.ee 23 hours ago
I think it may be more productive to get people to use alternative AI products that are FOSS and/or respect privacy.
But_my_mom_says_im_cool@lemmy.world 21 hours ago
You got downvoted because Lemmy users like knee-jerk reactions and think you can unmake a technology or idea. You can’t; AI is here, and it’s here forever now. The best we can do is find ways to live with it and, like you said, reward those who use it ethically. The Lemmy idea that AI should be banned and never used is just unrealistic.
atomicbocks@sh.itjust.works 17 hours ago
You seem to misunderstand the ire;
AI in its current state has existed for over a decade. Watson used ML algorithms to beat Jeopardy! champions by answering natural-language questions back in 2011. But techbros have gotten ahold of it, decided that copyright rules don’t apply to them, and now the cat is out of the bag?!? From the outside it looks like bootlicking for the same bullshit that told us we would be using blockchain to process mortgages in 10 years… 10 years ago. AI isn’t just here to stay; it’s been here for 70 years.
RvTV95XBeo@sh.itjust.works 7 hours ago
If AI is going to be crammed down our throats, can we at least be able to hold it (aka the companies pushing it) liable for providing blatantly false information? At least then they’d have an incentive to provide accurate information instead of merely authoritative-sounding information.
oxysis@lemmy.blahaj.zone 22 hours ago
Is it really though? I haven’t touched it since the very early days of AI slop. That was before I learned how awful it is for real people.
But_my_mom_says_im_cool@lemmy.world 21 hours ago
They don’t mean directly. I guarantee that companies, service providers, etc. that you deal with do indeed use AI. That’s what I took the headline to mean: some facet of everyone’s life uses AI now.
turtlesareneat@discuss.online 19 hours ago
Hell, AI has been making fully automated kill-chain decisions for 5 years now. Yes, it’s in everything.
fxdave@lemmy.ml 16 hours ago
The problem is not the tool. It’s the inability to use the tool without a third party provider.
blinx615@lemmy.ml 13 hours ago
Local is a thing. And models are getting smaller with every iteration.
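For anyone who hasn’t tried it, “local” can be as little as a few lines once the weights are downloaded; here is a minimal sketch using the Hugging Face transformers pipeline (the model name is only an example of a small open model, swap in whatever fits your hardware):

```python
# Minimal local text generation: after the one-time weight download,
# nothing leaves your machine. "distilgpt2" is only an example of a
# small open model; pick whatever fits your hardware.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")
out = generator("Opting out of AI means", max_new_tokens=40, num_return_sequences=1)
print(out[0]["generated_text"])
```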
smarttech@lemmy.world 8 hours ago
AI is everywhere now, but having the choice to opt out matters. Sometimes using tools like Instant Ink isn’t about AI; it’s just about saving time and making printing easier.
WaitThisIsntReddit@lemmy.world 14 hours ago
If there were an AI to detect AI, would you use it?
NotASharkInAManSuit@lemmy.world 9 hours ago
Yes. That is actually an ideal function of ethical AI. I’m not against AI for the things it is actually beneficial for, where it can be used as a tool for understanding; I just don’t like it being used as a thief’s tool pretending to be a paintbrush or a typewriter. There are good and ethical uses for AI; art is not one of them.
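As a sketch of what such a detector could look like as a tool (the model id below is a made-up placeholder, and in practice these detectors are known to be unreliable, so a score is a hint, not proof):

```python
# Sketch of an "AI that detects AI" as a plain text classifier.
# "some-org/ai-text-detector" is a hypothetical placeholder model id;
# real detectors exist but misfire often, so treat scores as hints only.
from transformers import pipeline

detector = pipeline("text-classification", model="some-org/ai-text-detector")

def looks_generated(text: str, threshold: float = 0.8) -> bool:
    result = detector(text)[0]  # e.g. {"label": "AI", "score": 0.93}
    return result["label"] == "AI" and result["score"] >= threshold
```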
spankmonkey@lemmy.world 22 hours ago
I disagree with the base premise that being able to opt out needs to be a right. That implies that having our data harvested for companies to make profits should be the default.
We should have the right to not have our data harvested by default. Requiring companies to run an opt-in process, with no coercion or other methods of making people feel obligated to opt in, is our right.
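The difference between the two defaults is easy to state concretely: under opt-in, missing or ambiguous consent means the data is simply never used. A toy sketch (the field names are made up for illustration):

```python
# Toy illustration of "opt-in by default": a record is only usable for
# training if the person explicitly said yes. Field names are made up.
from dataclasses import dataclass

@dataclass
class Record:
    user_id: str
    text: str
    consented_to_training: bool = False  # the default answer is no

def usable_for_training(records: list[Record]) -> list[Record]:
    # Anything without an explicit, affirmative opt-in is excluded.
    return [r for r in records if r.consented_to_training]
```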
ItsComplicated@sh.itjust.works 22 hours ago
As the years have passed, it has become the accepted consensus that all of your personal information, thoughts, and opinions should be freely available to anyone, at any time, for any reason, so that companies can profit from it.
People keep believing this is normal, and companies keep taking more. Unless everyone is willing to stand firm and say enough, I only see it declining further, unfortunately.
Zenith@lemm.ee 9 hours ago
The death of the private life
sugar_in_your_tea@sh.itjust.works 13 hours ago
I’m there with you, and I’d join in a protest to get it.
taladar@sh.itjust.works 19 hours ago
I would maybe not go quite that far, but at the very least this should apply to commercial interests and living people.
I think there are some cases where it should be acceptable to have your data usable by default, e.g. statistical analysis of health threats (think of those studies about the dangers of living near a coal power plant or similar things).
sugar_in_your_tea@sh.itjust.works 13 hours ago
I disagree. Yes, there are benefits to a lot of invasions of privacy, but that doesn’t make them okay. If an entity wants my information, it can ask me for it.
One potential exception is dead people: I think it makes sense for a lot of information to be released on death, and preventing that should be opt-in by the estate/survivors, depending on the will.
spankmonkey@lemmy.world 19 hours ago
I sure hope those studies are not being done by for profit companies!
General_Effort@lemmy.world 21 hours ago
How would that benefit the average person?
spankmonkey@lemmy.world 20 hours ago
Send me your name, birthdate, web browsing history, online spending history, real time location, and a list of people you know and I will explain it to you.