Can you point to something specific in the law that you have a contention with?
Comment on Teen deepfake victim pushes for federal law targeting AI-generated explicit content
Meowoem@sh.itjust.works 9 months ago
“Think of the children” being used to push an agenda that helps the very wealthy? Well I’ll be, what a totally new and not at all predictable move.
Ban all AI that isn’t owned by rich people, make open source impossible, restrict everything that might allow regular people to compete with the corporations - only then will your children be safe!
joyjoy@lemm.ee 9 months ago
You kind of have to be rich in order to run these image generation AIs. The RTX 4090 Ti isn’t cheap.
TheRealKuni@lemmy.world 9 months ago
You kind of have to be rich in order to run these image generation AIs. The RTX 4090 Ti isn’t cheap.
Any iPhone or iPad on the current version of iOS can run Stable Diffusion locally with the (free) Draw Things app.
Hell, if you’re willing to run on the CPU instead of the graphics card (which takes much longer), you can get Stable Diffusion working on pretty much any PC. And honestly, any semi-recent NVIDIA card will have drivers to run it.
What’s more, there are free sites for SD image generation.
Image generation isn’t expensive, and it gets cheaper and cheaper every year.
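For what it’s worth, the comment above is easy to verify: a minimal sketch of CPU-only Stable Diffusion with Hugging Face’s diffusers library might look like the following (the model ID, prompt, and step count here are just illustrative assumptions, not anything the commenter specified):

```python
# Hypothetical sketch: Stable Diffusion inference on CPU only.
# Assumes `pip install diffusers transformers torch`; the first run
# downloads several GB of model weights.
import torch
from diffusers import StableDiffusionPipeline

# float32 is the safe dtype on CPU (float16 is poorly supported there).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float32
)
pipe = pipe.to("cpu")

# Fewer inference steps trades quality for speed; CPU generation can
# still take minutes per image rather than seconds.
image = pipe("a watercolor lighthouse at dusk", num_inference_steps=20).images[0]
image.save("lighthouse.png")
```

No GPU is required at any point, which is the commenter’s point: the hardware bar is “any semi-recent PC,” not an RTX 4090.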
JackGreenEarth@lemm.ee 9 months ago
The NVIDIA GTX 1660 Ti is perfectly adequate, and it’s not only the property of rich people.
TwilightVulpine@lemmy.world 9 months ago
I’m as suspicious of “think of the children” stuff as anyone here but I don’t see how we are fighting for the rights of the people by defending non-consensual deepfake porn impersonation, of children or anyone.
If someone makes deepfake porn of my little cousin or Emma Watson, there’s no scenario where this isn’t a shitty thing to do to a person, and I don’t see how the masses are being oppressed by this being banned. What, do we need to deepfake Joe Biden getting it on to protest against the government?
Not only does the harassment of being subjected to something like this seem horrible, it’s also reasonable to say that people ought to have rights over their own likeness, no? It’s not even a matter of journalistic interest, because it’s something completely made up.
General_Effort@lemmy.world 9 months ago
We’re not talking about whether people should make fakes. We’re talking about whether people who do should be prosecuted - i.e. physically overpowered by police officers, restrained with handcuffs, and locked up in a prison cell. Some empathy?
If some classmate of your little cousin makes a fake, should the police come and drag them out of school and throw them in prison? You think that would help?
Realistically, it’s about as likely to happen as kids who “get into fights” being prosecuted for assault. Kids tell mean lies about each other, but that isn’t resolved in civil suits over defamation. Even between adults, that’s not the usual outcome.
Civil suits under this bill would mainly be targeted at internet services, because they have the money. And it would largely be used over celebrity fakes; those make up the overwhelming share of fakes out there, and celebrities have the money to splurge on suing people who can’t pay. It would be wealthy, powerful people using it against horny teens.
Also, this bill is so ripe for industrial abuse. Insert a risqué scene in a movie, and suddenly “pirates” can be prosecuted under this.
TwilightVulpine@lemmy.world 9 months ago
You do have a point about the excesses of police work, but if you want to talk about empathy you should also consider the position of the kid who is harassed and traumatized over something they didn’t even have any say over. There is some discussion to be had over what degree of punishment ought to be appropriate, and the need to limit police brutality, well beyond this particular matter.
But demanding that every such work be taken down, and giving vulnerable people the means to demand it without exposing themselves further, is perfectly reasonable.
Except that in the case of deepfake porn, it’s not a matter of fuzzy two-sided conflicts. One side is creating the whole problem, and the other side is just the victim of it, despite not being involved in any way. That’s the whole point of a deepfake. The most that lies might play into it is if it turns out the porn is real, in which case there is even more reason to take it down.
Gotta say I have a hard time feeling sorry for the people who can’t be satisfied by the frankly immense amount of porn we already have and decided that they absolutely must have porn of that one specific person who never consented to it. Maybe those people are wealthy and powerful, sure. Does that mean it’s a free pass to fabricate deepfake porn with their likenesses? I don’t think so. Nobody is owed that. As much as you insist that it will be used by the powerful against the poor masses, it still seems to me that any regular dude who decides to do it is crossing serious boundaries. This is not a brave freedom fighter; it’s just an asshole.
I think most likely what will happen is that these internet services will just take those down. As they should.
wildginger@lemmy.myserv.one 9 months ago
If my little cousin makes AI child porn of anyone at all, let alone a classmate he knows physically in real life, I don’t think he should be allowed to kick his feet and go about his day.
Like… making kiddie porn of your classmates is not excusable because you’re a horny teen. Sorry, bud, it’s fucking not.
General_Effort@lemmy.world 9 months ago
If two 14-year-olds get it on, they should both be prosecuted for child abuse? That is what you are actually saying?
curiousaur@reddthat.com 9 months ago
The issue is there’s really no way to stop it unless you make AI illegal. The cat is already out of the bag. The models and hardware are getting better, faster, and cheaper.
How do you suppose you enforce a law like this when people stop even sharing the photos they create, and maybe don’t even save them, because it’s so easy and instant to create more whenever they want to see them? “Put her face on her body in this position”: bam, an instant album of photos to jerk off to, then delete them. That’s how good, and how available, these models are getting.
How do you think restrictions on this should, or could, be enforced?
TwilightVulpine@lemmy.world 9 months ago
Nah, making deepfake porn illegal doesn’t require making all of AI illegal. As proposed, this law would apply neither to ordinary, non-explicit image generation nor to porn of entirely imaginary people. It targets those generating and distributing such images rather than the technology itself, and gives victims the means to defend themselves against being publicly humiliated.
It could be handled much like any copyright matter is: anyone hosting and sharing it must take it down or face the punishment.
Technology allows many things to be done quickly and easily, but whether they are legal and protected is a whole different matter. The models can be as good as they want and as quick as copying a file; that doesn’t mean people won’t be sued over it.
It seems a bit questionable to assume that everything that is technologically possible ought to be permitted, no matter who is harmed. And frankly this is much more harmful than any piracy or infringement.
curiousaur@reddthat.com 9 months ago
When it’s widely available, you could share a perfectly legal photo, along with the prompt. Then everyone who runs it would see similar generated images on their own devices, without distributing anything illegal.
I’m trying to point out how futile it is to fight this, and that any attempt to actually stop it will eventually lead to limits on the AI models themselves.
Gigasser@lemmy.world 9 months ago
Tbh, I’ve always thought about it like this: making deepfake tech illegal would be like making photoshopping faces onto porn images illegal. At the end of the day, the technology itself shouldn’t be regulated; the end products should be, though. If you Photoshop some kid’s face onto a nude body, you should be arrested for possession regardless of whether it was “real” or not. The same should go for deepfake porn exploiting children.
However, I see very little wrong with some guy photoshopping adult celeb or “friend” faces onto nude model bodies, and the same goes for those who do it with deepfake tech. Just don’t distribute it.
wildginger@lemmy.myserv.one 9 months ago
Can’t stop people from killing others with hammers unless we make hammers illegal, guys.
Darkncoldbard@lemmy.world 9 months ago
Mr. Vulpine, what you’ve just said is one of the most insanely idiotic things I have ever heard. At no point in your rambling, incoherent response were you even close to anything that could be considered a rational thought. Everyone in this chat room is now dumber for having listened to it. I award you no points, and may God have mercy on your soul.
TwilightVulpine@lemmy.world 9 months ago
Any reason why you are quoting Adam Sandler movies at me?
Because if you have any criticism you could at least be specific and original.
Darkncoldbard@lemmy.world 9 months ago
Ahhhh, fine. It’s reasonable to say that people ought to have rights over their own likeness? So if you’re walking down the street and someone’s recording you, what? You melt down over your likeness? Hide in your house for fear that someone will take a picture of you?