Comment on Court Bans Use of 'AI-Enhanced' Video Evidence Because That's Not How AI Works
Downcount@lemmy.world 7 months ago
If you’ve ever encountered an AI hallucinating stuff that simply doesn’t exist, you know how bad the idea of AI-enhanced evidence actually is.
turkalino@lemmy.yachts 7 months ago
Everyone uses the word “hallucinate” when describing visual AI because it’s normie-friendly and cool-sounding, but the results are a product of math. Very complex math, yes, but computers aren’t taking drugs and randomly pooping out images, because computers can’t do anything truly random.
You know what else uses math? Basically every image modification algorithm, including resizing. I wonder how this judge would feel about viewing a 720p video on a 4K courtroom TV, because “hallucination” takes place in that case too.
Downcount@lemmy.world 7 months ago
There is a huge difference between interpolating pixels and inserting whole objects into pictures.
turkalino@lemmy.yachts 7 months ago
Both insert pixels that didn’t exist before, so where do we draw the line on how much of that is acceptable?
Downcount@lemmy.world 7 months ago
Look at it this way: if you have a licence plate that’s unreadable because of low resolution, interpolating won’t make it readable. An AI, on the other hand, could just “invent” (I know, I know, normie speak in your eyes) a readable one.
You’ll draw the line yourself when you get your first speeding ticket for a car that wasn’t yours.
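A minimal sketch of what I mean (Python with NumPy, made-up pixel values): linear interpolation can only average samples it already has, so a blur stays a blur.

```python
import numpy as np

# Upscale a blurry 1-D "licence plate" row 2x by linear interpolation.
# Every new pixel is the midpoint of two real neighbours -- no new detail appears.
def upscale_2x_linear(row):
    out = np.empty(row.size * 2 - 1)
    out[0::2] = row                       # keep the original samples
    out[1::2] = (row[:-1] + row[1:]) / 2  # new samples = average of neighbours
    return out

plate = np.array([10.0, 200.0, 40.0])     # unreadable low-res row
print(upscale_2x_linear(plate))           # [ 10. 105. 200. 120.  40.] -- still unreadable
```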
Blackmist@feddit.uk 7 months ago
I mean we “invent” pixels anyway for pretty much all digital photography based on Bayer filters.
But the answer is linear interpolation. That’s where we draw the line.
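For illustration, a minimal sketch of bilinear demosaicing (Python with NumPy, hypothetical sensor values): each photosite records one colour, and the missing channels are averaged from real neighbours, which is the deterministic kind of “inventing” pixels.

```python
import numpy as np

# A 2x2 Bayer tile records one colour per photosite:
#   R G
#   G B
raw = np.array([[120.0,  60.0],
                [ 55.0,  30.0]])

red   = raw[0, 0]
blue  = raw[1, 1]
green = (raw[0, 1] + raw[1, 0]) / 2   # missing green = mean of the two recorded greens

# Every reconstructed channel is a weighted average of values the sensor actually saw.
print(f"reconstructed pixel: R={red}, G={green}, B={blue}")
```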
Catoblepas@lemmy.blahaj.zone 7 months ago
What’s your bank account information? I’m either going to add or subtract a lot of money from it. Both alter your account balance, so you should be fine with either, right?
FlyingSquid@lemmy.world 7 months ago
Whenever people say things like this, I wonder why that person thinks they’re so much better than everyone else.
Hackerman_uwu@lemmy.world 7 months ago
Tangentially related: the more people seem to support “AI all the things”, the less they turn out to understand it.
I work in the field. I had to explain to a CIO that his beloved “ChatPPT” was just autocomplete. He became enraged. We implemented a 2015 chatbot instead; he got his bonus.
We have reached the winter of my discontent. Modern life is rubbish.
turkalino@lemmy.yachts 7 months ago
Normie, layman… as you’ve pointed out, it’s difficult to use these words without sounding condescending (which I didn’t mean to be). The media using words like “hallucinate” to describe linear algebra is necessary because most people just don’t know enough math to understand the fundamentals of deep learning - which is completely fine, people can’t know everything and everyone has their own specialties. But any time you simplify science so that it’s digestible by the masses, you lose critical information in the process, which can sometimes be harmfully misleading.
Krauerking@lemy.lol 7 months ago
Or sometimes the colloquial term people have picked up is a simplified tool for getting the right point across.
Just because it’s guessing using math doesn’t mean it isn’t, in a sense, hallucinating the additional data. That data did not exist before, and the model willed it into existence, much like a hallucination, and the word makes it easy for people to quickly grasp that the output isn’t trustworthy, thanks to what they already understand the word to mean.
Part of language is finding words that let people quickly understand a topic, even if it means giving up nuance. What matters is that the simplified form still leads them to the right conclusion, which doesn’t always happen when there is bias. I think this one works just fine.
cucumberbob@programming.dev 7 months ago
It’s not just the media who use this term. According to this study, which I’ve had a very brief skim of, the term “hallucination” was used in the literature as early as 2000, and Table 1 lists hundreds of studies from various databases whose use of “hallucination” the authors then go on to analyse.
It’s worth saying that this study is focused on showing how vague the term is, and how many different and conflicting definitions of “hallucination” there are in the literature, so I certainly agree it’s a confusing term. It’s just that researchers use it as well as laypeople.
Hackerman_uwu@lemmy.world 7 months ago
LLMs (the models that “hallucinate” is most often used in conjunction with) are not Deep Learning, normie.
Catoblepas@lemmy.blahaj.zone 7 months ago
Has this argument ever worked on anyone who has ever touched a digital camera? “Resizing video is just like running it through AI to invent details that didn’t exist in the original image”?
“It uses math” isn’t the complaint and I’m pretty sure you know that.
becausechemistry@lemm.ee 7 months ago
It’s not AI, it’s PISS: Plagiarized Information Synthesis Software.
recapitated@lemmy.world 7 months ago
Just like us!
abhibeckert@lemmy.world 7 months ago
Sure, no drugs involved, but they are running a pseudo-random number generator and using its output (along with non-random data) to generate the image. The result: ask for the same image twice, get two different images:
[image]
Gabu@lemmy.world 7 months ago
Tell me you don’t know shit about AI without telling me you don’t know shit. You can easily reproduce the exact same image by defining the starting seed and constraining the network to a specific sequence of operations.
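A minimal sketch of that point (Python, with NumPy standing in for the generator’s noise source): fix the seed and the “random” starting noise is identical every run, so the rest of the deterministic pipeline produces the same image.

```python
import numpy as np

def starting_noise(seed=None, shape=(4, 4)):
    """The noise an image generator starts from; the seed pins it down exactly."""
    rng = np.random.default_rng(seed)
    return rng.standard_normal(shape)

a = starting_noise(seed=42)
b = starting_noise(seed=42)
c = starting_noise()                 # no seed: the OS supplies fresh entropy

print(np.array_equal(a, b))          # True  -- same seed, same noise, same image downstream
print(np.array_equal(a, c))          # False (almost surely) -- different noise, different image
```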
Natanael@slrpnk.net 7 months ago
But if you don’t do that, then the ML engine doesn’t have the introspective capability to realize it failed to recreate an image.
Malfeasant@lemmy.world 7 months ago
Technically incorrect - computers can be supplied with sources of entropy, so while it’s true that they will produce the same output given identical inputs, it is in practice quite possible to ensure that they do not receive identical inputs if you don’t want them to.
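A minimal sketch of both halves of that (Python, standard library only): a seeded PRNG is perfectly repeatable, while the OS entropy pool hands the program different input whenever you want it.

```python
import os
import random

# Same seed in, same numbers out: a PRNG is fully deterministic.
rng1 = random.Random(1234)
rng2 = random.Random(1234)
print(rng1.random() == rng2.random())   # True

# os.urandom() reads the OS entropy pool (interrupt timings, hardware noise, etc.),
# so each call feeds the program different input.
print(os.urandom(8).hex())              # e.g. '9f2c51e0b7a4d6c3' -- varies every run
print(os.urandom(8).hex())              # fresh bytes again
```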
Hackerman_uwu@lemmy.world 7 months ago
IIRC there was a random number generator website where the machine was hooked up to a potato or some shit.