Cloudflare announces AI Labyrinth, which uses AI-generated content to confuse and waste the resources of AI Crawlers and bots that ignore “no crawl” directives.
Submitted 1 week ago by Tea@programming.dev to technology@lemmy.world
https://blog.cloudflare.com/ai-labyrinth/
Comments
Greyfoxsolid@lemmy.world 1 week ago
People complain about AI possibly being unreliable, then actively root for things that are designed to make them unreliable.
shads@lemy.lol 1 week ago
I find this amusing. I had a conversation with an older relative who asked about AI because I am “the computer guy” he knows. I explained, as best I understand it, how LLMs operate: they pattern-match to guess what the next token should be based on statistical probability. I explained that they sometimes hallucinate, or go off on wild tangents, because of this, and that they can be really good at aping and regurgitating things, but there is no understanding, just respinning fragments to try to generate a response that pleases the asker.
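(Editor's aside: the "guess the next token" idea above can be sketched with a toy model. This is a minimal illustration with a hand-written bigram table, not how any real LLM is built; real models learn billions of statistics over sub-word tokens.)

```python
import random

# Hypothetical "bigram" counts: how often each word followed another
# in some imagined training text. A real LLM learns these statistics
# automatically rather than using a hand-written table.
bigram_counts = {
    "the": {"cat": 3, "dog": 1},
    "cat": {"sat": 2, "ran": 1},
    "sat": {"on": 3},
    "on": {"the": 2},
}

rng = random.Random(0)  # seeded for repeatability

def next_token(word):
    """Pick the next word, weighted by how often it followed `word`."""
    candidates = bigram_counts.get(word)
    if candidates is None:
        return None  # no statistics for this word: the model just stops
    words = list(candidates)
    weights = [candidates[w] for w in words]
    return rng.choices(words, weights=weights, k=1)[0]

# Generate a short continuation, one token at a time.
text = ["the"]
while len(text) < 8:
    nxt = next_token(text[-1])
    if nxt is None:
        break
    text.append(nxt)
print(" ".join(text))
```

There is no comprehension anywhere in that loop, only weighted sampling — which is the commenter's point, scaled down.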
He observed, “oh we are creating computer religions, just without the practical aspects of having to operate in the mundane world that have to exist before a real religion can get started. That’s good, religions that have become untethered from day to day practical life have never caused problems for anyone.”
Which I found scarily insightful.
A_Random_Idiot@lemmy.world 1 week ago
Oh good.
Now I can add digital jihad by hallucinating AI to the list of my existential terrors.
Thank your relative for me.
ArchRecord@lemm.ee 1 week ago
Here’s the key distinction:
This only makes AI models unreliable if they ignore “don’t scrape my site” requests. If they respect the wishes of the sites whose data they’re profiting from, then there’s no issue.
People want AI models to be reliable, but they also want them to operate with integrity in the first place, and not profit from the work of people who have explicitly opted their work out of training.
A_Random_Idiot@lemmy.world 1 week ago
I’m a person.
I don't want AI, period.
We can't even handle humans going psycho. The last thing I want is an AI losing its shit from being overworked producing goblin tentacle porn and going full Skynet judgement day.
DasSkelett@discuss.tchncs.de 1 week ago
This will only worsen the quality of models from bad actors who don't follow the rules. Want to sell a good-quality AI model trained on real content instead of misleading AI output? Just follow the rules ;)
Doesn’t sound too bad to me.
cupcakezealot@lemmy.blahaj.zone 1 week ago
i mean this is just designed to thwart ai bots that refuse to follow robots.txt rules of people who specifically blocked them.
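(Editor's aside: for readers unfamiliar with robots.txt, an opt-out of this kind looks like the fragment below. GPTBot and CCBot are real AI-crawler user-agents used here as examples; compliance is voluntary, which is exactly the loophole AI Labyrinth targets.)

```
# robots.txt — served at the site root, e.g. https://example.com/robots.txt
# Well-behaved crawlers honor these directives; AI Labyrinth is aimed
# at the bots that ignore them.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```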
cultsuperstar@lemmy.world 1 week ago
I introduce to you, the Trace Buster Buster!
If you’ve never seen the movie The Big Hit, it’s great.
_cnt0@sh.itjust.works 1 week ago
This is so fucking retarded on so many levels. It’s time to regulate the shit out of “AI”.
nyan@lemmy.cafe 1 week ago
Will it actually allow ordinary users to browse normally, though? Their other stuff breaks in minority browsers. Have they tested this well enough so that it won’t? (I’d bet not.)
fubarx@lemmy.world 1 week ago
So this showed up last week: github.com/raminf/RoboNope-nginx
Similar vibe, minus the AI.
_cryptagion@lemmy.dbzer0.com 1 week ago
Now this is an AI trap worth using. Don't waste your money and resources hosting something yourself; let Cloudflare do it for you if you don't want AI scraping your shit.
NotProLemmy@lemmy.ml 1 week ago
They used AI to destroy AI
DomesticForeigner@lemm.ee 1 week ago
so sick and tired of ai this, ai that.
oppy1984@lemm.ee 1 week ago
Spiderman pointing at Spiderman meme.
fmstrat@lemmy.nowsci.com 1 week ago
And this, ladies and gentleman, is how you actually make profits on AI.
Deebster@infosec.pub 1 week ago
So they rewrote Nepenthes (or Iocaine, Spigot, Django-llm-poison, Quixotic, Konterfai, Caddy-defender, plus inevitably some Rust versions).