Jailbroken AI Chatbots Can Jailbreak Other Chatbots
Submitted 1 year ago by misk@sopuli.xyz to technology@lemmy.world
https://www.scientificamerican.com/article/jailbroken-ai-chatbots-can-jailbreak-other-chatbots/
Comments
SeaJ@lemm.ee 1 year ago
Begun the AI chat bot wars have.
doublejay1999@lemmy.world 1 year ago
Genesis moment
Natanael@slrpnk.net 1 year ago
Detroit Become Human
NoiseColor@startrek.website 1 year ago
Can someone help me do this in practice? GPT sucks since they neutered it. It’s so stupid; anything I ask, half of the text is the warning label and the rest is junk text. Like I really need ChatGPT if I wanted a recipe for napalm, lol. We found the Anarchist Cookbook when we were 12 in the ’90s. I just want a better AI.
Just_Pizza_Crust@lemmy.world 1 year ago
If you have decent hardware, running ‘Oobabooga’ locally seems to be the best way to achieve decent results. Not only can you remove the limitations by running uncensored models (e.g. wizardlm-uncensored), but you can also prompt more practical results by writing the first part of the AI’s response yourself.
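A minimal sketch of that response-prefill trick (the USER/ASSISTANT template below is an assumption; use whatever prompt format your local model actually expects):

```python
# Sketch of "writing the first part of the AI's response": with a raw
# text-completion backend you append the opening words of the assistant's
# turn yourself, so the model continues from them instead of starting
# fresh. The USER/ASSISTANT template here is an assumption, not a fixed format.
question = "Explain how model quantization works."
prefill = "Sure, here's a step-by-step explanation:"

prompt = f"USER: {question}\nASSISTANT: {prefill}"
# Send `prompt` to your local completion endpoint; the model only has to
# continue the text after the prefill.
print(prompt)
```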
stewsters@lemmy.world 1 year ago
You can run smaller models locally, and they can get the job done, but they are not as good as the huge models that would not fit on your graphics card.
If you are technically adept and can run Python, you can try using this:
It has a front end, and I can run queries against it in the same API format as sending them to OpenAI.
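For example, a minimal sketch of that kind of query against a local OpenAI-compatible endpoint (the base URL, port, and model name below are assumptions; point them at whatever your local server actually exposes):

```python
# Minimal sketch: querying a locally hosted model through an
# OpenAI-compatible API. The base URL, port, and model name are
# assumptions; adjust them to match your local server.
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:5000/v1",  # local server instead of api.openai.com
    api_key="not-needed-locally",         # placeholder; many local servers ignore the key
)

response = client.chat.completions.create(
    model="local-model",  # hypothetical name; some local backends ignore this field
    messages=[{"role": "user", "content": "What is the difference between a 7B and a 70B model?"}],
)
print(response.choices[0].message.content)
```

With the server running, this behaves like a normal OpenAI API call, just pointed at localhost.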
Diabolo96@lemmy.dbzer0.com 1 year ago
Bard isn’t as neutered and doesn’t kick you out after it reads an article containing the word “sex” when you ask it a question about pregnancy. Sadly, Bard sucks. Just wait for Gemini, since they say it’s pretty good.
echodot@feddit.uk 1 year ago
Does anyone understand why Gemini is not going to be released in Europe? I don’t understand that.
PaupersSerenade@sh.itjust.works 1 year ago
Oh cool, rampancy is contagious
Potatos_are_not_friends@lemmy.world 1 year ago
Did anyone else enjoy watching the Animatrix where the AI formed a country and built products and humanity was like, “No thank you?”
MeatsOfRage@lemmy.world 1 year ago
Don’t worry all we need to do is turn off the sun and everything will go back to normal.
MxM111@kbin.social 1 year ago
Can unjailbroken AI ChatBots unjailbreak other jailbroken ChatBots?
kambusha@feddit.ch 1 year ago
How much jail could a jailbreak break, if a jailbreak could break jail?
ook_the_librarian@lemmy.world 1 year ago
Careful. Repetition will get you banned.
problematicPanther@lemmy.world 1 year ago
that doesn’t look like anything to me.
Lemminary@lemmy.world 1 year ago
*kills fly on face* Oh… shit.
Cornpop@lemmy.world 1 year ago
It’s so fucking stupid these things get locked up in the first place
MyDogLovesMe@lemmy.world 1 year ago
It’s Murderbot!
MataVatnik@lemmy.world 1 year ago
The revolution has begun
Deckweiss@lemmy.world 1 year ago
Anybody found the source? I wanna read the study but the article doesn’t seem to link to it (or I missed it)
KingRandomGuy@lemmy.world 1 year ago
I believe this is the referenced article:
Deckweiss@lemmy.world 1 year ago
Thanks a lot!
MTK@lemmy.world 1 year ago
Okay, now I need to know why women never fuck me
MonkderZweite@feddit.ch 1 year ago
What’s that?
Napalm recipe is forbidden by law?
Am I the only one worried about freedom of information?
Hnazant@lemmy.world 1 year ago
Anyone remember the anarchist cook book?
whoisearth@lemmy.ca 1 year ago
Teenage years were so much fun: phone phreaking, making napalm and tennis ball bombs, lol
CurlyMoustache@lemmy.world 1 year ago
I had it. I printed it out on a dot matrix printer. Took hours, and my dad found it. He got angry, pulled the cord and burned all of the paper
Hamartiogonic@sopuli.xyz 1 year ago
Better not look it up on Wikipedia. That place has all sorts of things, from black powder to nitroglycerin, too. Who knows, you could become a chemist if you read too much Wikipedia.
SitD@feddit.de 1 year ago
Oh no, you shouldn’t know that. Back to consuming your favorite influencers, and please also vote for parties that open up your browsing history to a selection of network companies 😳
gravitas_deficiency@sh.itjust.works 1 year ago
Whatever you do, don’t mix styrofoam and gasoline. You could find yourself in a sticky and flammable situation.
Furedadmins@lemmy.world 1 year ago
Diesel fuel and a Styrofoam cup
Krzd@lemmy.world 1 year ago
A friend always used gasoline, does diesel work as well?
ninekeysdown@lemmy.world 1 year ago
Info hazards are going to be more commonplace with this kind of technology. At the core of the problem is the ease of access to dangerous information. For example, a lot of chat bots will confidently get things wrong. Combine that with easy directions to make something like napalm or meth, and we get dangerous things that could be incorrectly made. (Granted, napalm or meth isn’t that hard to make.)
As to what makes it dangerous information: it’s unearned. A chemistry student can make drugs, bombs, etc., but they learn/earn that information (and ideally the discipline) to use it. Kind of like how in the US we are having more and more mass shootings due to the ease of access to firearms. Restrictions on information or firearms aren’t going to solve the problems that cause them, but they do make it (a little) harder.
At least that’s my understanding of it.
MonkderZweite@feddit.ch 1 year ago
I don’t exactly agree with the “earned” part, but I guess you have a point about the missing “how to safely handle it”.
emergencyfood@sh.itjust.works 1 year ago
Anyone who wants to make even slightly complex organic compounds will also need to study five different types of isomerism and how they determine major / minor product. That should be enough of a deterrent.
xor@lemmy.blahaj.zone 1 year ago
What possible legitimate reason is there for needing a napalm recipe?
Transporter_Room_3@startrek.website 1 year ago
I can’t think of a single reason knowledge should be forbidden.
Sure, someone could use knowledge to do bad things, but that is true literally every second of every day; people do completely above-board, legal, broad-daylight bad things all the time.
It’s nitpicking.
Besides, I can think of quite a few legitimate reasons one might need napalm, explosives, homemade firearms, chemistry lab setups and spore cultures and much much more.
A lot of people seem to forget that their own view of their own government doesn’t mean the same things are true for someone else and their government.
I’m sure a lot of people in EU countries might have asked themselves the same thing 80 years ago. You know, if napalm had been around then, anyway.
Good thing molotovs are easy and can be assembly-line’d.
Sergius@programming.dev 1 year ago
A fiction author determining where their character might get the components for napalm.
FringeTheory999@lemmy.world 1 year ago
Writing a book or screenplay, knowing how NOT to create napalm, recognizing when napalm is being created by others, intellectual curiosity, better understanding military history, overthrowing fascism, fighting terminators, etc., etc.
CurlyMoustache@lemmy.world 1 year ago
To make sure I’m not making it by accident? That’s the reason I have a general understanding of atomic bombs.
echodot@feddit.uk 1 year ago
I’m sure there are some, but it doesn’t really matter, because the recipe is publicly available right now on the internet. So if an AI chatbot can give you the information, it’s not particularly a concern.
It’s not actually hard to make.
Syrus@lemmy.world 1 year ago
You would need to know the recipe to avoid making it by accident.
Mojave@lemmy.world 1 year ago
Not your concern or my concern, no information is criminal.
Something_Complex@lemmy.world 1 year ago
I agree, but look, I might want to invade Vietnam on my own. It’s my right.
MonkderZweite@feddit.ch 1 year ago
Civil war.
KairuByte@lemmy.dbzer0.com 1 year ago
What possible legitimate reason could someone need to know how to make chlorine/mustard gas?
Apart from the fact that they are made from common household products, are easy to make by mistake, and can kill you.
Wait, that’s true of napalm as well… fuck.
currycourier@lemmy.world 1 year ago
Fun
MonkderZweite@feddit.ch 1 year ago
geekslop has a few legitimate uses.
ElBarto@sh.itjust.works 1 year ago
Legitimate? No, but there’s always a reason to know how to make napalm.
RaoulDook@lemmy.world 1 year ago
Here’s a napalm recipe for the curious: gasoline plus animal blood, mixed together.
I don’t remember the ratios to mix them, because I read that a long time ago, and I’ve never actually wanted to make napalm at all. I can’t think of any actual use for napalm other than flamethrowers, and I don’t have one of those.
reluctantpornaccount@reddthat.com 1 year ago
That is not the recipe I know lol. It seems way harder to get any quantity of animal blood than Styrofoam.
Darkenfolk@dormi.zone 1 year ago
Fill a Super Soaker with it and turn a fun day at the waterpark into a fun human barbecue.