Comment on Jailbroken AI Chatbots Can Jailbreak Other Chatbots

Transporter_Room_3@startrek.website ⁨11⁩ ⁨months⁩ ago

I can’t think of a single reason knowledge should be forbidden.

Sure, someone could use knowledge to do bad things, but that happens literally every second of every day, with completely above-board, legal means, in broad daylight.

It’s nitpicking.

Besides, I can think of quite a few legitimate reasons one might need napalm, explosives, homemade firearms, chemistry lab setups and spore cultures and much much more.

A lot of people seem to forget that their own view of their own government doesn’t mean the same is true for someone else and their government.

I’m sure a lot of people in EU countries might have asked themselves the same thing 80 years ago. You know… If napalm were around then anyway.

Good thing molotovs are easy and can be assembly-line’d.
