The military, the department of government responsible for mass murder, should not have any fucking AI in their system, absolutely anywhere. Doubly so without any sort of guardrails.
Cruel@programming.dev 6 hours ago
This honestly strikes me as a story people don’t understand. Mass surveillance is not lawful and the government thus agreed not to do that. However, they still needed the guardrails removed. People interpret this as them wanting mass surveillance, but that’s not necessarily true.
I work for a company that uses AI for legal work, processing and analyzing court cases, discovery documents, etc. We had problems with AI models like Gemini and GPT refusing to do what we needed because of guardrails against violence and abuse of minors. They refused to discuss and analyze cases that involved murders described in detail, cases involving child molestation, and so on. We weren't using them for unlawful purposes, very much the opposite.
I feel like if people knew that we, like the DoD, had to use uncensored models that allowed such things, people would complain “Wow, you guys are trying to remove guardrails for child porn and violence! How terrible!”
Is it so shocking that a military needs their AI to work with such things even if they’re not implementing it? They cannot afford to have AI in critical moments be like “sorry, my guidelines say I can’t help with this.”
This seems like the time Trump advised pregnant women against using Tylenol, so people started buying and using it in protest. This is yet another reaction to Trump punishing them, but people are pretending Anthropic is making a stand for the people and OpenAI somehow isn't. It's not that simple.
Ulrich@feddit.org 4 hours ago
Cruel@programming.dev 4 hours ago
Why? I can’t think of any reason that would not also preclude the usage of all computer-assisted tools.
Ulrich@feddit.org 4 hours ago
Because no other computer assisted tools are straight up fucking wrong half the time?
Cruel@programming.dev 1 hour ago
If your AI tools are wrong half the time, you’re using them wrong. My legal AI is linked to databases of statutes and case law, providing results more reliable than most legal professionals.
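To be clear about what "linked to databases" means: the model isn't answering from memory, it's handed the relevant source text and told to answer from that. A minimal sketch of the idea (all names and the keyword-overlap retrieval are illustrative, not our actual system):

```python
# Hypothetical sketch of retrieval grounding: fetch relevant statute text
# first, then prompt the model to answer only from those sources.

def retrieve(query: str, database: dict[str, str], top_k: int = 2) -> list[str]:
    """Naive keyword-overlap retrieval over a statute database."""
    terms = set(query.lower().split())
    scored = sorted(
        database.items(),
        key=lambda kv: -len(terms & set(kv[1].lower().split())),
    )
    return [text for _, text in scored[:top_k]]

def build_prompt(query: str, database: dict[str, str]) -> str:
    """Prepend the retrieved passages so the model cites real text."""
    context = "\n".join(retrieve(query, database))
    return f"Answer using only these sources:\n{context}\n\nQuestion: {query}"

# Toy database; a real system would index full statute and case-law corpora.
statutes = {
    "18 USC 1111": "murder is the unlawful killing of a human being with malice aforethought",
    "18 USC 113": "assault within maritime and territorial jurisdiction",
}
prompt = build_prompt("what is the definition of murder", statutes)
```

The point is that the grounding step, not the base model alone, is what makes the output checkable against actual law.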
ScoffingLizard@lemmy.dbzer0.com 3 hours ago
There are bad actors in government. People working cases like the ones you did have way more integrity than the bad-faith actors in the DoD, who are breaking the law every day.
XLE@piefed.social 2 hours ago
I have a question about those guardrails. At any point, did any of your accounts get disabled for discussing abuse in this (or any) context?
(I’m guessing this happened zero times, which probably means those guardrails are just irritating suggestions designed to keep you prompting…)
Cruel@programming.dev 1 hour ago
Not cancelled. But they may have been flagged internally, I don’t know.
We weren’t violating their terms, only their built-in model guidelines.
But even with adjusted prompts, they didn’t yield reliable results. So we have to use uncensored open-weights models for many things. It’s not SOTA, but it’s better than nothing.
eyelevel@lemmy.world 5 hours ago
[deleted]
Cruel@programming.dev 1 hour ago
-
Government can certainly do illegal things. But why ever enter into a contract with ANY organization or business if you’re concerned that they can violate the terms?
-
Generally not.
-
Even if they aren’t, Anthropic started business with them months ago. Backing out now for this reason would be a pretense.
-
testusr@lemmy.world 6 hours ago
Shame on you and your company for introducing AI at all into such sensitive matters. This issue is not just about security and privacy but about outsourcing human judgement when human life is on the line.
Cruel@programming.dev 5 hours ago
Wait. You still trust human judgment?
My company has facilitated the filings of hundreds of pro se litigants who cannot afford lawyers, and helped swamped public defenders with more cases than they could ever hope to defend without just making plea deals.
Meanwhile you probably sit around complaining about the prison-industrial complex and the corrupt justice system. Doing nothing. Worthless.