Comment on Promised myself I will support them after they go stable. They kept their promise and so did I
upsidebeet@sh.itjust.works 2 days ago [deleted]
sugar_in_your_tea@sh.itjust.works 2 days ago
The fault lies with the makers and users of the software. Software doesn’t have political opinions; it’s software.
It’s like saying Panzer tanks were fascist because they were built by the Nazis. Tanks cannot be fascist, they’re tanks. So despite being made and used by fascists, they’re not fascist, they’re tanks.
That’s the exact same thing here. Facial recognition software can be used by fascists, but that doesn’t make the software itself fascist.
PeriodicallyPedantic@lemmy.ca 2 days ago
The other person deleted their comment so I can’t really know what the argument was, but I would like to make a distinction:
While tools cannot be political themselves, tools can lend themselves to specific political purposes.
A tank cannot itself be fascist, but it can make fascism more viable. Surveillance software cannot be political, but it is easily abused by fascists to destroy political opposition.
What matters is the harm and benefits. Is the harm caused by the tool justified by its benefits? Or are the primary use cases for the tool to prop up fascism?
(I suspect that “authoritarianism” would be a better term to use here, but I’m continuing the theme of the thread)
sugar_in_your_tea@sh.itjust.works 1 day ago
Their argument was that software can, in itself, be fascist, and that’s what we went around and around on. The example given was facial recognition software that can determine race (and later, country of origin).
Essentially, I said exactly what you’re saying, while they argued the opposite. I wish I had quoted them, but I only directly addressed their claims, if you’ll take my word for it.
I don’t want the government to have and use facial recognition software (their example) or extensive security camera systems (my example, such as Flock), not because those solutions are fascist in and of themselves, but because they can be used by fascists to accomplish their goals. Even if the current regime uses them purely for good (i.e. completely opt-in facial recognition, cameras inaccessible to police until there’s a warrant, no passive collection), the next regime may not.
PeriodicallyPedantic@lemmy.ca 1 day ago
The extension of the argument I’m making (and maybe them kinda?) is that it’s functionally the same as if the software were political.
You can make software that nearly exclusively benefits a particular political belief or family of beliefs.
So even if it’s not actually technically political, it can be functionally political, at which point the argument is splitting hairs.
upsidebeet@sh.itjust.works 2 days ago
sugar_in_your_tea@sh.itjust.works 2 days ago
Again, it’s not the software itself that’s fascist, it’s what it’s being used for that’s fascist. Facial recognition for determining citizenship could absolutely be used for non-fascist purposes, like simplifying border crossings so they don’t require documentation. Likewise, surveillance systems can be restricted so they’re only used once there’s an actual warrant (i.e. no passive recording).
The technology itself isn’t fascist; how it’s applied is. The mass data collection is fascist, but the tools used to collect that data aren’t fascist, in the same way that guns and tanks aren’t fascist, though they can certainly be used by fascists.