Comment on Promised myself I will support them after they go stable. They kept their promise and so did I
upsidebeet@sh.itjust.works 2 days ago [deleted]
Comment on Promised myself I will support them after they go stable. They kept their promise and so did I
upsidebeet@sh.itjust.works 2 days ago
sugar_in_your_tea@sh.itjust.works 2 days ago
No, that’s not fascist. Facial recognition software can be used for a variety of reasons, like unlocking a phone or laptop, gaining access to secure areas, or home automation stuff.
It’s only fascist if used by a government to oppress minorities. The software itself cannot be fascist, but it can be used by fascists.
upsidebeet@sh.itjust.works 2 days ago
sugar_in_your_tea@sh.itjust.works 1 day ago
The fault lies with the makers and users of the software. Software doesn’t have political opinions; it’s just software.
It’s like saying Panzer tanks were fascist because they were built by the Nazis. Tanks cannot be fascist, they’re tanks. So despite being made and used by fascists, they’re not fascist, they’re tanks.
It’s the exact same thing here. Facial recognition software can be used by fascists, but that doesn’t make the software itself fascist.
PeriodicallyPedantic@lemmy.ca 1 day ago
The other person deleted their comment so I can’t really know what the argument was, but I would like to make a distinction:
While tools cannot be political themselves, tools can lend themselves to specific political purposes.
A tank cannot itself be fascist, but it can make fascism more viable. Surveillance software cannot be political, but it is easily abused by fascists to destroy political opposition.
What matters is the harm and the benefits. Is the harm caused by the tool justified by its benefits? Or are the primary use cases for the tool to prop up fascism?
(I suspect that “authoritarianism” would be a better term to use here, but I’m continuing the theme of the thread)
upsidebeet@sh.itjust.works 1 day ago