Comment on Police Unmask Millions of Surveillance Targets Because of Flock Redaction Error
JollyG@lemmy.world 3 days ago
As I said before: In a conversation about technology as it actually exists, talking about potentials is not interesting. Yes, all technology has the potential to be good or bad. The massive surveillance tech is actually bad right now, in the real world.
The issue with asserting that technology is neutral is that it lets the people who develop it ignore the impacts of their work. The engineers who make surveillance tech make it, ultimately, for immoral purposes. When they are confronted with the effects of their work on society, they avoid reckoning with the ethics of what they are doing by deploying bromides like “technology is neutral.”
Example: Building an operant conditioning feedback system into a social media app or video game is not inherently bad; you could use it to reinforce good behaviors and deploy it ethically by obtaining the consent of the people you use it on. But the operant conditioning tech in social media apps and video games that actually exists is very clearly and unambiguously bad. It exists to get people addicted to a game or media app so that they can be more easily exploited. Engineers built that tech stack out for the purpose of exploiting people. The tech, as it exists in the real world, is bad. When these folks were confronted with what they had done, they responded by claiming that tech is not inherently good or bad. (This is a real thing social media engineers really said.) They ignored the tech as it actually exists in favor of an abstract conversation about some potential alternative tech that does not exist. The effect is that the people doing harm built a terrible system without ever confronting what it was they were doing.
InternetCitizen2@lemmy.world 3 days ago
I don’t see how that is the case. The tech is neutral, but the engineers know what application they are being hired to build. That is determined by people and subject to morality.
Would you say OpenCV or the people working on it are evil? I wouldn’t. I would say that once someone takes that project and uses it for Flock, that is evil.
I think this framing is more important when talking with the general public, as they are likely to walk away thinking that it’s the tech that creates problems and not the for-profit corporations, who will be free to continue doing the same so long as they don’t use that tech.
JollyG@lemmy.world 3 days ago
It is literally the case. People who have made tools to do bad things have justified it by claiming that tech is neutral in an abstract sense. Find an engineer who is building a tool to do something they think is bad, and they will tell you that bromide.
OpenCV is not, in itself, immoral. But OpenCV is, once again, actual tech that exists in the actual world. In fact, that is how I know it is not bad: I use the context of reality, rather than hypotheticals or abstractions, to assess the morality of the tech. The tech stack that makes up Flock is bad, and once again I make that determination by using the actual world as a reference point. It does not matter that some of the tech could be used to do good. In the case of Flock, it is not, so it’s bad.
InternetCitizen2@lemmy.world 3 days ago
Bolded a keyword there for you.
JollyG@lemmy.world 3 days ago
At no point in this conversation have I ever said that tech in an abstract sense is inherently good or bad. The point that I am making, and this is the last time I will make it, is that it is not interesting to talk about the ethics of some technology in the abstract in cases where the actual tech, as it is actually implemented, is clearly bad.
Saying “tech is neutral” is a dodge. People say that to avoid thinking about the ethics of what it is they are doing.