spooky2092
@spooky2092@lemmy.blahaj.zone
- Comment on 1 day ago:
Just go base 2 and you can count to 31 without using knuckles
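For anyone counting along: each finger is one binary digit, so one hand covers 0 through 2^5 − 1 = 31 (and both hands get you to 1023). A quick illustrative Python sketch (the finger-to-bit assignment is just my own convention):

```python
# One hand, little-endian: thumb = 1, index = 2, middle = 4, ring = 8, pinky = 16.
def fingers_up(n: int) -> str:
    """Return which fingers are raised to show n in binary on one hand (0..31)."""
    assert 0 <= n <= 31, "one hand only covers 0..31"
    names = ["thumb", "index", "middle", "ring", "pinky"]
    return ", ".join(name for i, name in enumerate(names) if n & (1 << i)) or "closed fist"

print(fingers_up(31))  # thumb, index, middle, ring, pinky -- all five up
print(fingers_up(4))   # just the middle finger; use responsibly
```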
- Comment on Trying to avoid antitrust suits, Google senior executives told employees to destroy messages 2 days ago:
I think you’ve got it backwards. Like the other person said, this shit was built to make money, and the power and control came later. Said power and control also came partially from the money, since money is just power coupons, and they used that to buy up competitors and regulators alike.
- Comment on Trying to avoid antitrust suits, Google senior executives told employees to destroy messages 2 days ago:
Not quite sure exactly which of these that slots into: tampering with evidence, obstruction of justice, or not complying with the discovery process.
Pretty sure that’s a hat trick
- Comment on Well played, Todd: You can see Skyrim, or at least its tallest mountain, from the edge of Oblivion Remastered 2 days ago:
Call me when you can actually move between regions.
- Comment on Philosophy moment 3 days ago:
How can one high school eat so many phones?
- Comment on Things are about to finally change! 4 days ago:
I’m here for shitposts, not shit posts.
- Comment on Smh being fired for police work 4 days ago:
Who doesn’t enjoy a policeman’s ball? I know it’s called a “circle jerk” elsewhere, but let’s respect their culture
- Comment on *dies of cringe* 1 week ago:
🌎🧑‍🚀🔫👩‍🚀
- Comment on The therapy I can afford 1 week ago:
>Goddamn you guys are the most paranoid people I’ve ever witnessed. What in the world do you think mega corps are going to do to me for sharing incoherent nonsense to Facebook?
You, 10-20 years ago. I heard these arguments from people in the early days, well before Facebook blew up or Cambridge Analytica was a name any normies knew.
This isn’t the early 00s anymore, where we can pretend that every big corp isn’t vacuuming up every shred of data they can. Add on the fascistic government taking shape in the US and the general trend toward right-leaning parties gaining power in governments across the world, and you’d have to be completely naive not to see the issues with using a ‘therapist’ that will save every data point to its training set and could be mined to use against you, or willingly handed over to an oppressive government to use however they so choose.
- Comment on What are some FOSS programs that are objectively better than their proprietary counterparts? 1 week ago:
It’s even better when tied to an automation app. I’ve got FileFlows sitting in my media library, so any time I drop new stuff in, it automatically gets converted to my preferred on-disk format.
I still get some that I have to touch manually, but most of it gets taken care of without me even thinking about it.
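(FileFlows handles the actual flow config, which I won’t reproduce here; purely to illustrate the same watch-and-convert idea, here’s a rough stdlib-only Python sketch – the drop folder, target container, and ffmpeg options below are placeholders, not my real setup.)

```python
#!/usr/bin/env python3
"""Rough sketch of a watch-and-convert loop: poll a media folder and remux
anything that isn't already in the preferred container. Paths, extensions,
and ffmpeg options are placeholders, not a real FileFlows configuration."""
import subprocess
import time
from pathlib import Path

LIBRARY = Path("/srv/media/incoming")   # hypothetical drop folder
PREFERRED_EXT = ".mkv"                  # preferred on-disk container
SOURCE_EXTS = {".mp4", ".avi", ".wmv"}  # containers to convert

def convert(src: Path) -> None:
    dst = src.with_suffix(PREFERRED_EXT)
    # Copy the existing streams into the new container; swap "-c copy" for
    # e.g. "-c:v libx265" if you actually want to transcode.
    result = subprocess.run(
        ["ffmpeg", "-n", "-i", str(src), "-c", "copy", str(dst)],
        capture_output=True,
    )
    if result.returncode == 0:
        src.unlink()  # drop the original once the conversion succeeded

def main() -> None:
    while True:
        for f in LIBRARY.rglob("*"):
            if f.is_file() and f.suffix.lower() in SOURCE_EXTS:
                convert(f)
        time.sleep(60)  # poll once a minute

if __name__ == "__main__":
    main()
```

A real setup also needs to check that a file has finished copying before touching it and to handle failed conversions, which is a big part of what the automation app does for you.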
- Comment on What are some FOSS programs that are objectively better than their proprietary counterparts? 1 week ago:
This is where I’m at. I may use a second note taking app, but I’ve always got vscodium up anyway, so may as well just make 1 more tab (probably in the 2nd window tho)
- Comment on Tesla speeds up odometers to avoid warranty repairs, U.S. lawsuit claims: Reuters 1 week ago:
The truck shorts out in the rain and catches fire instead of leaving the flaming tracks
- Comment on Tesla speeds up odometers to avoid warranty repairs, U.S. lawsuit claims: Reuters 1 week ago:
I’ll one up them, make it one of those shitty NFT monkey cartoons
- Comment on A reminder that the majority of anti-Reddit *hardliners* went back crawling to Reddit like the bitches they are 1 week ago:
So no, you don’t have any evidence other than your own conjecture.
- Comment on A reminder that the majority of anti-Reddit *hardliners* went back crawling to Reddit like the bitches they are 1 week ago:
Do you have any evidence that they went back to reddit and didn’t just account/instance hop? I’m in that thread under a previous account, but this is like my 4th or 5th account. I go back to some of them, but I like scoping out different sections of Lemmy/mbin to get a feel for where has the best vibe.
- Comment on This is why we have a defense budget 1 week ago:
[Y]our argument assumes the relationship goes both ways, which isn’t even close to true. An individual weeb may be a MAGAt, but all weebs aren’t.
So your argument pretty much falls apart, unless you’re just going after weebs for personal reasons.
- Comment on This is why we have a defense budget 1 week ago:
>scratch a Musk-stan and you’ll likely find a Weeb who is also a MAGAt.
So your argument is that weebs are musk stans/MAGAts?
If not, what point are you trying to make? Because a weeb may be a musk stan/MAGAt, but let’s not target weebs just because some of them are shitheads.
>These aren’t exclusive.
But your argument assumes the relationship goes both ways, which isn’t even close to true. An individual weeb may be a MAGAt, but all weebs aren’t.
So your argument pretty much falls apart, unless you’re just going after weebs for personal reasons.
- Comment on This is why we have a defense budget 1 week ago:
I would say let’s compare that to the MAGAts and see who has more, but MAGAts can get pardoned by Donvict #47, so it’s not a fair comparison.
- Comment on Can you believe it? 2 weeks ago:
Lousy smarch weather
- Comment on This is why we have a defense budget 2 weeks ago:
Nah, that weeb is probably fine. Save your missile strikes for the thin blue line/ammosexual/deranged trump supporter trucks
- Comment on The US Secretary of Education referred to AI as ‘A1,’ like the steak sauce 2 weeks ago:
Until she choke slams a different cabinet member and takes their ~~belt~~ position
- Comment on Google give a 71% discount to US federal agencies for Workspace, as it looks to capitalize on the Trump administration's cost-cutting push. 2 weeks ago:
Microsoft’s support also suuuuuuuuucks. We paid $500 once for assistance on an issue with a specific piece of hardware and the OS, and it took them MONTHS to even respond to us. I’d been demanding a refund for at least a full quarter before they even gave me the first response…
- Comment on Google give a 71% discount to US federal agencies for Workspace, as it looks to capitalize on the Trump administration's cost-cutting push. 2 weeks ago:
Or maybe they just expect every state government (or worse, individual local municipalities) to roll their own personal cloud. Like have everybody set up a NextCloud server and just hope shit doesn’t fall over.
- Comment on Adobe Gets Bullied Off Bluesky 2 weeks ago:
I don’t necessarily agree that this means Bluesky is toxic (and I’m speaking as someone who doesn’t use the app); I see it as a toxic company finding out what people think of them.
As you noted, Adobe is a dick (and that’s one hell of an understatement), and they regularly make anti-consumer choices with their software and pricing. This is just them seeing what they’ve been ignoring for a decade or more.
Maybe the multi-billion-dollar company should ~~grow~~ subscribe to my monthly subscription to thicker skin.
- Comment on POV: You're too shy to tell the medical staff that you just woke up during surgery. 3 weeks ago:
General anesthesia in a C-section means there’s some kind of emergency on the mother’s end, and once the drugs are administered the surgery needs to be done FAST because they can affect the baby.
Yeah… it’s a scary af time, especially since the general can take a long time to wear off and for the mother to stabilize.
- Comment on Microsoft employee disrupts 50th anniversary and calls AI boss ‘war profiteer’ 3 weeks ago:
Holy shit, I remember playing NationStates as a youngling, and I think I read the book too? Idk, it’s been 20 years.
- Comment on Microsoft employee disrupts 50th anniversary and calls AI boss ‘war profiteer’ 3 weeks ago:
(continued)
China is joining in with AI
Last month, the New York Times reported on a new disinformation campaign. “Spamouflage” is an effort by China to divide Americans by combining AI with real images of the United States to exacerbate political and social tensions in the U.S. The goal appears to be to cause Americans to lose hope, by promoting exaggerated stories with fabricated photos about homeless violence and the risk of civil war.
As Ladislav Bittman, a former Czechoslovakian secret police operative, explained about Soviet disinformation, the strategy is not to invent something totally fake. Rather, it is to act like an evil doctor who expertly diagnoses the patient’s vulnerabilities and exploits them, “prolongs his illness and speeds him to an early grave instead of curing him.”
The influence networks are vastly more effective than platforms admit
Russia now runs its most sophisticated online influence efforts through a network called Fabrika. Fabrika’s operators have bragged that social media platforms catch only 1% of their fake accounts across YouTube, Twitter, TikTok, Telegram, and other platforms.
But how effective are these efforts? By 2020, Facebook’s most popular pages for Christian and Black American content were run by Eastern European troll farms tied to the Kremlin. And Russia doesn’t just target angry Boomers on Facebook. Russian trolls are enormously active on Twitter. And, even, on Reddit.
It’s not just false facts
The term “disinformation” undersells the problem. Because much of Russia’s social media activity is not trying to spread fake news. Instead, the goal is to divide and conquer by making Western audiences depressed and extreme.
Sometimes, through brigading and trolling. Other times, by posting hyper-negative or extremist posts or opinions about the U.S. and the West over and over, until readers assume that’s how most people feel. And sometimes, by using trolls to disrupt threads that advance Western unity.
As the RAND think tank explained, the Russian strategy is volume and repetition, from numerous accounts, to overwhelm real social media users and create the appearance that everyone disagrees with, or even hates, them. And it’s not just low-quality bots. Per RAND,
>Russian propaganda is produced in incredibly large volumes and is broadcast or otherwise distributed via a large number of channels. … According to a former paid Russian Internet troll, the trolls are on duty 24 hours a day, in 12-hour shifts, and each has a daily quota of 135 posted comments of at least 200 characters.
What this means for you
You are being targeted by a sophisticated PR campaign meant to make you more resentful, bitter, and depressed. It’s not just disinformation; it’s also real-life human writers and advanced bot networks working hard to shift the conversation to the most negative and divisive topics and opinions.
It’s why some topics seem to go from non-issues to constant controversy and discussion, with no clear reason, across social media platforms. And a lot of those trolls are actual, “professional” writers whose job is to sound real.
So what can you do? To quote WarGames: The only winning move is not to play. The reality is that you cannot distinguish disinformation accounts from real social media users. Unless you know whom you’re talking to, there is a genuine chance that the post, tweet, or comment you are reading is an attempt to manipulate you – politically or emotionally.
Here are some thoughts:
- Don’t accept facts from social media accounts you don’t know. Russian, Chinese, and other manipulation efforts are not uniform. Some will make deranged claims, but others will tell half-truths. Or they’ll spin facts about a complicated subject, be it the war in Ukraine or loneliness in young men, to give you a warped view of reality and spread division in the West.
- Resist groupthink. A key element of manipulation networks is volume. People are naturally inclined to believe statements that have broad support. When a post gets 5,000 upvotes, it’s easy to think the crowd is right. But “the crowd” could be fake accounts, and even if they’re not, the brilliance of government manipulation campaigns is that they say things people are already predisposed to think. They’ll tell conservative audiences something misleading about a Democrat, or make up a lie about Republicans that catches fire on a liberal server or subreddit.
- Don’t let social media warp your view of society. This is harder than it seems, but you need to accept that the facts – and the opinions – you see across social media are not reliable. If you want the news, do what everyone online says not to: look at serious, mainstream media. It is not always right. Sometimes, it screws up. But social media narratives are heavily manipulated by networks whose job is to ensure you are deceived, angry, and divided.
- Comment on Microsoft employee disrupts 50th anniversary and calls AI boss ‘war profiteer’ 3 weeks ago:
Copying their post over (with minimal formatting, unfortunately) for anyone that doesn’t care to go to that site (and to make sure it doesn’t randomly disappear)
r/self, 5 mo. ago, walkandtalkk: “You’re being targeted by disinformation networks that are vastly more effective than you realize. And they’re making you more hateful and depressed.”
(I wrote this post in March and posted it on r/GenZ. However, a few people messaged me to say that the r/GenZ moderators took it down last week, though I’m not sure why. Given the flood of divisive, gender-war posts we’ve seen in the past five days, and several countries’ demonstrated use of gender-war propaganda to fuel political division in multiple countries, I felt it was important to repost this. This post was written for a U.S. audience, but the implications are increasingly global.)
TL;DR: You know that Russia and other governments try to manipulate people online. But you almost certainly don’t know just how effectively orchestrated influence networks are using social media platforms to make you – individually – angry, depressed, and hateful toward each other. Those networks’ goal is simple: to cause Americans and other Westerners – especially young ones – to give up on social cohesion and to give up on learning the truth, so that Western countries lack the will to stand up to authoritarians and extremists.
And you probably don’t realize how well it’s working on you.
This is a long post, but I wrote it because this problem is real, and it’s much scarier than you think.
How Russian networks fuel racial and gender wars to make Americans fight one another
In September 2018, a video went viral after being posted by In the Now, a social media news channel. It featured a feminist activist pouring bleach on a male subway passenger for manspreading. It got instant attention, with millions of views and wide social media outrage. Reddit users wrote that it had turned them against feminism.
There was one problem: The video was staged. And In the Now, which publicized it, is a subsidiary of RT, formerly Russia Today, the Kremlin TV channel aimed at foreign, English-speaking audiences.
As an MIT study found in 2019, Russia’s online influence networks reached 140 million Americans every month – the majority of U.S. social media users.
Russia began using troll farms a decade ago to incite gender and racial divisions in the United States
In 2013, Yevgeny Prigozhin, a confidante of Vladimir Putin, founded the Internet Research Agency (the IRA) in St. Petersburg. It was the Russian government’s first coordinated facility to disrupt U.S. society and politics through social media.
Here’s what Prigozhin had to say about the IRA’s efforts to disrupt the 2022 election:
>Gentlemen, we interfered, we interfere and we will interfere. Carefully, precisely, surgically and in our own way, as we know how. During our pinpoint operations, we will remove both kidneys and the liver at once.
In 2014, the IRA and other Russian networks began establishing fake U.S. activist groups on social media. By 2015, hundreds of English-speaking young Russians worked at the IRA. Their assignment was to use those false social-media accounts, especially on Facebook and Twitter – but also on Reddit, Tumblr, 9gag, and other platforms – to aggressively spread conspiracy theories and mocking, ad hominem arguments that incite American users.
In 2017, U.S. intelligence found that Blacktivist, a Facebook and Twitter group with more followers than the official Black Lives Matter movement, was operated by Russia. Blacktivist regularly attacked America as racist and urged black users to reject major candidates. On November 2, 2016, just before the 2016 election, Blacktivist’s Twitter urged Black Americans: “Choose peace and vote for Jill Stein. Trust me, it’s not a wasted vote.”
Russia plays both sides – on gender, race, and religion
The brilliance of the Russian influence campaign is that it convinces Americans to attack each other, worsening both misandry and misogyny, mutual racial hatred, and extreme antisemitism and Islamophobia. In short, it’s not just an effort to boost the right wing; it’s an effort to radicalize everybody.
Russia uses its trolling networks to aggressively attack men. According to MIT, in 2019, the most popular Black-oriented Facebook page was the charmingly named “My Baby Daddy Aint Shit.” It regularly posts memes attacking Black men and government welfare workers. It serves two purposes: Make poor black women hate men, and goad black men into flame wars.
MIT found that My Baby Daddy is run by a large troll network in Eastern Europe likely financed by Russia.
But Russian influence networks are also aggressively misogynistic and aggressively anti-LGBT.
On January 23, 2017, just after the first Women’s March, the New York Times found that the Internet Research Agency began a coordinated attack on the movement. Per the Times:
>More than 4,000 miles away, organizations linked to the Russian government had assigned teams to the Women’s March. At desks in bland offices in St. Petersburg, using models derived from advertising and public relations, copywriters were testing out social media messages critical of the Women’s March movement, adopting the personas of fictional Americans.
>They posted as Black women critical of white feminism, conservative women who felt excluded, and men who mocked participants as hairy-legged whiners.
But the Russian PR teams realized that one attack worked better than the rest: They accused its co-founder, Arab American Linda Sarsour, of being an antisemite. Over the next 18 months, at least 152 Russian accounts regularly attacked Sarsour. That may not seem like many accounts, but it worked: They drove the Women’s March movement into disarray and eventually crippled the organization.
Russia doesn’t need a million accounts, or even that many likes or upvotes. It just needs to get enough attention that actual Western users begin amplifying its content.
A former federal prosecutor who investigated the Russian disinformation effort summarized it like this:
>It wasn’t exclusively about Trump and Clinton anymore. It was deeper and more sinister and more diffuse in its focus on exploiting divisions within society on any number of different levels.
As the New York Times reported in 2022,
>There was a routine: Arriving for a shift, [Russian disinformation] workers would scan news outlets on the ideological fringes, far left and far right, mining for extreme content that they could publish and amplify on the platforms, feeding extreme views into mainstream conversations.
- Comment on Maybe it's just a human thing. 3 weeks ago:
>Christian faith that only exists in church is a false faith
They’re probably going for how the religion is not what it’s supposed to be.