cross-posted from: lemmy.sdf.org/post/34850412
TikTok is failing to address serious risks of harm to young users’ mental and physical health almost 18 months after Amnesty International highlighted these risks in a groundbreaking report.
The 2023 research revealed that children were at risk of being drawn into toxic "rabbit holes" of depression- and suicide-related content on TikTok's 'For You' feed.
In an investigation using accounts to simulate 13-year-olds online, Amnesty International found that within 20 minutes of starting a new account and signalling an interest in mental health, more than half of the videos in TikTok's 'For You' feed related to mental health struggles. Within a single hour, multiple recommended videos romanticised, normalised or encouraged suicide.
[…]
Despite TikTok's growing user base, particularly in countries with young populations like Kenya, where the median age is 20, the platform has yet to conduct basic child rights due diligence to address the risks posed to its youngest users.
TikTok's response to our latest research questions on what it is doing to make the app safer for young users reveals that seven years after becoming available internationally, the company is still waiting for an external provider to complete a child rights impact assessment for the platform, a key responsibility under international human rights standards for businesses.
[…]
TikTok states, “like other apps, TikTok collects information that users choose to provide, along with data that supports things like app functionality, security, and overall user experience” and that “viewing a video doesn’t necessarily implicate someone’s identity”.
And yet TikTok's 'For You' feed clearly picks up on a person's emotional state when it amplifies masses of depression- and even suicide-related content, and then uses their susceptibility to this content to recommend more of it, regardless of the potential harms.
[…]
technocrit@lemmy.dbzer0.com 6 hours ago
This is fairly biased propaganda. The most obvious signs are the focus on TikTok (instead of social media) and the focus on “simulated” 13yos instead of humans in general.
Amnesty should stick with actual human rights, etc. rather than pseudo-science, sinophobia, and save-the-children hysteria.