Comment on Discord walks back age verification fears for most users
ICastFist@programming.dev 1 day ago
The company’s deploying AI-powered inference to estimate user ages based on behavioral patterns, account history, and other signals already in its systems. Only when that automated prediction fails or flags uncertainty will users face requests for manual verification through ID uploads or facial scanning.
Oh, so it’s even easier to game it and pretend to be “an adult”. Simply avoid any Minecraft and Roblox communities to become an instant adult.
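For what it’s worth, the routing the article describes boils down to a confidence-gated classifier. A minimal sketch of that idea (pure guesswork on my part; the threshold, names, and outcomes are made up, not anything Discord has published):

```python
from dataclasses import dataclass

@dataclass
class AgeEstimate:
    predicted_age: int
    confidence: float  # 0.0 to 1.0, from whatever model scores the account signals

def route_user(estimate: AgeEstimate, adult_age: int = 18,
               confidence_floor: float = 0.9) -> str:
    """Send a user to manual verification only when the model is unsure."""
    if estimate.confidence < confidence_floor:
        return "manual_verification"  # ID upload or facial scan
    return "treat_as_adult" if estimate.predicted_age >= adult_age else "treat_as_minor"

# A confident prediction never touches ID checks; an uncertain one always does.
print(route_user(AgeEstimate(predicted_age=27, confidence=0.97)))  # treat_as_adult
print(route_user(AgeEstimate(predicted_age=16, confidence=0.55)))  # manual_verification
```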
Discord didn’t specify exactly what percentage constitutes a “vast majority,” nor did it detail which signals feed into its age prediction models. That lack of transparency could become its own issue as regulators increasingly scrutinize how platforms handle youth safety versus privacy rights.
Big brother is watching you masturbate, for your own good, of course.
For the subset of users who do get flagged for manual verification, Discord says it’s partnering with third-party services that specialize in age verification tech. These vendors typically process ID documents or facial scans without permanently storing biometric data, though implementation details remain vague.
It’s not permanent storage if they delete it after safely selling it to interested buyers!
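To be fair, the “no permanent storage” claim usually means a verify-and-discard flow where only the yes/no outcome is kept. A toy sketch of that pattern (entirely hypothetical; no real vendor API implied):

```python
import time

def vendor_check(document_bytes: bytes) -> bool:
    """Stand-in for the vendor's real ID/face check; placeholder logic only."""
    return len(document_bytes) > 0

def check_and_discard(document_bytes: bytes) -> dict:
    is_adult = vendor_check(document_bytes)
    # Only the outcome and a timestamp are retained; the raw scan is never
    # written anywhere, so it's gone once this function returns.
    return {"is_adult": is_adult, "checked_at": time.time()}

print(check_and_discard(b"id-scan-bytes"))  # {'is_adult': True, 'checked_at': ...}
```

Of course, that only guarantees anything if you trust code you can’t see, which is exactly the point being made here.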
As lawmakers ramp up age verification requirements globally, expect more platforms to walk this tightrope between compliance and community trust.
I wonder if this is what might actually push more people toward “host your own shit” setups that are easy to shut down and migrate as needed.
hector@lemmy.today 22 hours ago
They will never delete it anyway. They might park it in a different spot, sell it as well, then delete it from where they originally stored it. The new spot they park it in might even have a degree or two of separation from the company. But they never delete information, don’t ever trust that they do.