This is the best summary I could come up with:
While companies like BetterHelp hope to address some of the barriers that prevent people from seeking therapy, such as a dearth of trained practitioners in their area or difficulty finding a therapist they can relate to, there is a concerning side to many of these platforms.
Last year, the US Federal Trade Commission handed BetterHelp a $7.8m (£6.1m) fine after the agency found that it had deceived consumers and shared sensitive data with third parties for advertising purposes, despite promising to keep such information private.
Independent watchdogs such as the Mozilla Foundation, a global nonprofit that attempts to police the internet for bad actors, have identified platforms exploiting opaque regulatory grey areas to either share or sell sensitive personal information.
Like many others who have researched this rapidly growing industry – the digital mental health apps market has been predicted to be worth $17.5bn (£13.8bn) by 2030 – Caltrider, of the Mozilla Foundation, feels that tighter regulation and oversight of these platforms, which are aimed at a particularly vulnerable segment of the population, are long overdue.
Holly Coole, senior manager for digital mental health at the MHRA (the UK's Medicines and Healthcare products Regulatory Agency), explains that while data privacy is important, the main focus of the project is to achieve a consensus on the minimum safety standards for these tools.
But at the same time, experts still firmly believe that, if regulated appropriately, mental health apps can play an enormous role in improving access to care, collecting useful data that can aid in reaching an accurate diagnosis, and filling gaps left by overstretched healthcare systems.
The original article contains 1,773 words, the summary contains 256 words. Saved 86%. I’m a bot and I’m open source!
AtmaJnana@lemmy.world 9 months ago
BetterHelp is a scam.
wizardbeard@lemmy.dbzer0.com 9 months ago
Fug. Got any articles/videos on that? Was considering them due to convenience.
grabyourmotherskeys@lemmy.world 9 months ago
How does that make you feel? /s