Techno Blender

Mental health apps have terrible privacy protections, report finds



As a category, mental health apps have worse privacy protections for users than most other types of apps, according to a new analysis from researchers at Mozilla. Prayer apps also had poor privacy standards, the team found.

“The vast majority of mental health and prayer apps are exceptionally creepy,” Jen Caltrider, the Mozilla *Privacy Not Included guide lead, said in a statement. “They track, share, and capitalize on users’ most intimate personal thoughts and feelings, like moods, mental state, and biometric data.”

In the latest iteration of the guide, the team analyzed 32 mental health and prayer apps. Of those, 29 received a “privacy not included” warning label, indicating that the team had concerns about how the app managed user data. Although the apps are designed for sensitive issues like mental health conditions, they collect large amounts of personal data under vague privacy policies, the team said in the statement. Most apps also had poor security practices, letting users create accounts with weak passwords even though the apps hold deeply personal information.

The apps with the worst practices, according to Mozilla, are BetterHelp, Youper, Woebot, Better Stop Suicide, Pray.com, and Talkspace. The AI chatbot Woebot, for example, says it collects information about users from third parties and shares user information for advertising purposes. Therapy provider Talkspace collects user chat transcripts.

The Mozilla team said in a statement that it reached out to the companies behind these apps to ask about their policies multiple times, but only three responded.

In-person, traditional mental health care can be hard for many people to find — most therapists have long waiting lists, and navigating insurance and costs can be a major barrier to care. The problem got worse during the COVID-19 pandemic when more and more people started to need care. Mental health apps sought to fill that void by making resources more accessible and readily available. But that access could come with a privacy tradeoff, the report shows.

“They operate like data-sucking machines with a mental health app veneer,” said Mozilla researcher Misha Rykov in a statement. “In other words: a wolf in sheep’s clothing.”

