When you tell a health care provider about mental health concerns, doctor-patient confidentiality protects those conversations. That protection doesn’t necessarily extend to the many mental health apps on the market, though. Let’s look at why that’s the case.

Is Mental Health App Data Secure and Reliable?

The short answer is: it depends. One challenge relates to the sheer variety of mental health apps and how they work. They range from mood trackers and collections of meditations to chatbots and one-on-one virtual visits with trained therapists.

Each app has its own policies and its own level of data protection. There have also been numerous cases of workers at mental health app companies mining or sharing user data.

According to Salon, former employees of mental health app Talkspace said people at the company regularly looked through patient-therapist transcripts to find common phrases and use them to improve marketing to potential users.

The investigation also included the experience of a man named Ricardo Lori, a Talkspace user who took a job in the company’s customer service department. Executives asked him to read excerpts of his therapy chat logs, promising anonymity. However, word somehow got out that Lori was the patient described in the sessions.

These incidents highlight the need to always scrutinize privacy policies before signing up for a service. It’s quick and easy to agree to terms without doing that, but it could put your confidentiality at risk.

Mozilla Exposes Privacy Practices of Mental Health Apps

The Mozilla Foundation evaluated the privacy and security associated with 32 mental health and prayer apps. The results showed that 25 did not meet Mozilla’s minimum security standards, such as requiring strong passwords. The researchers also had strong concerns about how 28 apps handled user data.

Jen Caltrider, the project lead, said, “The vast majority of mental health and prayer apps are exceptionally creepy. These apps track, share, and capitalize on users’ most intimate personal thoughts and feelings, like moods, mental state, and biometric data.

“Turns out, researching mental health apps is not good for your mental health, as it reveals how negligent and craven these companies can be with our most intimate personal information.”

The Privacy Risks Don’t End With Data Sharing

An unfortunate reality is that the health care industry is rife with unscrupulous companies and individuals trying to take advantage of people who are often in desperate situations. For example, the U.S. Department of Justice charged 345 people in health care fraud schemes involving more than $6 billion in alleged losses, as reported by the non-profit organization URAC.

Even when mental health apps keep data safe, they’re not always upfront about other aspects. According to Newsweek, Katie Mac posted an experience on TikTok about signing up for the Cerebral mental health app. She took issue with how the service required payment information before matching her with a therapist.

Mac did give her details, but after doing so, found that only two therapists were available in her area, and neither had the expertise she needed. Mac then learned the app’s policy was to issue only 30 percent refunds. She called the app a “scam” and threatened to report it to state agencies.

Health Data Is Regularly Digitized

Individuals have plenty of ways to stay on top of their health through various apps. The DNA testing service 23andMe can tell people about risk factors for late-onset Alzheimer’s disease.

Meanwhile, people who use Apple Health can track everything from inhaler usage to asymmetrical walking patterns. Researchers at UCLA are also using volunteers’ data from Apple Watches and iPhones to get better insights into depression.

When you install a new health app, you’ll probably see permission settings that control which apps are allowed to read your data. Apple Health and similar services can pull information from other sources and compile it in one place, making it easier to review trends with your health care provider.
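To make that permission step concrete, here is a minimal sketch of how an iOS app might request read-only access through Apple’s HealthKit framework. The specific data types requested below are illustrative assumptions, not a recommendation, and a real app would also need a usage-description entry in its Info.plist.

import HealthKit

// A minimal, illustrative sketch: ask Apple Health for read-only access
// to a few example data types. Requires NSHealthShareUsageDescription
// in the app's Info.plist.
let healthStore = HKHealthStore()

let readTypes: Set<HKObjectType> = [
    HKObjectType.quantityType(forIdentifier: .stepCount)!,
    HKObjectType.quantityType(forIdentifier: .walkingAsymmetryPercentage)!,
    HKObjectType.categoryType(forIdentifier: .sleepAnalysis)!
]

// iOS shows the user a permission sheet listing exactly these types.
healthStore.requestAuthorization(toShare: nil, read: readTypes) { success, error in
    // Note: "success" only means the request was processed. By design,
    // iOS never tells the app whether the user actually granted read
    // access; that is itself a privacy protection.
    if let error = error {
        print("Authorization request failed: \(error.localizedDescription)")
    }
}

You can change or revoke that access later in the Health app’s sharing settings, which is exactly the kind of control worth checking before handing any app your data.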

However, health data hacks are more common than you might think. An analysis reported by Politico found that nearly three-quarters of health data breaches in 2021 involved hackers.

Another potential risk is how company acquisitions may give cybercriminals access to more data. Fitness company Fitbit, which Google agreed to acquire in 2019, has more than 30 million active users. Google itself has numerous health initiatives for both consumers and providers.

Learn Why Apps Need Your Data and What Happens to It

Given these unsettling revelations, what’s the most proactive thing you can do to keep your mental health app data secure?

Act carefully by taking the time to read how and why apps or service providers use your data. If anything in a privacy policy makes you feel uncomfortable, think twice before signing up.