It is no secret that social media sites and apps raise alarming questions about privacy, surveillance, and the collection of personal information and data. This is a current debate that deserves all the attention it is receiving and more. Therapy apps are sharing your private data.

Awareness of what happens behind the scenes (or screens, if you will) helps people understand just how much information is collected, collated, stored, and used for targeted ads (i.e., social media platforms are selling and marketing you to corporations and other entities so they can make money off of you).

Users become both the consumer and the consumed. And that is deeply problematic.

Your data and habits, mined by technology companies and their partners, are used to track your behavior and to predict and steer your spending and consumption choices. So what happens when health data gets folded into this mass of traceable, profitable information? Thanks to HIPAA, health information is supposed to stay private and, if shared at all, anonymized (and many consumer apps are not even covered by HIPAA in the first place).

Yet in the digital world we live in, where everything is quantifiable and documented electronically, breadcrumbs lead back to the source of the data: you, the user/used.

If Facebook can see when your last five therapy sessions were, how long each one lasted, and maybe even some of the topics discussed, its algorithms can predict when you are next most likely to be feeling depressed (and maybe even what triggers you).
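
To make that concrete, here is a deliberately crude sketch of how little it takes. Nothing below is any platform's real code; the event log and the heuristic are invented for illustration. The point is the input: nothing but timestamps and session lengths.

```python
# Hypothetical sketch: turning bare app-usage metadata into a behavioral
# prediction. The field names and the "low mood" heuristic are invented
# for illustration -- this is not any platform's actual model.
from datetime import datetime, timedelta

# The kind of event log an analytics SDK could collect: just open times
# and session lengths. No message content is needed.
sessions = [
    {"opened": datetime(2023, 3, 1, 23, 40), "minutes": 52},
    {"opened": datetime(2023, 3, 5, 23, 55), "minutes": 47},
    {"opened": datetime(2023, 3, 8, 23, 30), "minutes": 61},
]

def likely_low_mood_window(events):
    """Guess when the user will next reach for the app.

    Crude heuristic: average hour-of-day plus the average gap between
    sessions. Session length ("minutes") is collected but unused here.
    """
    avg_hour = sum(e["opened"].hour for e in events) / len(events)
    gaps = [(b["opened"] - a["opened"]).days
            for a, b in zip(events, events[1:])]
    avg_gap = sum(gaps) / len(gaps)
    next_window = events[-1]["opened"] + timedelta(days=avg_gap)
    return next_window.replace(hour=round(avg_hour) % 24, minute=0)

print(likely_low_mood_window(sessions))
# -> a timestamp a few days out, late at night: exactly the kind of
#    "vulnerable moment" an ad platform could bid on.
```

Real ad-targeting models are vastly more sophisticated than this, but they feed on the same kind of metadata.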

This leaves the user/used unaware, vulnerable, and very, very exposed. It all feels like something straight out of an episode of Black Mirror, and it should be terrifying.

Promoting access to mental health care is important, and we support the growing trends in awareness and willingness to seek care. But data about every time you open a therapy app, how long each conversation lasts, and the like should never be shared or used to “personalize experiences.”

The health and wellness concerns sparked by the rise of therapy apps show that more transparency and thoughtful regulation to protect patient interests are critical.

This article goes more in depth about the lengths to which these invisible hands can reach through the tiny screens we keep in our pockets. Its authors describe how they monitored the data being collected by therapy apps and how that information was then mined and used by other social media apps. It’s terrifying.
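
For readers curious how audits like that work in practice, here is a minimal sketch of watching what an app sends home, written as a mitmproxy (mitmproxy.org) addon. The tracker domains listed are common third-party endpoints chosen for illustration, not the specific ones the article identified.

```python
# Hypothetical sketch of the traffic inspection used in audits like the
# one described above. Run with a test device proxied through mitmproxy:
#   mitmdump -s log_trackers.py
from mitmproxy import http

# Common third-party analytics/ad endpoints to watch for (illustrative
# examples, not the article's findings).
TRACKER_HOSTS = (
    "graph.facebook.com",     # Facebook SDK / Graph API
    "app-measurement.com",    # Google Analytics for Firebase
    "api.mixpanel.com",       # Mixpanel
)

def request(flow: http.HTTPFlow) -> None:
    """Log any request the app under test makes to a known tracker host."""
    host = flow.request.pretty_host
    if any(host.endswith(t) for t in TRACKER_HOSTS):
        print(f"[tracker] {flow.request.method} {flow.request.pretty_url}")
        # Request bodies often carry event names and metadata
        # (app opens, session length, screen views).
        if flow.request.text:
            print(flow.request.text[:300])
```

Anyone can point a tool like this at their own phone and see, in plain text, which third parties an app talks to every time it opens.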

Finding a solution to the mental health crisis our world is facing is urgent, but not so urgent that we should settle for letting tech companies exploit our health data and use it against us to make money.

This conversation obviously encompasses the larger, multifaceted issues regarding social media and the worrisome trend of privacy oversteps that comes with all these apps.

There is no single answer, but being aware of what these companies are doing and what information they are collating and sharing about you is the right start. Don’t let therapy apps share your private data.

At Mindful, we are intentionally thoughtful about how we interact with our patients. We coach and support them on best practices to ensure we are being true to our profession’s ethics and guidelines.

Until we can build up an adequate mental health workforce of thoughtful clinicians, we urge everyone to err on the side of caution when it comes to sharing any personal, sensitive information online, no matter the medium, website, app, or privacy guarantees.