Why You Won’t Find Me On A Therapy App

While in the process of starting my private therapy practice, I had several people ask me if I’d considered using a tech platform like BetterHelp or Talkspace. I vehemently said no and tried to explain why. Simply put, their practices are against my personal values, and I have reason to believe they may be antithetical to my professional ethics.

In an ideal world, it would be amazing to have a tech platform that allows me to easily sign up, get matched with clients who are a great fit for my expertise, be paid well, and help increase access to therapy for all those who need it. There’s a reason these companies exist: many people need convenient and affordable mental health support. However, it is vital to look at what any person or company actually does, not just what they say they do. The benefits these platforms offer come at a very high cost to both clients and therapists.

Some time has passed since the FTC lawsuit, and it seems that one of these tech giants is having a comeback, sponsoring many ads and social media posts. After seeing some of these in my social media feed, I felt compelled to prioritize writing this blog post, which I’ve been wanting to do for a while now.

My goal here is to give a short breakdown of the relevant healthcare legislation and regulations (and the lack thereof), explain my personal and professional ethics on the topic, and share a compilation of the resources that have informed my thoughts and opinions on this matter. I hope this is a helpful resource for those considering seeking out therapy services and for other professionals who are currently weighing their practice options.

My intent is not to shame anyone who has provided therapy through these platforms or accessed care this way. Many therapists turn to working for these platforms because their options are limited. Being a therapist isn’t the lucrative and glamorous career we see portrayed in the media. A small percentage of therapists are well off; however, most therapists aren’t doing well financially.

Spoiler Alert

Before diving in, I want to highlight that the best way to access online therapy that protects your privacy, and that is more likely to be helpful, is to go directly through a person or company that is already a covered entity under HIPAA. This includes privately owned therapy practices, community mental health agencies, hospitals, clinics, and treatment programs. Currently, this is the only way to ensure all of your information is kept as private as possible.

Additionally, do you value small businesses and individual entrepreneurship? Then consider accessing therapy in a small business setting. By doing this, you are supporting the human healthcare provider directly, not a third-party tech company and its venture capital investors, or the for-profit healthcare industrial complex.

Disclaimer

First, a disclaimer: everything stated here is my personal opinion, and nothing said here should be taken as irrefutable fact. I’ve never worked for any of these companies in any capacity, so I have no way of knowing their actual practices. I can, however, seek out and present evidence from investigative reports, which is what I’ve done here. I’m leading with this disclaimer because there have been lawsuits against people who have spoken out against these companies. Speaking truth to power does indeed come with a cost sometimes. However, I have to balance my right to free speech with the simple fact that I can’t afford to be sued by these wealthy and powerful companies, hence the disclaimer :)

HIPAA + Tech

I’m going to try to sum up what I’ve learned about privacy and data through my extensive research into mental health apps and online therapy platforms.

The Health Insurance Portability and Accountability Act, commonly known as HIPAA, is the federal law that protects the privacy of a patient’s health information.

Many people assume that if they access an app or program that offers health-related information, advice, services, or treatment, their private information is automatically protected under HIPAA law. This is not always true, and not all information is protected even when HIPAA is in play. 

To qualify as a person or organization that must abide by HIPAA, one must be a licensed healthcare provider, a healthcare organization, a health insurance company, or a business associate of a healthcare provider or organization. A business associate is often a company that provides services supporting the operation of a covered entity, such as an electronic health record system or a billing service.

When a healthcare provider or organization works with a business associate, that relationship begins once a business associate agreement (BAA) is signed. After the relationship is officially formed, the non-healthcare entity (person or company) has agreed to abide by the rules and requirements of HIPAA and to protect patient/client information as thoroughly as a healthcare provider would.

Before the internet and the apps we have today, this was all pretty straightforward. Most of these laws were written before the internet as we know it existed and haven’t been updated to reflect the current reality.

We now have literally thousands of “mental health apps” available at our fingertips. These apps offer wellness practices like meditation exercises, journals, self-help tools, and symptom trackers, as well as actual mental health therapy and psychiatric services.

Most of these apps are not regulated at all, nor do they qualify as covered entities. This means they are not legally required to keep user information private or confidential. Only certain apps fall under HIPAA’s protections, such as those provided by a healthcare company or apps that are FDA-approved for the treatment of mental health conditions.

Confusion and Privacy Concerns

This is all understandably confusing for consumers. Many people are led to believe that the information they provide on an app is protected and confidential. Companies often promise that they will not share or sell user data and, if they do, that it will be de-identified.

De-identified data is information that has been stripped of identifying details so that it (ideally) can’t be traced back to any specific individual. However, advocates are raising concerns that this isn’t always effective and that a concerning amount of data can be re-identified.
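To make the re-identification concern concrete, here is a minimal sketch of how it can work. Everything below is fabricated for illustration; it is not any company’s actual data or practice. The idea is that a “de-identified” record often still carries quasi-identifiers (like ZIP code, birth date, and gender) that can be matched against a public dataset, such as a voter roll, that pairs those same fields with names:

```python
# Toy illustration of re-identification. All records below are made up.

# A "de-identified" dataset: names removed, but quasi-identifiers kept.
deidentified_health_records = [
    {"zip": "97201", "birth_date": "1988-04-12", "gender": "F",
     "note": "anxiety screening, score 14"},
    {"zip": "97210", "birth_date": "1975-09-30", "gender": "M",
     "note": "depression screening, score 18"},
]

# A public dataset (think: voter roll) pairing the same fields with names.
public_records = [
    {"name": "Jane Doe", "zip": "97201", "birth_date": "1988-04-12", "gender": "F"},
    {"name": "John Roe", "zip": "97210", "birth_date": "1975-09-30", "gender": "M"},
]

def reidentify(health_rows, public_rows):
    """Match 'anonymous' health records to names by joining on quasi-identifiers."""
    index = {(p["zip"], p["birth_date"], p["gender"]): p["name"]
             for p in public_rows}
    matches = []
    for row in health_rows:
        key = (row["zip"], row["birth_date"], row["gender"])
        if key in index:  # a unique combination pins the record to one person
            matches.append((index[key], row["note"]))
    return matches

for name, note in reidentify(deidentified_health_records, public_records):
    print(f"{name}: {note}")
```

Privacy research going back to Latanya Sweeney’s work in 2000 found that ZIP code, birth date, and gender alone are enough to uniquely identify a large majority of Americans. That is why “we only sell de-identified data” is a much weaker promise than it sounds.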

These privacy concerns in and of themselves are problematic enough when it comes to self-help and educational resources. However, the stakes become much higher once regulated healthcare services and sensitive health information are involved. 

One can think of most apps that provide mental health counseling and therapy services as more of a shopping mall than a traditional healthcare setting. This isn’t a perfect metaphor, but it's the best I have come up with.

Unlike a clinic or hospital that you walk into, this virtual mall, by my understanding, is not instantly and automatically considered a covered entity under HIPAA as either a healthcare provider or a business associate. Many “therapy app” companies are very careful in how they create and operate their businesses to avoid this classification. They seem to intentionally remain a middleman or third party who is “just linking” the therapy seeker with the therapist. Therapy is provided by therapists who are independent contractors, not employees. This way, the app isn’t technically the entity providing healthcare services, since it doesn’t meet the criteria of a healthcare organization.

Let’s go back to the entrance of this virtual mall (therapy app). Once a user “walks in” (downloads the app), someone from that mall (therapy app) can ask the user whatever they want. Users are often prompted to provide identifying demographic information (date of birth, race, etc.), as well as sensitive information about their mental and physical health. Users are told that this information will be used to match them with a therapist who can best meet their needs.

Again, this is just my understanding after doing all the research that I have, but it seems that until the user becomes a therapy client of a specific, individual licensed healthcare provider, all of the information they have provided is fair game to be compiled and sold, because no HIPAA-covered entity has yet been involved.

This is why BetterHelp was sued: they were misleading consumers. They were not sued for the actual practice of selling their users’ data.

Your Data Is Profitable

They were not sued for selling data because it is legal to sell health-related data. If you aren’t a HIPAA-covered entity, it is legal at all times; if you are, it is legal once the user signs an agreement or the information has been de-identified.

Therapy app platform users are often required to sign something saying they agree to the organization’s practices, usually called “terms and conditions” or “privacy practices.” This agreement often allows the company to use the user’s data for its own purposes and to sell that data to other companies if it chooses.

This seems to be a legal and regulatory gray area that companies are using to profit from the sale of people’s personal and sensitive data. They collect sellable data at the beginning of the user’s experience, before HIPAA’s protection even applies, and they appear to gather more sellable data during the client’s treatment, which they are allowed to do because users “agree” to it.

If I could recommend one report about this issue, it would be the 2023 report from Duke University’s Sanford School of Public Policy, Data Brokers and the Sale of Americans’ Mental Health Data: The Exchange of Our Most Sensitive Data and What It Means for Personal Privacy, by Joanne Kim. This report highlights the financial gain these companies achieve not by offering users access to therapy, but by compiling and creating data that can be sold for considerable profit.

“If you’re getting something for free, you are the product” is often said about social media platforms, but the sentiment also applies to these apps. If you are getting a highly specialized healthcare service for a very low cost, and it isn’t being funded through insurance or some other way, that should raise some questions. You may be unknowingly paying in other ways.

Ethics

It is unclear whether a user can access therapy services on these apps without agreeing to have their data mined and sold. If they cannot, then in my opinion, this would violate my profession’s ethical informed consent requirements.

Here are a few privacy-related standards from the ACA Code of Ethics that I, as a licensed professional counselor, have the ethical and legal responsibility to adhere to:

  • Avoid causing harm to my clients.

  • Respect the privacy of “prospective and current clients.” 

  • Only disclose client information after getting the client’s consent or in the case of legal/ethical requirements (safety concerns, mandated reporting, etc.).

  • Explain any limitations to confidentiality.

  • Ensure any client information is protected when transmitted in any way (digitally, etc.).

  • Make sure that all records are confidential and only individuals with authorized access can view them.

For consent to be truly informed, people have the right to know what information is being shared, who it is being shared with and why, how that information is being used, and what steps are being taken to ensure their privacy. People can’t give informed consent without actually being informed.

Therapy clients should always have a choice about whether and how their information is shared (unless disclosure is required by law) and should have complete control over their personal information and healthcare data.

Based on my interpretation of these ethical codes, I am responsible for doing everything I can to ensure the confidentiality and privacy of the people who reach out to inquire about my services as well as those who have become my clients. I interpret these codes to mean that, ethically, prospective clients must be able to explore therapy services with as much privacy as current clients.

Therefore, in my opinion and by my interpretation, working with a vendor or employer who compromises the privacy of any future or current clients is not something I can do if I am to remain in compliance with my profession’s code of ethics, which I am legally required to do by my state licensing board.

Additional Concerns

Before you think me too harsh a critic of technology: I do use telehealth platforms to meet with my clients, and I would love to see technology used to solve problems in a safe and ethical way. But it does seem that the cart has gotten ahead of the horse, and that there has been a glaring lack of input from actual mental health professionals as the use of technology has expanded.

Privacy and therapy apps are just some of my concerns about technology and the mental health field. I’m also deeply concerned about artificial intelligence, as well as therapy practice management companies that recruit therapists as contractors and provide insurance credentialing and other services for a fee, or even for “free.” I hope to research and write about these in the future.

However, the topic of therapy apps does raise some additional concerns that I want to speak briefly about here.

I care deeply about access to therapy for all those who need it. I also deeply care about the labor rights of therapists and do not support the “uberification” of therapy. 

Therapists are not paid what they deserve on these platforms, and I cannot and will not condone any form of labor exploitation. These companies have huge advertising budgets and bring in multi-million-dollar rounds of investor funding; meanwhile, they underpay therapists and promote overwork and burnout by paying more only as therapists take on more sessions and client interactions.

Therapy is complex, deep work, and therapists deserve to be compensated for their valuable labor. We need solutions to the accessibility problem that don’t come at the personal cost of therapists or clients, such as universal healthcare, parity for mental health treatment, increased reimbursement rates, etc. 

These privacy and labor concerns are just the tip of the iceberg. There are many more reasons related to quality, ethics, and law why I personally will not be providing therapy via an app.

You’ll find a compilation of resources and articles below to learn more. I’m offering these because not only are “therapy apps” of concern, but there is essentially no regulation of “mental health apps” in general.

Until there is governmental regulation, oversight, and accountability of these apps, consumers are left with the task of protecting their own privacy. It’s up to us to research any app or program before using it and to make informed decisions that protect our safety and privacy.

Please, do your research before providing any app or website with any personal information.

Final Thoughts

A thought came to mind as I was doing the tedious task of compiling all these articles and resources: how wild is it that these companies continue to exist and do what they do, even in the face of such an extreme amount of criticism and concern from mental health professionals, journalists, and researchers?

I could offer up my thoughts on why that is here, but instead, I’ll challenge my reader to read through some of these and then take a step back and look at the bigger picture. Then, ask yourself… 

Why is all this being ignored? 

Who and what stands to benefit and profit from the lack of regulation and oversight?

As I said at the beginning, my strongest recommendation for people looking for therapy is to seek out a therapist in a private practice setting. This is the best choice for protecting your privacy and ensuring you’re working with a competent and ethical provider. I understand that finding a therapist isn’t always the easiest task. However, many therapists like myself go through the time-consuming, expensive, and difficult process of starting a practice because they care about your privacy and well-being, as well as their own. Obviously, I’m out here, but I am just one person, so you’ll find a variety of therapist directories on my website’s resource page.

If you’ve read this far, thank you! 

Resources

Professional Evaluation and Industry Reviews

Here are two resources I recommend for potential users of therapy apps; both evaluate the privacy and safety of mental health apps.

American Psychiatric Association App Advisor Initiative 

Mozilla Mental Health Apps 

Industry Reports and Professional Articles

Data Brokers and the Sale of Americans’ Mental Health Data: The Exchange of Our Most Sensitive Data and What It Means for Personal Privacy

Digital Therapy Apps: Good or Bad? | NAMI: National Alliance on Mental Illness 

FTC says online counseling service BetterHelp pushed people into handing over health information – and broke its privacy promises 

Mental health, meet venture capital 

Is This the End of the Private Practice Therapist? - Mental Health Match 

Therapy by App: A Clinical Psychologist Tries BetterHelp - Mad In America 

What psychologists need to know about online therapy services 

When Talkspace Sued PsiAN and Me 

Why BetterHelp is a Risk to Our Collective Mental Health 

Why YOU Shouldn’t Sell Out to BetterHelp: An Interview with Jeff Guenther, LPC - Therapy Reimagined

Online Articles 

At Talkspace, Start-Up Culture Collides With Mental Health Concerns

Alcohol recovery startups Monument and Tempest shared patients' private data with advertisers | TechCrunch

BetterHelp Therapy App Is Scrambling After Partnering with Travis Scott Post-Astroworld

The creepy secret behind online therapy 

Data Brokers Are Selling Long Lists of People With Depression and Anxiety 

Do Therapy Apps Really Protect Your Privacy?

Dramatic growth in mental-health apps has created a risky industry

Former employees claim Talkspace mined therapy transcripts for marketing

FTC to Ban BetterHelp from Revealing Consumers’ Data, Including Sensitive Mental Health Information, to Facebook and Others for Targeted Advertising

Gatekeepers need to tame 'Wild West' of mental health and other digital health therapeutics

Health apps share your concerns with advertisers. HIPAA can’t stop it. - The Washington Post 

How a dead veteran became the face of a therapy app's Instagram ad

How the BetterHelp scandal changed our perspective on influencer responsibility - Latest blog articles - Maastricht University 

Internal Talkspace memo illustrates challenge of delivering virtual mental health services in 50 states 

Is the “wild west” of online chat therapy doing more harm than good? 

Lyra Health, Which Provides Therapy For Google And Facebook Employees, Is Facing Concerns Over Privacy And Treatment 

Mental Health Apps Aren't All As Private As You May Think

Mental health apps might put your privacy at risk. Here's how to stay protected - ABC News 

Mental health app privacy language opens up holes for user data - The Verge

The Mental Health Therapy-App Fantasy 

Mental wellness apps are basically the Wild West of therapy

Now for sale: Data on your mental health 

Online therapy sites grapple with legal, ethical dilemmas

Overworked and Underpaid: UK Therapists Respond to US Online Platform BetterHelp

Personal user data from mental health apps being sold, report finds | PBS News Weekend

A researcher tried to buy mental health data. It was surprisingly easy. 

The struggle to make health apps truly private 

Talkspace Is a Business First and a Mental Health Resource Second, Critics Say | Truthout

Talkspace Reveals Clients' Email, Violating Clinical Confidentiality

Talkspace Shifts Dozens of Full-Time Therapy Jobs to Contracted Work - Behavioral Health Business

Talk therapy apps face new questions about data collection from senators - The Verge

Therapy apps are the Ubers of mental health

Therapy app Talkspace allegedly data-mined patients' private conversations with therapists

Therapy Sessions Exposed by Mental Health Care Firm’s Unsecured Database | WIRED

Top Mental Health and Prayer Apps Fail Spectacularly at Privacy, Security - Mozilla Foundation

The Potential Danger in Therapy Apps Like Talkspace 

The Problem With ‘Uber for Therapy’

The Spooky, Loosely Regulated World of Online Therapy

We're the researchers who looked into the privacy of 32 popular mental health apps and what we found is frightening. AMA! 

Why therapists are sounding the alarm on big box therapy companies - Upworthy 

P.S. Always read the employee reviews…

BetterHelp Reviews: What Is It Like to Work At BetterHelp? | Glassdoor 

What Is It Like to Work At Talkspace? | Glassdoor 

Working at Talkspace: 181 Reviews | Indeed.com 

Open Letter To Talkspace From Your Current Contracted Employees

Related

Anonymising personal data ‘not enough to protect privacy’, shows new study | Imperial News

Are mental health apps helpful or harmful? 

Data Brokerage - Tech Policy @ Sanford

Massive review shows the “science” behind most mental-health apps is wildly flawed 

Most mental health apps lack any scientific backing - Fast Company 

The Open Data Market and Risks to National Security | Lawfare 

We need a way to tell useful mental health tech from digital snake oil

What types of mental health apps actually work? A sweeping new analysis finds the data is sparse 

The 2023 Harvey Saferstein Consumer Protection Committee Student Contest Winning Essay: 'The Myth of Anonymity: De-Identified Data as Legal Fiction' 

Another disclaimer: All hyperlinks on this blog lead to external websites. I reviewed and vetted them as they were presented at the time of this blog’s publication, but things on the internet can change. I can’t guarantee that the links will work in perpetuity, that the content remains as it was at the time of publication, or that these articles are free from paywalls (several are not). Additionally, none of the information found in these externally linked resources is mine, and I am not responsible or liable for the information they contain.

Sources Cited

Currier, E. O. (n.d.). The 2023 Harvey Saferstein Consumer Protection Committee Student Contest Winning Essay: “The Myth of Anonymity: De-Identified Data as Legal Fiction.” Retrieved October 11, 2024, from https://www.americanbar.org/groups/antitrust_law/resources/newsletters/myth-of-anonymity/

Data Brokers and the Sale of Americans’ Mental Health Data - Tech Policy @ Sanford. (2023, February 6). Tech Policy @ Sanford. https://techpolicy.sanford.duke.edu/data-brokers-and-the-sale-of-americans-mental-health-data/

Ethics. (n.d.). www.counseling.org. Retrieved October 11, 2024, from https://www.counseling.org/resources/ethics

FTC to Ban BetterHelp from Revealing Consumers’ Data, Including Sensitive Mental Health Information, to Facebook and Others for Targeted Advertising. (2023, March 2). Federal Trade Commission. https://www.ftc.gov/news-events/news/press-releases/2023/03/ftc-ban-betterhelp-revealing-consumers-data-including-sensitive-mental-health-information-facebook

Moskowitz, P. E. (2022, February 6). Therapy apps are the Ubers of mental health — they promise to disrupt a broken system, but instead they shortchange therapists and offer patients mediocre care. Business Insider. https://www.businessinsider.com/betterhelp-talkspace-apps-uber-of-mental-health-text-therapy-2022-2

Office for Civil Rights (OCR). (2007, March 23). 256-Is a software vendor a business associate of a covered entity. HHS.gov. https://www.hhs.gov/hipaa/for-professionals/faq/256/is-software-vendor-business-associate/index.html

Office for Civil Rights (OCR). (2015, November 23). Covered Entities and Business Associates. HHS.gov; US Department of Health and Human Services. https://www.hhs.gov/hipaa/for-professionals/covered-entities/index.html

The Heard 2024 Financial State of Private Practice Report. (n.d.). Retrieved October 11, 2024, from https://www.joinheard.com/resources/downloads/the-heard-2024-financial-state-of-private-practice-report

Whitcomb, I. (2021, July 19). Mental wellness apps are basically the Wild West of therapy. Popular Science. https://www.popsci.com/science/mental-health-apps-safety/

Posted on October 15, 2024.