Millions of private messages, including text, video and audio conversations, may be at risk of exposure on the popular therapy app Talkspace. According to Proof News, concerns are rising because the company uses private client data to train AI therapy bots. The company’s CEO, per Proof News, recently told investors that Talkspace has amassed “one of the largest mental health data banks in the world.” Psychologist Linda Michaels told Proof News that this is another “awful” example of data mining.

The concerns also come after a Proof News investigation revealed that a woman’s private Talkspace therapy transcripts were produced in court during a pregnancy discrimination lawsuit against her former employer. According to court records reviewed by Proof News, years of conversations from the app were turned over as evidence.

“Privacy and confidentiality: It’s in the code of ethics of every psychotherapist,” Michaels said. “It is really taking advantage of vulnerable people at a vulnerable time of their life.”

What are the criticisms Talkspace has faced in recent years?

Sens. Elizabeth Warren, Cory Booker and Ron Wyden raised alarms about Talkspace in 2022, sending the company a letter expressing concerns about client privacy. The senators wrote that the company may be allowing Google and Facebook to access private client data. However, the company’s chief legal officer, John C. Reilly, said that “all data related to their treatment is strictly used for therapeutic purposes.”

In the years that followed, parents raised similar concerns about Talkspace, saying the private information of some New York City teens may be at risk, Proof News reported. Following the complaints, Talkspace agreed to adjust its data collection policies. Still, the company continues to face criticism and complaints.

The use of chatbots is on the rise, but concerns continue

Data shows that a growing number of people are turning to chatbots for therapy, even as concern mounts over how clients’ private information is used. According to KFF, about 3 in 10 young people ages 18 to 29 said they are using AI chatbots for therapy. About 60% of adults also said they didn’t follow up with an in-person therapist after using a chatbot.

While people like the idea of getting therapy from a nonjudgmental chatbot, there is increasing concern about the lack of guardrails in the system.

John Torous, a psychiatrist at Beth Israel Deaconess Medical Center, said many of the apps “overrepresent themselves.”

“Deceiving people that they have received treatment when they really have not has many negative consequences,” Torous told The Washington Post.