Content Moderation Case Study: Sensitive Mental Health Information Is Also A Content Moderation Challenge (2020)
from the tricky-questions dept
Summary: Talkspace is a well-known app that connects licensed therapists with clients, usually via text. Like many other online services, it acts as a form of “marketplace” for therapists and those in the market for therapy. While there are ways to connect with therapists by voice or video, the most common form of interaction is text messaging via the Talkspace app.
A recent NY Times profile detailed many concerns about the platform, including claims that it generated fake reviews, lied about events like the 2016 election driving an increase in usage, and let growth goals conflict with providing the best mental health care for its customers. It also detailed how Talkspace and similar apps face significant content moderation challenges as well -- some unique to the type of content the company manages.
Because so much of Talkspace’s usage involves text-based communication, questions arise about how the company handles that information and how it protects it.
The article also reveals that the company would sometimes review therapy sessions and act on what it learned. While the company claims it does this only to ensure that therapists are doing a good job, the article suggests the information is often used for marketing purposes as well.
Karissa Brennan, a New York-based therapist, provided services via Talkspace from 2015 to 2017, including to Mr. Lori. She said that after she provided a client with links to therapy resources outside of Talkspace, a company representative contacted her, saying she should seek to keep her clients inside the app.
“I was like, ‘How do you know I did that?’” Ms. Brennan said. “They said it was private, but it wasn’t.”
The company says this would only happen if an algorithmic review flagged the interaction for some reason — for example, if the therapist recommended medical marijuana to a client. Ms. Brennan says that to the best of her recollection, she had sent a link to an anxiety worksheet.
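That description points at a simple automated pass that flags messages matching sensitive keywords or off-platform links for human follow-up. As a rough illustrative sketch only -- the flag patterns, function name, and workflow below are assumptions, not Talkspace’s actual system -- such a review might look like:

```python
import re

# Hypothetical flag terms -- Talkspace's actual rules are not public.
FLAG_PATTERNS = [
    r"\bmedical marijuana\b",              # sensitive clinical recommendation
    r"https?://(?!\S*talkspace\.com)\S+",  # link pointing off-platform
]

def flag_message(text: str) -> list[str]:
    """Return the patterns a message matches, queuing it for human review."""
    return [p for p in FLAG_PATTERNS if re.search(p, text, re.IGNORECASE)]

# A benign external worksheet link trips the same rule as anything else:
print(flag_message("Here is an anxiety worksheet: https://example.com/worksheet.pdf"))
```

Note how coarse such rules are: a harmless link to an anxiety worksheet matches the same pattern as genuinely problematic content, which is consistent with Ms. Brennan’s account.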
The article also claimed that the company’s researchers would share information gleaned from transcripts with others at the company:
The anonymous data Talkspace collects is not used just for medical advancements; it’s used to better sell Talkspace’s product. Two former employees said the company’s data scientists shared common phrases from clients’ transcripts with the marketing team so that it could better target potential customers.
The company disputes this. “We are a data-focused company, and data science and clinical leadership will from time to time share insights with their colleagues,” Mr. Reilly said. “This can include evaluating critical information that can help us improve best practices.”
He added: “It never has and never will be used for marketing purposes.”
Decisions to be made by Talkspace:
- How should private conversations between clients and therapists be handled? Should those conversations be viewable by employees of Talkspace?
- Do reviews (automated or human) of these conversations raise significant privacy concerns? Or are they needed to provide quality therapeutic results for clients?
- What kinds of employee access rules and controls need to be put on therapy conversations?
- How should any research by the company be handled?
- What kinds of content need to be reviewed on the platform, and should it be reviewed by humans, technology, or both?
- Should the company even have access to this data at all?
- What tradeoffs exist between providing easier access to therapy and the privacy questions raised by storing this information?
- How effective is this form of treatment for clients?
- What kinds of demands does this put on therapists -- and does being monitored change (for better or for worse) the kind of support they provide?
- Are current regulatory frameworks concerning mental health information appropriate for app-based therapy sessions?
The company also argued that it is HIPAA/HITECH compliant and SOC 2 certified and has never had a malpractice claim in its network. The company insists that access to the content of transcripts is greatly limited:
To be clear: only the company’s Chief Medical Officer and Chief Technology Officer hold the “keys” to access original transcripts, and they both need to agree to do so. This has happened just a handful of times in the company’s history, typically only when a client points to particular language when reporting a therapist issue that cannot be resolved without seeing the original text. In these rare cases, Talkspace gathers affirmative consent from the client to view that original text: both facts which were made clear to the Times in spoken and written interviews. Only Safe-Harbor de-identified transcripts (A “safe harbor” version of a transcript removes any specific identifiers of the individual and of the individual’s relatives, household members, employers and geographical identifiers etc.) are ever used for research or quality control.
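The “safe harbor” method referenced here tracks HIPAA’s de-identification standard, which enumerates categories of identifiers (names, geographic units smaller than a state, dates, contact details, and so on) that must be stripped before data is treated as de-identified. A minimal sketch of that kind of redaction pass -- the patterns and placeholder labels below are simplified assumptions covering only a few of the standard’s categories, not Talkspace’s actual pipeline -- could look like:

```python
import re

# Simplified stand-ins for a few of HIPAA Safe Harbor's identifier
# categories; a real pipeline would cover all of them (names, geographic
# units smaller than a state, dates, contact details, and so on).
SAFE_HARBOR_RULES = {
    "EMAIL": r"[\w.+-]+@[\w-]+\.[\w.]+",
    "PHONE": r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b",
    "DATE": r"\b\d{1,2}/\d{1,2}/\d{2,4}\b",
}

def deidentify(text: str) -> str:
    """Replace matched identifiers with category placeholders."""
    for label, pattern in SAFE_HARBOR_RULES.items():
        text = re.sub(pattern, f"[{label}]", text)
    return text

print(deidentify("Call me at 212-555-0100 or email kb@example.com before 5/3/2020."))
# -> "Call me at [PHONE] or email [EMAIL] before [DATE]."
```

Pattern-based redaction like this is known to be imperfect -- free-text can carry identifying context that no pattern catches -- which is one reason the access and research questions listed above remain live even for “de-identified” transcripts.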
Filed Under: case study, content moderation, mental health information
Companies: talkspace