Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs they result in. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Sensitive Mental Health Information Is Also A Content Moderation Challenge (2020)

from the tricky-questions dept

Summary: Talkspace is a well-known app that connects licensed therapists with clients, usually by text. Like many other online services, it acts as a form of “marketplace” for therapists and those in the market for therapy. While clients can also connect with therapists by voice or video, the most common form of interaction is text messaging via the Talkspace app.

A recent NY Times profile detailed many concerns about the platform, including claims that it generated fake reviews, that it lied about events like the 2016 election leading to an increase in usage, and that growing usage sometimes conflicted with providing the best mental health care to customers. It also detailed how Talkspace and similar apps face significant content moderation challenges, some unique to the type of content the company manages.

Because so much of Talkspace’s usage consists of text-based communications, there are questions about how the company handles and protects that information.

The article also reveals that the company would sometimes review therapy sessions and act on the information learned. While the company claims it only does this to make sure that therapists are doing a good job, the article suggests the information is often used for marketing purposes as well.

Karissa Brennan, a New York-based therapist, provided services via Talkspace from 2015 to 2017, including to Mr. Lori. She said that after she provided a client with links to therapy resources outside of Talkspace, a company representative contacted her, saying she should seek to keep her clients inside the app.

“I was like, ‘How do you know I did that?’” Ms. Brennan said. “They said it was private, but it wasn’t.”

The company says this would only happen if an algorithmic review flagged the interaction for some reason — for example, if the therapist recommended medical marijuana to a client. Ms. Brennan says that to the best of her recollection, she had sent a link to an anxiety worksheet.
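
The description above suggests a fairly simple form of automated review: transcripts are scanned for terms or patterns on a watchlist (links to outside resources, specific medication references) and flagged for human follow-up. The sketch below illustrates that general kind of keyword flagging; the pattern list, function name, and example message are hypothetical illustrations, not Talkspace’s actual system.

```python
import re

# Hypothetical illustration only: a small watchlist of the kind an automated
# transcript review might scan for. The patterns, function name, and example
# below are assumptions for illustration, not Talkspace's actual system.
WATCHLIST_PATTERNS = [
    r"https?://\S+",       # links that point clients to resources outside the app
    r"medical marijuana",  # the example the company itself gave
]

def flag_message(message: str) -> list[str]:
    """Return the watchlist patterns the message matches, if any."""
    return [p for p in WATCHLIST_PATTERNS if re.search(p, message, re.IGNORECASE)]

if __name__ == "__main__":
    msg = "Here is a worksheet that may help: https://example.com/anxiety-worksheet"
    hits = flag_message(msg)
    if hits:
        print("Flagged for human review:", hits)  # a bare link is enough to trigger follow-up
```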

The article also claimed that the company’s researchers shared information gleaned from transcripts with others at the company:

The anonymous data Talkspace collects is not used just for medical advancements; it’s used to better sell Talkspace’s product. Two former employees said the company’s data scientists shared common phrases from clients’ transcripts with the marketing team so that it could better target potential customers.

The company disputes this. “We are a data-focused company, and data science and clinical leadership will from time to time share insights with their colleagues,” Mr. Reilly said. “This can include evaluating critical information that can help us improve best practices.”

He added: “It never has and never will be used for marketing purposes.”

Decisions to be made by Talkspace:

  • How should private conversations between clients and therapists be handled? Should those conversations be viewable by employees of Talkspace?
  • Do reviews of these conversations (automated or by humans) raise significant privacy concerns? Or are such reviews needed to provide quality therapeutic results to clients?
  • What kinds of employee access rules and controls need to be put on therapy conversations?
  • How should any research by the company be handled?
  • What kinds of content need to be reviewed on the platform, and should it be reviewed by humans, technology, or both?
  • Should the company even have access to this data at all?

Questions and policy implications to consider:
  • What tradeoffs exist between providing easier access to therapy and the privacy questions raised by storing this information?
  • How effective is this form of treatment for clients?
  • What kinds of demands does this put on therapists -- and does being monitored change (for better or for worse) the kind of support they provide?
  • Are current regulatory frameworks concerning mental health information appropriate for app-based therapy sessions?

Resolution: Talkspace insists that it is working hard to provide a better service to clients who are looking to communicate with therapists, and challenges many of the claims made in the article. Talkspace’s founders also wrote a response to the article that, while claiming to “welcome” scrutiny, also questioned the competence of the reporter who wrote the NY Times story. They also argued that most of the negative claims in the Times piece came from disgruntled former workers -- and that some of the information is outdated and no longer accurate.

The company also argued that it is HIPAA/HITECH and SOC2 compliant and has never had a malpractice claim in its network. The company insists that access to the content of transcripts is greatly limited:

To be clear; only the company’s Chief Medical Officer and Chief Technology Officer hold the “keys” to access original transcripts, and they both need to agree to do so. This has happened just a handful of times in the company’s history, typically only when a client points to particular language when reporting a therapist issue that cannot be resolved without seeing the original text. In these rare cases, Talkspace gathers affirmative consent from the client to view that original text: both facts which were made clear to the Times in spoken and written interviews. Only Safe-Harbor de-identified transcripts (A “safe harbor” version of a transcript removes any specific identifiers of the individual and of the individual’s relatives, household members, employers and geographical identifiers etc.) are ever used for research or quality control.
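
For context, HIPAA’s Safe Harbor de-identification method requires removing 18 categories of identifiers, including names, geographic subdivisions smaller than a state, dates more specific than a year, and contact details. The sketch below illustrates the general idea with a few crude pattern-based redactions; the rules and names here are assumptions for illustration only, and de-identifying free-text therapy transcripts in practice is considerably harder than simple pattern matching.

```python
import re

# Illustrative only: a few of HIPAA Safe Harbor's 18 identifier categories,
# expressed as crude regex redactions. Real transcript de-identification
# requires far more than pattern matching (names, employers, relatives, etc.).
REDACTIONS = {
    r"\b\d{3}-\d{3}-\d{4}\b": "[PHONE]",          # telephone numbers
    r"\b[\w.+-]+@[\w-]+\.[\w.]+\b": "[EMAIL]",    # email addresses
    r"\b\d{1,2}/\d{1,2}/\d{2,4}\b": "[DATE]",     # dates more specific than a year
    r"\b\d{5}(?:-\d{4})?\b": "[ZIP]",             # ZIP codes / geographic detail
}

def safe_harbor_redact(text: str) -> str:
    """Apply the toy redaction rules above to a transcript snippet."""
    for pattern, token in REDACTIONS.items():
        text = re.sub(pattern, token, text)
    return text

if __name__ == "__main__":
    snippet = "Call me at 212-555-0147 or email kb@example.com before 10/03/2020."
    print(safe_harbor_redact(snippet))
    # -> "Call me at [PHONE] or email [EMAIL] before [DATE]."
```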



Filed Under: case study, content moderation, mental health information
Companies: talkspace


Reader Comments



  1. Anonymous Coward, 2 Oct 2020 @ 4:10pm

    If it's possible for the company to access users' supposedly "private" communications at will, they are not actually private at all.


  2. Anonymous Coward, 2 Oct 2020 @ 6:57pm

    HIPAA violations

    There are clear stipulations for when medical information is protected and when it is not. Transcriptions of mental health interactions are clearly in the protected arena. These guys are in for a huge lawsuit.


  3. Anonymous Coward, 2 Oct 2020 @ 9:36pm

    The article also reveals that the company would sometimes review therapy sessions...

    No, nuuuuu, no, hard nope. I don't care how or why they review sessions, they shouldn't be doing that.

    The company disputes this. “We are a data-focused company,...

    Well, that's a problem and conflict of interest right there. You should be a communications platform, facilitating the connecting of clients with therapists. This is a whole new downhill trend for corporate-run medicine.

    They also argued that most of the negative claims in the Times piece came from disgruntled former workers...

    I always love this form of deflection. There are, more frequently than not, valid reasons people are disgruntled. It isn't like some negative personality trait with only an internal source.

    has never had a malpractice claim in its network.

    That is due to the quality of your therapists, not your bullshit company. This also would indicate that the same therapist-employees (or former) are less likely to be full of shit than you are. Thanks for pointing it out.

    Finally: Talkspace insists that it is working hard to provide a better service to clients who are looking to communicate with therapists, and challenges many of the claims made in the article.

    Ha, we don't even have to look at those claims, only your own, to see highly questionable practices and motives.


  4. MathFox, 3 Oct 2020 @ 2:05am

    They also argued that most of the negative claims in the Times piece came from disgruntled former workers...

    I always love this form of deflection. There are, more frequently than not, valid reasons people are disgruntled. It isn't like some negative personality trait with only an internal source.

    I have left companies over their unethical practices. And I consider (structurally) breaking the confidentiality that your customers expect significantly worse than anything I've encountered before. In all likelihood I would not just have become disgruntled, but would have picked up a whistle to blow too.


  5. That Anonymous Coward (profile), 3 Oct 2020 @ 7:52am

    We can't even help people without making sure we get paid a bit extra.
    We want people to stay on our platform, even if there are better resources available out there.
    We want the data so we can show potential 'partners' we are worthy of those nice drug rep visits where our staff gets lunch.

    Anyone who has anything bad to say is always disgruntled, we are always perfect & they are jealous.

    The platform and concept might be the best thing since sliced bread, but the go go go attitude to increasing income sources will always ruin things. Profit overcomes the desire to help & profit drives everything. Those who are supposed to be helped are just a means to a paycheck.


  6. PaulT (profile), 4 Oct 2020 @ 12:17am

    Re: Fake name generator

    You're heavily promoting a fake name generator and given carte blanche the best you could come up with was David09? I hope you don't spend too much time wondering why you're a failure at your craft, as it's quite clear.


