Suicide Hotline Collected, Monetized The Data Of Desperate People, Because Of Course It Did
from the money-matters dept
Another day, another privacy scandal that likely ends with nothing changing.
Crisis Text Line, one of the nation's largest nonprofit support options for the suicidal, is in some hot water. A Politico report last week highlighted how the company has been caught collecting and monetizing the data of callers... to create and market customer service software. More specifically, Crisis Text Line says it "anonymizes" some user and interaction data (ranging from the frequency certain words are used, to the type of distress users are experiencing) and sells it to a for-profit partner named Loris.ai. Crisis Text Line has a minority stake in Loris.ai, and gets a cut of their revenues in exchange.
As we've seen in countless privacy scandals before this one, the idea that this data is "anonymized" is once again held up as some kind of get-out-of-jail-free card:
"Crisis Text Line says any data it shares with that company, Loris.ai, has been wholly “anonymized,” stripped of any details that could be used to identify people who contacted the helpline in distress. Both entities say their goal is to improve the world — in Loris’ case, by making “customer support more human, empathetic, and scalable."
But as we've noted more times than I can count, "anonymized" is effectively a meaningless term in the privacy realm. Study after study after study has shown that it's relatively trivial to identify a user's "anonymized" footprint when that data is combined with a variety of other datasets. For a long time the press couldn't be bothered to point this out, something that's thankfully starting to change.
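The re-identification those studies describe is straightforward to sketch: when an "anonymized" record keeps quasi-identifiers (ZIP code, birth year, gender), it can often be joined against a second, identified dataset that shares those fields. The snippet below is a toy illustration of that linkage attack, with entirely made-up records and field names chosen for the example — not anything Crisis Text Line or Loris.ai actually ships.

```python
# Toy illustration of a "linkage attack": an "anonymized" dataset
# (names stripped, quasi-identifiers kept) is joined against an
# identified public dataset that shares those quasi-identifiers.
# All records and field names here are invented for the example.

# "Anonymized" records: no names, but ZIP, birth year, and gender remain.
anonymized = [
    {"zip": "02138", "birth_year": 1984, "gender": "F", "topic": "self-harm"},
    {"zip": "60614", "birth_year": 1992, "gender": "M", "topic": "anxiety"},
]

# A separate identified dataset (think: a voter roll or marketing list)
# that carries names alongside the same quasi-identifiers.
public_records = [
    {"name": "Jane Doe", "zip": "02138", "birth_year": 1984, "gender": "F"},
    {"name": "John Roe", "zip": "60614", "birth_year": 1992, "gender": "M"},
]

def reidentify(anon_rows, public_rows):
    """Join the two datasets on their shared quasi-identifiers."""
    matches = []
    for a in anon_rows:
        key = (a["zip"], a["birth_year"], a["gender"])
        hits = [p for p in public_rows
                if (p["zip"], p["birth_year"], p["gender"]) == key]
        if len(hits) == 1:  # a unique match is a re-identification
            matches.append({"name": hits[0]["name"], "topic": a["topic"]})
    return matches

print(reidentify(anonymized, public_records))
```

In this toy case every "anonymized" record maps back to a name, which is the core finding of the re-identification literature: stripping direct identifiers does little when the remaining fields are rare enough, in combination, to be unique.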
Also, just like most privacy scandals, the organization caught selling access to this data goes out of its way to portray it as something much different than it actually is. In this case, they're acting as if they're just being super altruistic:
"We view the relationship with Loris.ai as a valuable way to put more empathy into the world, while rigorously upholding our commitment to protecting the safety and anonymity of our texters,” Rodriguez wrote. He added that "sensitive data from conversations is not commercialized, full stop."
Obviously there are layers of dysfunction that have helped normalize this kind of stupidity. One, it's 2021 and we still don't have even a basic privacy law for the internet era that sets out clear guidelines and imposes stiff penalties on negligent companies, nonprofits, and executives. And we don't have a basic law not because it's hard (though writing any decent law certainly isn't easy), but because a parade of large corporations, lobbyists, and revolving door regulators don't want the data monetization party to suffer even a modest drop in revenues from the introduction of modest accountability, transparency, and empowered end users. It's just boring old greed. There's a lot of tap dancing that goes on to pretend that's not the reason, but it doesn't make it any less true.
We also don't adequately fund mental health care in the States, forcing desperate people to reach out to startups that clearly don't fully understand the scope of their responsibility. We also don't adequately fund and resource our privacy regulators at agencies like the FTC. And even when the FTC does act (which it often can't against nonprofits), the penalties and fines are often pathetic relative to the scale of the money being made.
Even before these problems are considered, you have to factor in that the entire adtech space reaches across industries from big tech to telecom, and is designed specifically to be a convoluted nightmare that makes oversight as difficult as possible. The end result is just about what you'd expect: a steady parade of scandals (like the other big scandal last week, in which gay/bi dating and Muslim prayer apps were caught selling user location data) that briefly generate a few headlines and furrowed eyebrows without any meaningful change.
Filed Under: crisis text line, data brokers, data collection, failures, privacy
Companies: crisis text line, loris.ai
Reader Comments
There's indifference, there's evil, and then there's THIS...
One, it's 2021 and we still don't have even a basic privacy law
Typo or article just been sitting in the queue for a long time?
Moment of levity aside, it takes a truly spectacular type of scumbag to look at a suicide prevention hotline job and ask yourself 'How can we not only monetize this, but do so in a way that has a very real chance of decreasing the willingness of people to contact us, potentially with lethal results?' So it looks like, barely a month into the year, we've already got a two-for-one contender for 'Biggest Asshole of 2022'.
Whatever greedy wastes of flesh were involved need to be publicly fired, with an announcement that such behavior is absolutely out of bounds, because as it stands they just made clear that, as much as they might claim to care about the people who contact them, they have no problem exploiting those people for a quick buck.
Plus side
Next time I get asked why I don't call a hotline, I can't be called paranoid anymore.
O_O
I...just...fuck this.
"by making “customer support more human, empathetic, and scalable.""
We sell our product to Comcast so that we can tell when we've finally pushed the customer too hard & might end up being sued by a distraught family after we push them over the edge.
So let's see...
Let's cash in on people on one of the worst days of their lives.
Let's record keywords and how many times they tell us they feel worthless & hopeless, and then SELL that data.
Let us TOTALLY betray any trust these people might have had in our desire to help them.
Did they keep me talking so they would have more data to sell?
(Look at what we KNOW they were doing & tell me you've never heard about phone bank workers milking customers on the phones.)
We're here to help... as long as we can make a few bucks from your suffering.
You can trust us, we're part owner of the company we sell your worst day to and boy this pandemic has made us rich!
This is one of those moments I regret not being the ELE required to cleanse the world of the human race in the past.
Once upon a time 'bad guys' had a code...
No women, no children.
They had respect for themselves.
Preying on suicidal people for a buck... they found a way to be worse than Prosperity Preachers.
Of course nothing will happen, except the next asshole doing this will keep it quiet longer & use more shell companies to distance themselves.
Loris.ai turning your suicidal thoughts into customer service.
I admit this is a moment when I wish I could still tweet; pretty sure the people I hung out with would be as horrified as I am & would have no problem finding every scrap of public information about the bastards who did this & making their lives real pleasant.
You thought Tucker Carlson's advertisers fled quick? Think how fast Loris.ai would be losing customers when a couple thousand people ask corporations if exploiting suicidal people to deliver better customer service was worth it.
You think the Wordle backlash was bad... you ain't seen nothing yet.
I heart moderation...
Did we mention all of the counselors are volunteers?
30 hours of training & sometimes Nancy would offer suggestions like just listening to some Taylor Swift. o_o
Did we mention Nancy Lublin got run out of CTL?
Teen Vogue did.
google "nancy lublins crisis text line what happened"
Mentioning Weight Watchers to plump employees.
Running a Happiness Survey, then suggesting those unhappy leave.
But it's AI and it trains CSRs how to handle hard calls.
something something lacking depth and warmth
I intend to open a Paranoia Hotline, with a mission statement stating a goal to provide psychological counselling to paranoid people. I likely won't be able to earn the trust of my target market, but think of the revenue I shall be able to generate by monetising their personally-identifiable information!
Re:
I don't care that it's a joke. It's deadly serious. Paranoia is a large driver of suicide, and fears over this approach are something the program has been working on for years. This revelation will completely undermine trust in the suicide prevention hotline program among groups that absolutely need it. Have a fucking flag.
I’m sure that, on some planet, this was funny.
Your problem is, this is Earth.
Re: There's indifference, there's evil, and then there's THIS...
... no idea why I went with 'levity' there, guess I was just looking for something less depressing than the article and latched onto whatever I could find, poor word choice though.
If only privacy actually were what you thought it to be...
Here's the test for good anonymizing:
Once the data is anonymized to the extent that the buyer is no longer interested in paying for it, the anonymization is good enough for the data to remain anonymized.
Altruistic, are we?
FTFA:
Sure, great. Then do it for free. Publish the data publicly and slap a CC0 on it.
Re: There's indifference, there's evil, and then there's THIS...
I wouldn't just say whoever did such a thing is merely an "asshole", but what they really are is a monster.
Re: There's indifference, there's evil, and then there's THIS...
March 2020 had a brief hiccup in the second half of 2021, when people got their vaccines and thought we could finally move on with our lives. But they forgot greed (African and other poor countries with almost no vaccination) and anti-vaxxers, so we are now stuck in the second half of 2021.
Re: Re:
Your words fail to convey how huge of a disaster this actually is. I hope a friend of mine struck by heavy depression never comes across any information about this. Or at least shares it with me, so I can help work out the inevitable paranoia this will generate.
Re: There's indifference, there's evil, and then there's THIS...
Whatever greedy wastes of flesh that were involved need to be publicly fired,
from a cannon. into the sun.
Re:
what the actual fuck
Re: Re: Re:
As someone who didn't use a suicide line when I really needed to because of paranoia, I understand how damaging this information is. But even beginning to try to express it sends me into incoherent rage.
In the case of mental health...
Actually, there is a privacy law. This would seem to fall under HIPAA, as they are dealing with people who are suffering a mental health crisis. Just how the crisis line could get informed, rational consent to release health information from a person suffering a mental health crisis is questionable at best.
Re: In the case of mental health...
Suffering from a mental health crisis doesn't indicate a lack of competence to consent to something. Such a person may or may not be able to give consent, and that is something that would have to be evaluated in each case, regardless of whether they called a hotline, walked into an emergency room, or whatever else (incidentally refusing treatment also cannot be used to declare someone incompetent).
Re: In the case of mental health...
HIPAA is useless as a privacy law. I don't think privacy was supposed to be a major element of HIPAA anyway. The HIPAA Privacy Rule doesn't apply to employers, many government officials, the media, data brokers, FAANGM/MAMAA (Facebook, Google, etc.), sellers of wearables, and any manner of people/companies which should have no business with your medical information.
From the US Department of Health and Human Services:
Crisis Text Line is a non-profit organization which offers a texting service. It's definitely not covered by HIPAA.
The US needs a privacy law which is stronger than the GDPR. The GDPR wasn't enough. All it has done is increase the number of "consent to data collection" popups on websites. There's no guarantee that the "reject" buttons do anything. The GDPR has also failed to stop Google's deplorable real-time bidding (RTB).
The only way to achieve data anonymization
is to not collect data at all, or to collect as little data as possible to fulfill whatever service the client needs and to purge that data as soon as possible. Without a federal general privacy law stronger than the GDPR, almost no company, organization, or government agency would do any of that voluntarily.
Re: Re: In the case of mental health...
HIPAA's privacy provisions work just fine. People often misunderstand the scope and intent of the law. The law is and only ever was designed to cover healthcare providers and the associated admin infrastructure. And mostly deals with who, how, and when a party can access medical records. I can only speak for myself and what I've seen, but my healthcare organization was serious as a heart-attack about protected health information.
Re: Re: In the case of mental health...
No law can ever *prevent* anything. All the law can do is define rules and reparations for breaking those rules.
The only option is to never give the data to them in the first place. Period. Full stop. I still can't figure out why, despite decades of everyone chanting "Only put it online if you are ready for everyone in the world to know about it," people still expect that putting crap on Facebook / YouTube / Twitter / Instagram / etc. is somehow exempt from the rule.
As for the compelled disclosure, read: mandatory surveillance, that has infected so much of the IT industry and everything else... that crap needs to be outlawed in its entirety, along with a mandatory minimum 10%-of-global-revenue-per-instance penalty for any company found doing it. This crap needs to die, and the only way that will happen is to poison the fountain of profits these companies have found for themselves.
The only option there is to install a javascript blocker. They can't use your electricity and bandwidth if you deny them the CPU cycles they need to do it with. If they want to run the damn code, then they can pay for it themselves.
Re: Re: Re: In the case of mental health...
"decades of everyone chanting"
Lol, yeah, right...
Everyone you know, maybe, if you're in a particularly small and knowledgeable group of friends, but that's sure as hell not been a message stated in the mainstream.
Re: Re: There's indifference, there's evil, and then there's THIS...
We were working on that tech but they murdered John Bull so that tech died with him.
May I suggest crocodiles or one of those railcar brush burners.
Re: Re: There's indifference, there's evil, and then there's THIS...
No, no..."levity" has become appropriate.
I've been revisiting old George Carlin videos on YouTube and found that today they are, if anything, often more appropriate than ever. The man was a fscking prophet...
Re: Altruistic, are we?
I paraphrase, but she was trying to turn her b level into a unicorn.
Re: Re: There's indifference, there's evil, and then there's THIS...
It is a defense mechanism, not a poor word choice.
In the stream of horrible shit in the world, this company's actions actually found something lower than we thought was possible.
My brain was trying to puzzle out what her next company would be, but even I couldn't string together inflicting pain on small children to train an AI to tell if it's a cry of pain or something else.
We run a free daycare!
Your child might come home with some scrapes and bruises; ignore those, it's fine.
New from Nani.ai: smart devices to make sure your nanny isn't actually harming your child, by listening.
Re: In the case of mental health...
Hippo wouldn't apply here even on a good day.
They are voluntarily contacting a volunteer, not an actual medical professional.
I'm sure buried in the fine print (which most people ignore on a good day, so we can't fault someone in crisis for not checking it) is some obscure way of saying that maybe something might be used elsewhere, but we'll totes keep your name out of it.
Now I feel like a terrible person for constantly telling my friends to contact the chat line.
[ link to this | view in thread ]
Re:
You shouldn't; you meant well, and judging by your comment I'm guessing you had no idea that they would violate the trust of the callers like this, which leaves all the blame on them.
Re: Re:
Thanks.