Damned If You Do; Damned If You Don't: ProPublica's Bizarre Reporting On WhatsApp Abuse Reports
from the come-on-guys dept
I've been struck over the years by how much reporting on technology involves attacking companies for what they do -- even if for totally contradictory reasons. Everything is viewed through the lens of assuming the worst possible intentions. And, yes, sometimes perhaps that's deserved. Companies do act badly, and no one should give them the benefit of the doubt unless they've earned it. But sometimes it just gets ridiculous, as is clear in a recent ProPublica piece that attacked WhatsApp for its "report" feature. Now, I like ProPublica a lot and feel that they do some of the best investigative reporting around. But this was not that.
ProPublica itself has reported on how WhatsApp can be abused by those with nefarious intent -- criticizing the company for failing to do anything about it. But this new article is basically the opposite. It's attacking WhatsApp because it has a feature that allows users to "report" a message they received to WhatsApp. ProPublica dangerously and incorrectly used this to claim that WhatsApp (which offers end-to-end encryption) is somehow bad about privacy. The title of the article reads -- incorrectly -- "How Facebook Undermines Privacy Protections for Its 2 Billion WhatsApp Users." The (since edited) article contains this bullshit section:
Zuckerberg’s vision centered on WhatsApp’s signature feature, which he said the company was planning to apply to Instagram and Facebook Messenger: end-to-end encryption, which converts all messages into an unreadable format that is only unlocked when they reach their intended destinations. WhatsApp messages are so secure, he said, that nobody else — not even the company — can read a word. As Zuckerberg had put it earlier, in testimony to the U.S. Senate in 2018, “We don’t see any of the content in WhatsApp.”
WhatsApp emphasizes this point so consistently that a flag with a similar assurance automatically appears on-screen before users send messages: “No one outside of this chat, not even WhatsApp, can read or listen to them.”
Those assurances are not true. WhatsApp has more than 1,000 contract workers filling floors of office buildings in Austin, Texas, Dublin and Singapore, where they examine millions of pieces of users' content. Seated at computers in pods organized by work assignments, these hourly workers use special Facebook software to sift through millions of private messages, images and videos. They pass judgment on whatever flashes on their screen — claims of everything from fraud or spam to child porn and potential terrorist plotting — typically in less than a minute.
The false implication here is that WhatsApp (and Facebook) are lying about end-to-end encryption. Except, that's bullshit. The people reviewing content are only reviewing content that has been reported. I don't understand why this is hard for people to comprehend, but even with end-to-end encryption one of those "ends" can forward the contents to someone else, and those people can see it. And that's all that's happening here. When you "report" content in WhatsApp, it is the functional equivalent of forwarding the message to the company.
If it's "undermining" privacy (and it's not), then the person who is reporting the content is the one undermining privacy by forwarding the message and saying it might be problematic. This is actually quite a reasonable approach to dealing with questionable content on an encrypted messaging program, but ProPublica decides to spread a very misleading report suggesting that it's undermining privacy.
Alec Muffett does a nice job dismantling the argument. As he notes, it's really bad when journalists try to redefine end-to-end encryption to mean something it is not. It does not mean that recipients of messages cannot forward them or cannot share them. And, in fact, pretending that's true, or insisting that forwarding messages and reporting them is somehow an attack on privacy, is dangerous. It undermines encryption by setting up false and dangerous expectations about what it actually entails.
Alex Stamos similarly has a great thread on the many problems in the article, but here's the key bit:
I know I'm going to get crap for this, because journalists always circle the wagons in these situations, but the tech-skeptical media really needs to consider what change they want to promote with their framing of these issues. You also have a responsibility to balance equities.
— Alex Stamos (@alexstamos) September 7, 2021
But, really, this gets back to a larger point that I keep trying to make with regards to reporting on "privacy" violations. People differ (greatly!) on what they think a privacy violation really entails, and because of that, we get very silly demands -- often from the media and politicians -- about "protecting privacy," even though many of those demands would do tremendous harm to other important values, such as competition or free speech.
And this is especially troubling when perfectly reasonable (and, in fact, quite good) systems like WhatsApp's "report" feature are portrayed incorrectly as "undermining privacy," when what the feature is actually trying to do is help deal with the other issue that the media keeps attacking WhatsApp for: enabling people to abuse these tools to spread hatred, disinformation, or other dangerous content.
Filed Under: bad journalism, bad reporting, encryption, end to end encryption, forwarding, privacy, reporting
Companies: facebook, propublica, whatsapp
Reader Comments
Poorly setting up user expectations
Perhaps the issue is one of not setting user expectations correctly. If you laud the fact that no one can read the contents of these messages, but forget to mention that we've set up an easy-to-use system to forward unencrypted copies of those same messages to Facebook (not a company exactly known for respecting people's privacy), you can see how that might freak out some people.
Ars Technica has an article covering the same subject. The interesting part in their reporting was how groups were abusing the system to get the AI to ban groups left and right.
More troubling, in my opinion, is speculation that WhatsApp may have the undisclosed ability to scan decrypted messages and automatically flag them for Facebook review.
The most troubling is the unencrypted metadata that Facebook appears to be storing and forwarding/reporting as it sees fit. From the above-mentioned Ars Technica article:
It appears that WhatsApp is a lot less secure than some competitors, such as Signal, and a lot less secure than Facebook likes to admit or users are led to believe.
Re: Poorly setting up user expectations
Well said. WhatsApp has major issues, it seems, and I believe that Mike is taking his consternation at ProPublica’s reporting a bit too far.
Re: Poorly setting up user expectations
It appears that WhatsApp is a lot less secure than some competitors, such as Signal, and a lot less secure than Facebook likes to admit or users are led to believe.
I just don't think that's even remotely accurate, though. You can always forward decrypted messages or take a screenshot of them. This reporting makes people believe things that just aren't true.
Re: Re: Poorly setting up user expectations
Mike, while it's true that you can usually forward or at least screen shot messages on the destination device, how many people are aware of this?
You, me, probably lots of the folks who read this site, sure. Are little Jonny, Aunt Mable, and Uncle Bob aware? Probably not.
A commenter (Jim Salter) on the Ars Technica article summed it up when he posted:
Which is why I titled my original response: "Poorly setting up user expectations". I think this is especially important when you are talking about the capabilities of a security application.
ProPublica's article may come across a little reactionary to folks like you and me, but if it helps raise awareness about the ways that WhatsApp (or any other application) isn't secure, that can only be seen as a good thing. Excepting law enforcement and Facebook shareholders of course.
Re: Re: Re: Poorly setting up user expectations
"Mike, while it's true that you can usually forward or at least screen shot messages on the destination device, how many people are aware of this?"
How about anybody who's ever forwarded or screenshotted a WhatsApp message? Surely that's got to be a significant proportion of users.
Re: Re: Re: Poorly setting up user expectations
"while it's true that you can usually forward or at least screen shot messages on the destination device, how many people are aware of this?"
Why? Does people being aware of it change its existence or something?
"ProPublica's article may come across a little reactionary to folks like you and me, but if it helps raise awareness about the ways that WhatsApp (or any other application) isn't secure, that can only be seen as a good thing"
Except, that's a lie. It IS secure, unless you take the deliberate step of bypassing the security by forwarding a message. Spreading FUD about WhatsApp will not help anyone, and may in fact do harm if people decide to move to less secure solutions as a response.
And here I thought ProPublica was a reputable news outlet.
Non-Sensational
It must have been a slow news day for ProPublica. It's tough to get eyeballs with the headline "Working As Intended".
Re: Non-Sensational
Makes a change from the constant "white supremacists told they're not welcome on someone's property, Koby and his red brigade demand communist takeover" stories.
This is why you are struggling
Who wants to pay for arrogant, bad faith, sensationalist bullcrap? Next to nobody. To the old media I say "Just die already you bastards!".
Re: This is why you are struggling
Where's your proof?
You say:
But if you read that article it wasn't criticising, it was reporting.
So your setup of ProPublica is false.
Re:
Reporting can be criticism as well if it's not objectively neutral.
[ link to this | view in chronology ]
Even if it is objectively neutral, it can constitute criticism. Simply setting forth the numbers on my property tax notices is criticism of our profligate officials.