Damned If You Do; Damned If You Don't: ProPublica's Bizarre Reporting On WhatsApp Abuse Reports
from the come-on-guys dept
I've been struck over the years by how much reporting on technology involves attacking companies for what they do -- even for totally contradictory reasons. Everything is viewed through the lens of assuming the worst possible intentions. And, yes, sometimes that's deserved. Companies act badly, and no one should give them the benefit of the doubt when they haven't earned it. But sometimes it just gets ridiculous, as is clear in a recent ProPublica piece attacking WhatsApp for its "report" feature. Now, I like ProPublica a lot and feel that they do some of the best investigative reporting around. But this was not that.
ProPublica itself has reported on how WhatsApp can be abused by those with nefarious intent -- criticizing the company for failing to do anything about it. But this new article is basically the opposite. It's attacking WhatsApp because it has a feature that allows users to "report" a message they received to WhatsApp. ProPublica used this, dangerously and incorrectly, to claim that WhatsApp (which offers end-to-end encryption) is somehow bad about privacy. The title of the article reads -- incorrectly -- "How Facebook Undermines Privacy Protections for Its 2 Billion WhatsApp Users." The (since edited) article contains this bullshit section:
Zuckerberg’s vision centered on WhatsApp’s signature feature, which he said the company was planning to apply to Instagram and Facebook Messenger: end-to-end encryption, which converts all messages into an unreadable format that is only unlocked when they reach their intended destinations. WhatsApp messages are so secure, he said, that nobody else — not even the company — can read a word. As Zuckerberg had put it earlier, in testimony to the U.S. Senate in 2018, “We don’t see any of the content in WhatsApp.”
WhatsApp emphasizes this point so consistently that a flag with a similar assurance automatically appears on-screen before users send messages: “No one outside of this chat, not even WhatsApp, can read or listen to them.”
Those assurances are not true. WhatsApp has more than 1,000 contract workers filling floors of office buildings in Austin, Texas, Dublin and Singapore, where they examine millions of pieces of users' content. Seated at computers in pods organized by work assignments, these hourly workers use special Facebook software to sift through millions of private messages, images and videos. They pass judgment on whatever flashes on their screen — claims of everything from fraud or spam to child porn and potential terrorist plotting — typically in less than a minute.
The false implication here is that WhatsApp (and Facebook) are lying about end-to-end encryption. Except that's bullshit. The people reviewing content are only reviewing content that has been reported. I don't understand why this is hard for people to comprehend, but even with end-to-end encryption, one of those "ends" can forward the contents to someone else, and that someone can then see them. And that's all that's happening here. When you "report" content in WhatsApp, it is the functional equivalent of forwarding the message.
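To make the mechanics concrete, here's a minimal sketch -- emphatically not WhatsApp's actual code (the real service uses the Signal protocol; a shared Fernet key stands in for the negotiated session key here, and all the names are hypothetical). The point it illustrates: the relay in the middle only ever handles ciphertext, and plaintext reaches the platform only when a recipient, who already has it, deliberately forwards it as a report.

```python
# Sketch only: shows why a "report" feature doesn't break end-to-end encryption.
# Assumes `pip install cryptography`. Fernet is a stand-in for a real E2E session key.
from cryptography.fernet import Fernet

# Sender and recipient share a session key; the server never sees it.
session_key = Fernet.generate_key()
sender = Fernet(session_key)
recipient = Fernet(session_key)

moderation_queue = []  # hypothetical stand-in for the platform's report pipeline

def server_relay(ciphertext: bytes) -> bytes:
    """The server only ever handles ciphertext it cannot decrypt."""
    return ciphertext

def report(plaintext: bytes) -> None:
    """Reporting = the *recipient* forwarding plaintext they already have."""
    moderation_queue.append(plaintext)

# 1. Sender encrypts; the relay sees only ciphertext.
ciphertext = sender.encrypt(b"some abusive message")

# 2. Recipient decrypts at their "end" -- that's all end-to-end means.
plaintext = recipient.decrypt(server_relay(ciphertext))

# 3. Plaintext reaches the platform only via a deliberate report by the recipient.
report(plaintext)
print(moderation_queue)  # [b'some abusive message']
```

Nothing in that flow weakens the encryption in transit; the "leak" is just one end of the conversation choosing to share what it received.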
If it's "undermining" privacy (and it's not), then the person who is reporting the content is the one undermining privacy, by forwarding the message and saying it might be problematic. This is actually quite a reasonable approach to dealing with questionable content on an encrypted messaging program, but ProPublica decided to spread a very misleading report suggesting that it undermines privacy.
Alec Muffett does a nice job dismantling the argument. As he notes, it's really bad when journalists try to redefine end-to-end encryption to mean something it is not. It does not mean that recipients of messages cannot forward them or share them. And, in fact, pretending that's true, or insisting that forwarding and reporting messages is somehow an attack on privacy, is dangerous. It undermines encryption by setting up false and dangerous expectations about what it actually entails.
Alex Stamos similarly has a great thread on the many problems in the article, but here's the key bit:
I know I'm going to get crap for this, because journalists always circle the wagons in these situations, but the tech-skeptical media really needs to consider what change they want to promote with their framing of these issues. You also have a responsibility to balance equities.
— Alex Stamos (@alexstamos) September 7, 2021
But, really, this gets back to a larger point that I keep trying to make with regards to reporting on "privacy" violations. People differ (greatly!) on what they think a privacy violation really entails, and because of that, we get very silly demands -- often from the media and politicians -- about "protecting privacy" when many of those demands would do tremendous harm to other important values, such as competition or free speech.
And this is especially troubling when perfectly reasonable (and, in fact, quite good) systems like WhatsApp's "report" feature are portrayed incorrectly as "undermining privacy," when what the feature is actually trying to do is help deal with the other issue the media keeps attacking WhatsApp for: people abusing these tools to spread hatred, disinformation, or other dangerous content.
Filed Under: bad journalism, bad reporting, encryption, end to end encryption, forwarding, privacy, reporting
Companies: facebook, propublica, whatsapp