No Good Deed Goes Unpunished: Google/Apple Criticized... For Seeking To Protect Privacy In UK Gov't Covid Contact Tracing
from the always-a-complaint dept
There are plenty of legitimate things to complain about regarding some of the big internet companies -- but so many people these days view everything through a weird prism in which every single action must stem from evil intent, even when there's a perfectly good reason for it. Sometimes this leads to absurd situations in which the companies are criticized for taking completely opposite approaches, with both framed as nefarious.
The latest is a very odd BBC piece by Rory Cellan-Jones in the UK. The National Health Service (NHS) there had its own contact tracing app early in the pandemic, but last summer, recognizing the limitations of that system, it switched to the framework developed by Apple and Google. As you may recall, the two companies (somewhat surprisingly) came together early on to set up a framework for contact tracing -- and they put privacy front and center in its design, recognizing both (1) the inherent privacy concerns around medical information, and (2) the fact that many people were already skeptical of the two companies.
And, pretty quickly we saw some weird pushback, like the Washington Post whining that the app was too protective of privacy, keeping your health information out of the hands of government officials.
When the UK decided to switch over to Apple and Google's system, it agreed to abide by the privacy rules the two companies established. But it appears the NHS tried to push the boundaries and go beyond that privacy framework. Specifically, under the updated version of the app, if a user tested positive for COVID, the app asked them to upload their "venue" history (all the places they had "checked in" to via the app). But a core part of the privacy setup was that your location info was designed to stay decentralized, on your phone, precisely because uploading people's locations turns the system into a prime surveillance tool. So Google and Apple rejected the updated app.
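To make the decentralized idea concrete, here's a minimal sketch (in Kotlin, purely illustrative -- not the actual NHS app or the Apple/Google framework code, and all class and venue names here are made up) of how on-device venue matching can work: the check-in history never leaves the phone; the app downloads a list of flagged venue IDs and does the comparison locally.

```kotlin
import java.time.LocalDate

// Illustrative sketch only. The point is the data flow: check-in history
// stays on the device, and exposure matching happens locally.

data class CheckIn(val venueId: String, val date: LocalDate)

class LocalVenueHistory {
    private val checkIns = mutableListOf<CheckIn>()   // stored on-device only

    fun record(venueId: String, date: LocalDate) {
        checkIns.add(CheckIn(venueId, date))
    }

    // The phone downloads a list of flagged venues (e.g. published by the
    // health service) and compares it against its own history. Nothing about
    // the user's movements is uploaded anywhere.
    fun exposureWarnings(flaggedVenues: Set<String>): List<CheckIn> =
        checkIns.filter { it.venueId in flaggedVenues }
}

fun main() {
    val history = LocalVenueHistory()
    history.record("venue-cafe-123", LocalDate.of(2021, 4, 10))
    history.record("venue-gym-456", LocalDate.of(2021, 4, 11))

    // Hypothetical list of venues where an exposure was reported.
    val flagged = setOf("venue-gym-456")

    history.exposureWarnings(flagged).forEach {
        println("Possible exposure at ${it.venueId} on ${it.date}")
    }
}
```

The rejected NHS update, as described, inverted that flow by asking users who tested positive to upload their venue history -- exactly the kind of centralized collection the framework's rules were written to prevent.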
And that leads to the BBC piece that explains all of this, but then concludes by complaining about Google and Apple's ability to block this privacy-invasive feature:
What this underlines is that governments around the world have been forced to frame part of their response to the global pandemic according to rules set down by giant unelected corporations.
At a time when the power of the tech giants is under the microscope as never before, that will leave many people feeling uncomfortable.
Really? It seems odd that this should be the point that leaves people feeling uncomfortable. Apple and Google set up rules to help keep everyone's data private. The government tried to violate those rules. Google and Apple said no. If we should feel uncomfortable about anything, it's the government trying to sneak around a clearly established privacy framework.
And, no, governments are not being "forced" to frame part of their response according to rules set down by "giant unelected corporations" (I'm separately unclear who elected the NHS officials working on this app, but alas...). After all, the NHS had its own app before, and decided that the Google/Apple framework was the better one to adopt.
So it's a bizarre stance to argue that this effort to better protect privacy somehow makes those two companies look bad.
The thing that gets me the most about stories like this is that they undermine the stories in which real concerns and real bad behavior are called out. When you automatically lump every action into the "ooooh, evil big company" pile, without asking whether there are legitimate, non-nefarious reasons for it (or, as in this case, whether it's designed to better protect end-user privacy), it becomes that much harder to focus on the real concerns.
Filed Under: contact tracing, covid, privacy, uk
Companies: apple, google