The Tech Policy Greenhouse is an online symposium where experts tackle the most difficult policy challenges facing innovation and technology today. These are problems that don't have easy solutions, where every decision involves tradeoffs and unintended consequences, so we've gathered a wide variety of voices to help dissect existing policy proposals and better inform new ones.

Unbridled Surveillance Will Not Save Us From COVID-19

from the surveillance-in-the-times-of-a-pandemic dept

We all share the fervent desire to reopen society, to hug our friends and loved ones, to jump start the economy, and to return to the many activities that have been off limits since COVID-19 engulfed our communities.

For many, there may be a temptation to turn to invasive technologies – from temperature screening devices to contact tracing apps – that promise to stem the virus’ spread while permitting us to return to our normal routines. Many of these technologies collect the intimate details of our lives: our health status and symptoms, our associations, our locations and movements, and in some cases, even the details of our faces.

Surveillance technologies are not panaceas, and without appropriate safeguards and community trust, many technologies will cause more harm than good. In fact, some surveillance tech is simply public health theater that offers a false sense of security and provides no actual protection from the coronavirus.

When your employer, your gym, your local grocery store, or your local government suggests a new COVID surveillance gadget, here are some questions to ask, as well as some answers to keep in your back pocket.

Does it work?

Tucked into this question is another, threshold question: what does it mean to “work”? What is the goal the technology aims to achieve? What metrics will be used to measure effectiveness? What level of false positives or false negatives will be tolerated? These questions are best answered in conjunction with public health experts. In the meantime, here is what we know about some of the most popular technologies out there:

Temperature Screening

Putting aside the remarkable variability in accuracy of various temperature screening devices (pro-tip: standoff fever detectors are particularly unreliable), using elevated temperature as a proxy for COVID-19 status is both woefully under- and over-inclusive.

COVID-19 is contagious before symptoms appear, and many people remain asymptomatic for the entire course of infection. Others may suppress a fever by taking Tylenol or ibuprofen. The fact that an individual lacks a fever does not mean that individual is COVID-negative.

At the same time, many individuals run a fever because of conditions that have nothing to do with COVID and are not contagious, such as cancer, urinary-tract infections, or simply stress. When temperature screens are used to determine who can return to work or enter a store or a dentist’s office, healthy – or at least non-contagious – individuals will be excluded from participation in society.
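To make the under- and over-inclusiveness concrete, here is a rough back-of-the-envelope sketch. Every number in it is hypothetical and chosen purely for illustration; the real figures depend on the device, the population being screened, and local prevalence, and should come from public health experts.

    # Back-of-the-envelope sketch of why a fever check is a weak proxy for
    # COVID-19 status. All numbers are hypothetical, for illustration only.
    prevalence = 0.01        # assumed share of screened people who are infectious
    p_fever_if_covid = 0.40  # assumed share of infectious people showing a fever
    p_fever_if_not = 0.02    # assumed share of non-infectious people with a fever
                             # (cancer, urinary-tract infections, stress, ...)

    # Bayes' rule: probability that a person flagged by the scanner is infectious.
    p_flagged = prevalence * p_fever_if_covid + (1 - prevalence) * p_fever_if_not
    ppv = prevalence * p_fever_if_covid / p_flagged

    # Share of infectious people the scanner waves straight through.
    missed = 1 - p_fever_if_covid

    print(f"Chance a flagged person is actually infectious: {ppv:.0%}")    # ~17%
    print(f"Share of infectious people not flagged at all:  {missed:.0%}") # 60%

Under those assumed numbers, most of the people turned away are not contagious, while most contagious people walk straight through, which is exactly what public health theater looks like.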

Technology-Assisted Contact Tracing Apps

Technology-assisted contact tracing apps broadly fall into two categories: those that rely on cell phone location information, and those that rely on Bluetooth proximity tracing. The former is both extremely invasive, because where you go says a lot about who you are, and likely to be ineffective for contact tracing, because the location information cell phones generate is not precise enough to determine whether two individuals are sufficiently close to risk exposure. The same is true of the location information advertisers and data brokers have been volunteering to national, state, and local governments since the pandemic began.

By contrast, Bluetooth proximity tracing, if done right, can be achieved without revealing location information, associations, or even the identities of the individuals involved. (For a deep dive on Bluetooth proximity tracing, check out this whitepaper.)
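For a sense of how that can work, here is a minimal sketch of a decentralized design in the spirit of the whitepaper. The token format, rotation behavior, and matching rules below are simplified assumptions for illustration, not the protocol of any particular app.

    # Simplified illustration of decentralized Bluetooth proximity tracing.
    # Real protocols add cryptographic key derivation, time windows, and
    # signal-strength thresholds; the details here are assumptions.
    import os
    import time

    def new_token() -> bytes:
        """A random ephemeral ID that reveals nothing about identity or location."""
        return os.urandom(16)

    class Device:
        def __init__(self):
            self.my_tokens = []        # (timestamp, token) pairs this phone broadcast
            self.heard_tokens = set()  # tokens overheard from nearby phones

        def broadcast(self) -> bytes:
            token = new_token()        # a fresh random token is generated periodically
            self.my_tokens.append((time.time(), token))
            return token

        def hear(self, token: bytes) -> None:
            # Stored locally on the phone; never uploaded.
            self.heard_tokens.add(token)

        def check_exposure(self, published_tokens: set) -> bool:
            # Someone who tests positive can voluntarily publish their own random
            # tokens; everyone else checks for a match on their own device.
            return bool(self.heard_tokens & published_tokens)

    # Two phones near each other exchange tokens over Bluetooth.
    alice, bob = Device(), Device()
    bob.hear(alice.broadcast())

    # Later, Alice tests positive and publishes only her random tokens.
    published = {token for _, token in alice.my_tokens}
    print(bob.check_exposure(published))  # True, without revealing who or where

Because only random tokens ever leave a phone, and matching happens on the device, whoever hosts the published token list learns neither who was near whom nor where the encounter happened.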

At the same time, even Bluetooth proximity tracing cannot determine whether two individuals within six feet of each other were, in fact, separated by a wall, nor, of course, can any technology capture when COVID might move from one individual to another by temporarily resting on surfaces that are handled by multiple people.

Who is being left out?

Many of the people in communities that are most vulnerable to coronavirus are among the least likely to have a smartphone capable of running a contact tracing app. For example, over 40 percent of those over 65 do not own a smartphone, yet the 65-and-over population accounts for more than 75 percent of COVID-related deaths.

Nearly 30 percent of those who earn less than $30,000 annually lack a smartphone; these individuals are also more likely to be frontline workers who must endure increased COVID exposure simply to make a living. Similarly, people with disabilities are 20 percent less likely to own a smartphone than the general population. Although these individuals are not more likely than others to contract the coronavirus, the virus may be more dangerous for them because of their underlying health conditions.

Even those who do own a smartphone may not have the know-how to use a contact tracing app.

Armed with this knowledge, some countries supplement contact tracing apps with credit card transaction histories and closed-circuit video footage. But credit card transaction records will not reach those who pay cash or the unbanked, who are disproportionately poor and people of color.

The idea of running video footage through facial recognition software to identify individuals is particularly pernicious; such systems are notoriously bad at recognizing women and Black people at a time when Black people are among those disproportionately likely to suffer from COVID-19.

Between the technological flaws and the people who will be left behind by tech solutions, there is a substantial risk that relying too much on technology could lull individuals into a false sense of security and undercut more effective COVID-prevention measures. For these reasons, it is imperative that any technological intervention be coupled with well-designed analog measures, such as traditional contact tracing, robust access to testing and treatment, support for those who need to isolate at home, the availability of PPE, and social distancing.

Who is being harmed?

Even when traditional contact tracing techniques are used, there are myriad individuals – such as undocumented immigrants, LGBTQ youth who come from unsafe homes, people who live in apartments with more people than they have on the lease, survivors of sexual violence and domestic violence – who could be at risk if their location, associations, or health status is released.

Without proper safeguards, such as those that accompany many Bluetooth proximity tracing apps, the introduction of contact tracing technologies and surveillance technologies simply ups the ante by permitting more of this information to be collected and pooled more rapidly, creating treasure troves for data thieves and law or immigration enforcement.

Other technologies are equally pernicious. For example, imprecise technologies too often become excuses for racial profiling: when risk-detection systems produce ambiguous or unreliable results, their operators fill the vacuum with their own judgments.

There is reason to believe devices like standoff temperature scanners will produce similar biases and misuse. And, just last week, the world learned that an all-too-predictable facial recognition mismatch led to the false arrest of a Black man, turning his life and his family’s lives upside down.

Given the profound risks of harm here, it is imperative that participation in any technology-assisted COVID mitigation be voluntary. That means important public benefits, like food stamps or housing assistance, must not be conditioned on the adoption of any particular surveillance tech, nor should such tech be a condition of employment or of access to public transportation or other essential services.

If temperature scanners are to be used at the gateways to businesses, doctor’s offices, or public transportation, they must be of the more accurate kind: clinical-grade, one-to-one devices that are properly operated. And anyone who is turned away must be provided with an alternate means of accessing the service.

This is important, because individuals are in the best position to judge their own circumstances and safety needs. Moreover, public health experts frequently find that coercive health measures backfire, because a distrustful public is likely to resist participation.

What legal and technological safeguards are there to mitigate harm?

Perhaps the most important way to build public trust and encourage individuals to voluntarily participate in contact tracing is to build in the appropriate legal and technical safeguards.

Unfortunately, the law in this area still comes up short. We have no nationwide law governing privacy in the digital age that might regulate some of these technologies. In my home state of New York, our Governor has been insisting that the Health Insurance Portability and Accountability Act (HIPAA) covers contact tracing information.

But it is not clear that HIPAA applies to traditional contact tracers, and it is pretty clear that it does not apply to many of the technological COVID interventions. Moreover, the law contains numerous exceptions that permit law enforcement to access a person’s HIPAA-covered information without their consent.

To fill this gap – at least for contact tracing information (the analog kind and the technological kind) – here in New York, a broad coalition that includes public defenders, health care providers, and civil rights, privacy, health care, and immigration advocates is working to pass contact tracing confidentiality legislation. The bill ensures that contact tracing information will be kept confidential, will only be used for contact tracing purposes, and will be deleted once its purpose has been served.

Importantly, the bill permits the use of aggregate, de-identified information to track the spread of the virus and to identify disparities among New York communities.

And, most crucially, it prevents law enforcement and immigration enforcement from acting as contact tracers or accessing contact tracing information. It also makes clear that a person’s contact tracing information cannot be used against them in a court or administrative proceeding.

Law and immigration enforcement access was an obvious place to start building in privacy protections. These authorities have, time and time again, given New Yorkers, particularly Black and Brown communities – the very communities hardest hit by COVID-19 – reason for distrust. One need only look at the brutal law enforcement reaction to the ongoing protests to understand why. If individuals have any reason to believe that sharing these details of their lives will expose them or their loved ones to criminalization or deportation, they simply will not participate.

The risks associated with law enforcement participation in contact tracing are not conjecture. In response to the recent protests in Minnesota, law enforcement there began using contact tracing techniques to track protesters – and public health officials immediately lamented that the police’s activities hampered their efforts to build trust and participation in contact tracing.

Here in New York State, sheriffs’ departments have been deputized as contact tracers in Nassau County and Erie County. And, in New York City, by the time the contact tracing program had identified 5,000 cases, 85 percent of them had a phone number, and contact tracers reached 94 percent of those individuals, but only 1,800 shared their contacts, underscoring the distrust New Yorkers feel about contact tracing.

The contact tracing confidentiality legislation is a start to building in the legal safeguards that must undergird any technology-assisted coronavirus intervention. There is certainly space for additional legislation, and app and device developers also have a role to play: they should be building robust privacy protections into both their products and their terms of service.

And, of course, any technological interventions must be time-limited to the current pandemic. Already, some participants in the industry are endeavoring to entrench the technologies for all time. As one manufacturer wrote, “Just like 9/11 and how it impacted and changed air travel forever, this too will change the way we live and work for a long time to come.”

If that sounds Orwellian, it should. It’s not hard to imagine, for example, a network of thermal cameras deployed during COVID-19 being repurposed to conduct suspicionless thermal body searches – perhaps to identify those suspected of drug use.

Finally, members of the most impacted communities must be involved in contact tracing, as well as in developing the technologies that will be used to mitigate COVID-19. These individuals are more likely to understand and serve their communities’ needs.

Just as community members have been more effective at convincing their neighbors to wear masks and adhere to social distancing, community members are more likely than outsiders to convince their neighbors to identify their contacts, to get tested, to self-quarantine when necessary, and to adopt new COVID-era tech when appropriate.

***

We all want to safely re-open our communities. As we contemplate which technologies to employ to help us do that, we must remember that many of these technologies offer a devil’s bargain: the illusion of safety in return for the intimate details of your life – your health status, your associations, and your location and movements.

We should be careful about which technologies we choose to adopt, and we must put in place appropriate privacy protections to build community trust and ensure safety.

These protections are not just privacy and civil rights necessities; they are public health imperatives.

Allie Bohm is a policy counsel at the New York Civil Liberties Union, focusing on legislative and government affairs. She has deep expertise on women’s rights and privacy and technology. She also advocates on the full range of the NYCLU’s issues.



Filed Under: covid-19, greenhouse, pandemic, privacy, surveillance


Reader Comments




  • Upstream (profile), 7 Jul 2020 @ 12:44pm

    It may be too late for contact tracing to be a viable concept

    I am not sure if this is true or not, but it does make sense. At some point you have so many infections that even with a theoretical ideal contact tracing system the answer would be "Everyone has potentially been exposed."


    • Anonymous Coward, 7 Jul 2020 @ 7:36pm

      Re: It may be too late for contact tracing to be a viable concept

      At some point you have so many infections that even with a theoretical ideal contact tracing system the answer would be "Everyone has potentially been exposed."

      The point isn't to identify infected individuals. The point is to make Joe Sixpack feel better about companies and the government ignoring the virus and have something readily available to point to for political arguments against ignoring it and reopening. To have an excuse to handwave away concerns about spreading COVID. "Oh, well the app will tell me if I get too close to the infected. So I can do as I like otherwise. Responsibility? What's that?" After all, Joe Sixpack desperately needs to work. It's not like he's a multinational corporation and can just ask the government for bail money. He's got bills to pay, and bootstraps to pull!

      It also just so happens to allow the government, and its corporate backers, even more invasion into the lives of everyday people. After all, what authoritarian nightmare wouldn't like to mandate government-sponsored monitoring software on everyone's devices? Do you honestly think that once COVID-19 is over they would allow the public to remove that software? Hell, $10.00 says they try to claim it as a new "Digital National ID Card" or some such, and that the mandatory tracking was DRM to prevent "unauthorized copying" of them. (Real ID Act maybe?)


  • Anonymous Coward, 7 Jul 2020 @ 5:14pm

    Another thing with using temperature as a proxy for fever, as a proxy for illness, as a proxy for COVID infection, is that 98.6 °F is hardly the normal core temperature for everyone.

    You may have some involved with contact tracing looking to help individuals and society as a whole, but too many want to or have already co-opted this to expand the surveillance state and surveillance capitalism.


    • Anonymous Coward, 8 Jul 2020 @ 9:28am

      Re:

      To quote Wikipedia:

      Normal human body temperature varies slightly from person to person and by the time of day. Consequently, each type of measurement has a range of normal temperatures. The range for normal human body temperatures, taken orally, is 36.8±0.5 °C (98.2±0.9 °F). This means that any oral temperature between 36.3 and 37.3 °C (97.3 and 99.1 °F) is likely to be normal.

      In the 19th century, most books quoted "blood heat" as 98 °F, until a study published the mean (but not the variance) of a large sample as 36.88 °C (98.38 °F). Subsequently, that mean was widely quoted as "37 °C or 98.4 °F" until editors realised 37 °C is equal to 98.6 °F, not 98.4 °F. The 37 °C value was set by German physician Carl Reinhold August Wunderlich in his 1868 book, which put temperature charts into widespread clinical use. Dictionaries and other sources that quoted these averages did add the word "about" to show that there is some variance, but generally did not state how wide the variance is.


  • Anonymous Coward, 8 Jul 2020 @ 7:43am

    even the details of our faces

    not to be pedantic, but you misspelled feces.


