The Tech Policy Greenhouse is an online symposium where experts tackle the most difficult policy challenges facing innovation and technology today. These are problems that don't have easy solutions, where every decision involves tradeoffs and unintended consequences, so we've gathered a wide variety of voices to help dissect existing policy proposals and better inform new ones.

Can You Protect Privacy If There's No Real Enforcement Mechanism?

from the enforcement-matters dept

Privacy laws can have a lot of moving pieces, from notices and disclosures and opt-in and opt-out consent requirements to privacy defaults and user controls. Over the past few years, there has been significant progress on these issues because privacy advocates, consumer groups, industry voices, and even lawmakers have been willing to dive into the definitional weeds, put options on the table, and find middle ground. But this sort of thoughtful debate has not happened when it comes to how privacy laws should be enforced and what should happen when companies screw up, families are hurt, and individuals’ privacy is invaded.

Instead, when it comes to discussing private rights of action and agency enforcement, rigid red lines have been drawn. Consumer groups and privacy advocates say let individuals sue in court -- and call it a day. Business interests, when they talk about “strong enforcement,” often mean letting an underfunded Federal Trade Commission and equally taxed state Attorneys General handle everything. Unfortunately, this binary, absolutist dispute over policing privacy rights threatens to sink any progress on privacy legislation.

It happened in Washington state, which failed to enact a comprehensive privacy framework in March because of a single sentence that could have let some consumers sue to enforce their rights under the state’s general Consumer Protection Act. Private rights of action have stymied state privacy task forces, and the issue is consuming efforts by the Uniform Law Commission to craft a model privacy bill. This is but a microcosm of what we’ve seen at the federal level, where lawmakers are at “loggerheads” over private rights of action.

This impasse is ridiculous. Advocacy groups share some blame here, but industry voices have shown no creativity in charting an alternative path forward. Company after company and trade association after trade association have come out in favor of privacy rules, but the response to any concern about how to ensure those rules are followed has been crickets. Few seem to have given much thought to what enforcement could look like beyond driving a Brinks truck full of money up to the FTC. That is not good enough. If industry is serious about working toward clear privacy rules, business interests have two obligations: (1) they should offer up some new ideas to boost enforcement and address legitimate concerns about regulatory limitations and capture; and (2) they need to explain why private rights of action should be a non-starter in areas where businesses already are misbehaving.

First, while we can acknowledge the good work that the FTC (and state Attorneys General) has done, we should also concede that agencies cannot address every privacy problem and have competing consumer protection priorities. Commentators laud the FTC’s privacy work but have not explained how an FTC with more resources would do anything other than more of what it is already doing. These unresolved concerns are animating efforts to create an entirely new federal privacy agency (and that’s on top of a proposal in California to set up its own entirely new “Privacy Protection Agency”). Improving the FTC’s privacy posture will require more than additional money and personnel.

Part of this will be creating mechanisms that ensure individuals can get redress. One idea would be to require the FTC to help facilitate complaint resolutions. The Consumer Financial Protection Bureau already does this to some extent with respect to financial products and services. The CFPB welcomes consumer complaints -- and then works with financial companies to get consumers a direct response about their problems. These complaints also help the CFPB identify problems and prioritize its work, and the CFPB then publishes (privacy-friendly) complaint data. This stands in contrast to the FTC’s Consumer Sentinel Network, which is a black box to the public.

Indeed, the FTC’s complaint system is opaque even to complainants themselves. The black box nature of the FTC is, fairly or not, a constant criticism from privacy advocates. At the start of the Trump administration, a group of advocates called for more transparency from the Commission about how it handles complaints and responds to public input. I can speak to this issue: in 2017, I submitted my own complaint to the FTC about the privacy and security practices of VPNs. Months later, the FTC put out a brief blog post on the issue, which I took to be the end of the matter on their end. Some sort of dual-track informal and formal complaint process like the Federal Communications Commission’s could be one way to ensure the FTC better communicates with outsiders raising privacy concerns.

These are mostly tweaks to FTC process, however, and while they address some specific complaints about privacy enforcement, they don’t address concerns that regulators have been missing -- or avoiding -- some of the biggest privacy problems we face. This is where the rigid opposition to private rights of action and failure to acknowledge the larger concern is so frustrating.

Sensitive data types present a good example. Unrestrained collection and use of biometrics and geolocation data have become two of the biggest privacy fights of the moment. There has been a shocking lack of transparency or corporate accountability around how companies collect and use this information. Their use could be the key to combating the ongoing pandemic; their misuse, a tool for discrimination, embarrassment, and surveillance. If ever there were data practices where more oversight is needed, these would be it.

Yet the rapid creep of facial recognition gives us a real-world test case for how agency enforcement can fall short. Companies have been calling for discussions about the responsible deployment of facial recognition even as they pitch the technology to every school, hospital, and retailer in the world -- and Clearview AI just up and ignored existing FTC guidance and state law. Washington state has an existing biometric privacy law, which the state Attorney General admitted has never been the basis of an enforcement action. To my knowledge, the Texas Attorney General also has never brought a case under that state’s law. Meanwhile, the Illinois Biometric Information Privacy Act (BIPA) may be the one legal tool that can be used to go after companies like Clearview.

BIPA’s private right of action has been a recurring thorn in the side of major social media companies and theme parks rolling out biometric technologies, but no one has really cogently argued that companies aren’t flagrantly violating the law. Let’s not forget that facial recognition settings were an underappreciated part of the FTC’s most recent settlement with Facebook, too. However, no one can actually discuss how to tweak or modernize BIPA because industry groups have had a single-minded focus on stripping the law of all its private enforcement components.

Industry has acted in lockstep to insist it is unfair for companies to be subject to limitless liability from the omnipresent plaintiffs’ bar for every minor or technical violation of the law. And that’s the rub!

There is no rule that says a private right of action must encompass the entirety of a privacy law. One of the compromises that led to the California Consumer Privacy Act was the inclusion of a private right of action for certain unreasonable data breaches. Lawmakers can take heed and go provision by provision, specifying exactly what sorts of activities could be subject to private litigation, what the costs of that litigation might be, and what remedies can ultimately be obtained.

The U.S. Chamber of Commerce has been at the forefront of insisting that private rights of action are poor tools for addressing privacy issues, because they can “undermine appropriate agency enforcement” and hamper the ability of “expert regulators to shape and balance policy and protections.” But what’s the objection then in areas where that’s not true?

The sharing and selling of geolocation information has become especially pernicious, letting companies infer sensitive health conditions and facilitating stalking. Can any industry voice argue that companies have been well-behaved when it comes to how they use location information? The FTC clearly stated in 2012 that precise geolocation data was sensitive information warranting extra protections. Fast forward to 2018 and 2019, when The New York Times ran annual exposés on the wild west of apps and services buying and selling “anonymous” location data. Meanwhile, the Communications Act requires carriers to protect geolocation data, and yet in February of this year the FCC fined all four major wireless carriers a combined $200 million for sharing their subscribers’ geolocation data with bounty hunters and stalkers.

Businesses do not need regulatory clarity when it comes to location data -- companies need to be put in a penalty box for an extended timeout. Given this track record, giving individuals the ability to seek private injunctive relief seems hardly objectionable. Permitting class actions for intentional violations of individuals’ geolocation privacy should be on the table as well.

There should be more to discuss than a universe where trial attorneys sue every company for every privacy violation or a world where lawmakers hand the FTC a blank check. Unfortunately, no one has yet put forward a vision for what the optimum level of privacy enforcement should be. Privacy researchers, advocates, and vulnerable communities have forcefully said the status quo is not sufficient. If industry claims it understands the importance of protecting privacy but just needs more clarity about what the rules are, companies should begin by putting forward some plans for how they will help individuals, families, and communities when companies fall short.

Joseph Jerome, CIPP/US, is a privacy and cybersecurity attorney based in Washington, D.C. He currently is the Director of Multistate Policy for Common Sense Media.


Filed Under: enforcement, ftc, privacy, privacy laws, private rights of action


Reader Comments



  • Koby (profile), 29 May 2020 @ 2:00pm

    The Deception Option

    One nefarious possibility might be to make it legal to provide false identity information to some corporations. People might be able to pay (in cash!) to get a fake ID card made, and buy a cell phone. The carrier is creating a profile, but for whom? Continue to pay the subscription with one of those prepaid debit cards. And then burn that phone after about a year, and get a new one. Of course, law enforcement won't like this, because of course they are also taking advantage of the lack of privacy. But without enforcement, we might have an option to fight back, if some minor tweaks to existing law are made.


  • Anonymous Coward, 29 May 2020 @ 2:13pm

    Yeah, the lawsuits have played out over a long time.

    Scientific expertise and rational prohumanity "agency" are required to solve the network design problem.

    I'm not hopeful.


  • Anonymous Coward, 29 May 2020 @ 2:15pm (flagged by the community)

    there oughta be a law

    quite a screed

    author puts astonishing faith in law and strict enforcement to address his pet peeve; must be highly ignorant of how lawmaking and enforcement actually work in the real world.


    • Anonymous Anonymous Coward (profile), 29 May 2020 @ 3:12pm

      Re: there oughta be a law

      "Joseph Jerome, CIPP/US, is a privacy and cybersecurity attorney based in Washington, D.C. He currently is the Director of Multistate Policy for Common Sense Media."

      Then you would be wrong. Try reading the whole article. All of it, the next time.


      • Anonymous Coward, 29 May 2020 @ 7:58pm

        Re: Re: there oughta be a law

        I think they were reading something else, maybe in the theatre of their head, since they mentioned a "screed".


  • Upstream (profile), 29 May 2020 @ 5:00pm

    "companies need to be put in a penalty box for an extended timeout."

    No, the CEOs, COOs, presidents, vice presidents, etc. need to be put in a penalty box for an extended timeout.

    It is unfortunate, but many people don't care about their company, stockholders or whatever, as long as they can count on their golden parachutes. Prison or poverty are the only things that mean anything to these types of people. And by poverty, I mean "living in public housing and bagging groceries for a living" type poverty. Fining companies is usually meaningless, and giving victims the "opportunity" to waste time, and possibly spend lots of money, over many years of civil litigation is equally nonsensical.


  • That Anonymous Coward (profile), 29 May 2020 @ 5:06pm

    "Your privacy is important to us"
    The check is in the mail, the government is your friend, and no baby I won't something something in your mouth...

    Little lies we all seem to accept, & refuse to demand change.

    Data Broker collects your entire life & leaks it all over...
    oooh credit monitoring, faux apology, & you get the bill to fix your life.
    Fine to the company... less than $5 a person who got screwed.

    Cory Doctorow has compared our personal data to toxic waste, something that should be locked away because allowing it out will cause huge harm. The downside is we keep accepting the "Superfund" solution where we let it leak, slap the leaker on the wrist, & all the costs for cleanup are paid by the public.

    These companies are making huge amounts from collecting it all, but it seems the law wants to protect their profits over us. We need to rethink how this works & put people first, profits maybe third.


  • Anonymous Coward, 29 May 2020 @ 5:30pm (flagged by the community)

    They still don't have a good way to deal with the foreign fugitive hostage-takers problem.

    People who show up in regular courts and do reasonable things aren't the problem.


  • Anonymous Coward, 29 May 2020 @ 6:39pm

    "Then, on top of that, I think you'd really need to have an anti-SLAPP like mechanism to ward off frivolous/vexatious/nuisance suits. But I haven't seen anyone talk about such a feature within the privacy context."

    One of the problems here is the requirements to bring a "valid" privacy lawsuit.

    Scenario A is where someone has managed to catch a company red-handed, through whistleblowing, testimony in another lawsuit, etc. The evidence is already in hand, so the fact that a SLAPP-like mechanism cuts discovery short would not affect this scenario.

    Scenario B is where the plaintiff only has indirect evidence of privacy problems, e.g. "I have only ever supplied that particular (name+address, or email, or phone, or whatever) to (defendant)." While it can be clear that the defendant is involved, there is no evidence that the defendant acted improperly. The plaintiff has a proper complaint, but without discovery, they have no hope of proceeding.

    ...Unless the defendant is held to a fiduciary-equivalent standard. Is THAT happening?


  • Anonymous Coward, 29 May 2020 @ 7:51pm

    I imagine it would rather be like enforcing environmental regulations, or anything else designed to protect the general populace.


  • Anonymous Coward, 29 May 2020 @ 8:08pm

    The U.S. Chamber of Commerce has been at the forefront of insisting that private rights of action are poor tools for addressing privacy issues, because they can “undermine appropriate agency enforcement” and hamper the ability of “expert regulators to shape and balance policy and protections.”

    I'm just going to straight-up go for an argumentum ad hominem-style thought and say, "As if anyone should listen to a Chamber of commerce, ever".

    Since i listened anyway... If there were agency enforcement, people probably would not be suing, particularly if said putative agency were to extract proper compensation as needed and deliver it to harmed parties. This generally doesn't happen anyway, even if violators are somehow punished and forced to comply, so private rights of action only make sense. On the second note, what expert regulators, and assuming there are any which aren't simply coreligionists or puppets of the corporations, why have they not done anything up to now?

    Thanks, U.S. Chamber of Commerce, you're as useful as ever.

    That said, i think Mike is correct in his comment that, "I think you'd really need to have an anti-SLAPP like mechanism to ward off frivolous/vexatious/nuisance suits." But then, i think there should be a mechanism like that for all lawsuits.


  • ECA (profile), 30 May 2020 @ 2:21pm

    In all of this...

    Over the years there have been a few problems..
    Few, if any, know who to contact or complain to if there is a problem.
    It's like the old complaint, 'It's made in China, so it's crap', and I suggest to them that the US company over there asked China to make it to THEIR spec.. then brought it to you and sold it. Who is at fault?

    How many gov. consumer protection agencies are there, and what's their number? Oh! I forgot, we have the internet.. And the gov.-made sites KINDA suck.


