Can You Protect Privacy If There's No Real Enforcement Mechanism?
from the enforcement-matters dept
Privacy laws can have a lot of moving pieces, from notices and disclosures to opt-in and opt-out consent requirements, privacy defaults, and user controls. Over the past few years, there has been significant progress on these issues because privacy advocates, consumer groups, industry voices, and even lawmakers have been willing to dive into definitional weeds, put options on the table, and find middle ground. But this sort of thoughtful debate has not happened when it comes to how privacy laws should be enforced and what should happen when companies screw up, families are hurt, and individuals’ privacy is invaded.
Instead, when it comes to discussing private rights of action and agency enforcement, rigid red lines have been drawn. Consumer groups and privacy advocates say let individuals sue in court -- and call it a day. Business interests, when they talk about “strong enforcement,” often mean letting an underfunded Federal Trade Commission and equally taxed state Attorneys General handle everything. Unfortunately, this binary, absolutist dispute over policing privacy rights threatens to sink any progress on privacy legislation.
It happened in Washington state, which failed to enact a comprehensive privacy framework in March because of a single sentence that could have let some consumers sue to enforce their rights under the state’s general Consumer Protection Act. Private rights of action have stymied state privacy task forces, and the issue is consuming efforts by the Uniform Law Commission to craft a model privacy bill. This is but a microcosm of what we’ve seen at the federal level, where lawmakers are at “loggerheads” over private rights of action.
This impasse is ridiculous. Advocacy groups share some blame here, but industry voices have failed to put any creativity into charting an alternative path forward. Company after company and trade association after trade association have come out in favor of privacy rules, but the response to any concern about how to ensure those rules are followed has been crickets. Few seem to have given much thought to what enforcement could look like beyond driving a Brinks truck full of money up to the FTC. That is not good enough. If industry is serious about working toward clear privacy rules, business interests have two obligations: (1) they should offer up some new ideas to boost enforcement and address legitimate concerns about regulatory limitations and capture; and (2) they need to explain why private rights of action should be a non-starter in areas where businesses already are misbehaving.
First, while we can acknowledge the good work that the FTC (and state Attorneys General) has done, we should also concede that agencies cannot address every privacy problem and have competing consumer protection priorities. Commentators laud the FTC’s privacy work but have not explained how an FTC with more resources would do anything other than more of what it is already doing. These unresolved questions are animating efforts to create an entirely new federal privacy agency (and that’s on top of a proposal in California to set up its own entirely new “Privacy Protection Agency”). Improving the FTC’s privacy posture will require more than additional money and personnel.
Part of this will be creating mechanisms that ensure individuals can get redress. One idea would be to require the FTC to help facilitate complaint resolutions. The Consumer Financial Protection Bureau already does this to some extent with respect to financial products and services. The CFPB welcomes consumer complaints -- and then works with financial companies to get consumers a direct response about problems. These complaints also help the CFPB identify problems and prioritize its work, and the CFPB then publishes (privacy-friendly) complaint data. This stands in contrast to the FTC’s Consumer Sentinel Network, which is a black box to the public.
Indeed, the FTC’s complaint system is opaque even to complainants themselves. The black-box nature of the FTC is, fairly or not, a constant criticism from privacy advocates. At the start of the Trump administration, a group of advocates called for more transparency from the Commission about how it handles complaints and responds to public input. I can speak to this issue, having submitted my own complaint to the FTC about the privacy and security practices of VPNs in 2017. Months later, the FTC put out a brief blog post on the issue, which I took to be the end of the matter on their end. Some sort of dual-track informal and formal complaint process like the Federal Communications Commission’s could be one way to ensure the FTC better communicates with outsiders raising privacy concerns.
These are mostly tweaks to FTC process, however, and while they address some specific complaints about privacy enforcement, they don’t address concerns that regulators have been missing -- or avoiding -- some of the biggest privacy problems we face. This is where the rigid opposition to private rights of action and failure to acknowledge the larger concern is so frustrating.
Sensitive data types present a good example. Unrestrained collection and use of biometrics and geolocation data have become two of the biggest privacy fights of the moment. There has been a shocking lack of transparency or corporate accountability around how companies collect and use this information. Their use could be the key to combating the ongoing pandemic; their misuse, a tool for discrimination, embarrassment, and surveillance. If ever there were data practices where more oversight is needed, these would be it.
Yet, the rapid creep of facial recognition gives us a real-world test case for how agency enforcement can be lacking. Companies have been calling for discussions about responsible deployment of facial recognition even as they pitch this technology to every school, hospital, and retailer in the world -- and Clearview AI just up and ignored existing FTC guidance and state law. Washington state has an existing biometric privacy law, which the state Attorney General admitted has never been the basis of an enforcement action. To my knowledge, the Texas Attorney General has never brought a case under that state’s law either. Meanwhile, the Illinois Biometric Privacy Act (BIPA) may be the one legal tool that can be used to go after companies like Clearview.
BIPA’s private right of action has been a recurring thorn in the sides of major social media companies and theme parks rolling out biometrics technologies, but no one has really cogently argued that companies aren’t flagrantly violating the law. Let’s not forget that facial recognition settings were an underappreciated part of the FTC’s most recent settlement with Facebook, too. However, no one can actually discuss how to tweak or modernize BIPA because industry groups have had a single-minded focus on stripping the law of all its private enforcement components.
Industry has acted in lockstep to insist it is unfair for companies to be subject to limitless liability at the hands of the omnipresent plaintiffs’ bar for every minor or technical violation of the law. And that’s the rub!
There is no rule that says a private right of action must encompass the entirety of a privacy law. One of the compromises that led to the California Consumer Privacy Act was the inclusion of a private right of action for certain unreasonable data breaches. Lawmakers can take heed and go provision-by-provision and specify exactly what sorts of activities could be subject to private litigation, what the costs of the litigation might be, and what remedies can ultimately be obtained.
The U.S. Chamber of Commerce has been at the forefront of insisting that private rights of action are poor tools for addressing privacy issues, because they can “undermine appropriate agency enforcement” and hamper the ability of “expert regulators to shape and balance policy and protections.” But what’s the objection then in areas where that’s not true?
The sharing and selling of geolocation information has become especially pernicious, letting companies infer sensitive health conditions and facilitating stalking. Can any industry voice argue that companies have been well-behaved when it comes to how they use location information? The FTC clearly stated in 2012 that precise geolocation data was sensitive information warranting extra protections. Flash forward to 2018 and 2019, when The New York Times ran annual exposés on the wild west of apps and services buying and selling “anonymous” location data. Meanwhile, the Communications Act requires carriers to protect geolocation data, and yet the FCC fined all four major wireless carriers a combined $200 million for sharing their subscribers’ geolocation data with bounty hunters and stalkers in February of this year.
Businesses do not need regulatory clarity when it comes to location data -- companies need to be put in a penalty box for an extended timeout. Giving individuals the ability to seek private injunctive relief seems hardly objectionable given this track record. Permitting class actions for intentional violations of individuals’ geolocation privacy should be on the table, as well.
There should be more to discuss than a universe where trial attorneys sue every company for every privacy violation or a world where lawmakers hand the FTC a blank check. Unfortunately, no one has yet put forward a vision for what the optimum level of privacy enforcement should be. Privacy researchers, advocates, and vulnerable communities have forcefully said the status quo is not sufficient. If industry claims it understands the importance of protecting privacy but just needs more clarity about what the rules are, companies should begin by putting forward some plans for how they will help individuals, families, and communities when they fall short.
Joseph Jerome, CIPP/US, is a privacy and cybersecurity attorney based in Washington, D.C. He currently is the Director of Multistate Policy for Common Sense Media.
Re: Re: A Response to Mike
Fair that John doesn't say behavioral advertising, but to my understanding, the entire online advertising ecosystem is premised on automating "targeting" and "reach." The opacity of the entire stack invites mischief, so for purposes of John's proposal, I don't think it matters what we're terming the functionality being provided by Google/Facebook and the major ad networks and exchanges.
A Response to Mike
This is hardly a fleshed-out response, but the constant argument that we cannot do X -- whether it's privacy regulation or tweaks to Section 230 -- because it will assuredly further entrench Facebook and Google has become tired and played out.
It also assumes that their underlying business model -- online behavioral advertising (OBA) -- is legitimate, and that we must do what we can to prop up competition that can deliver targeted ads. I think there's mounting evidence that targeted advertising is fundamentally problematic. I'd rather we dramatically increase the costs to Google and Facebook of their business model than prop up equally problematic competition.
When do groups negotiate on this in public?
Very much appreciate the response! So I put some thoughts into a Twitter thread (https://twitter.com/joejerome/status/1273341100154130432), but you may find me generally far more sympathetic to your position than Ernesto.
In general, I think your suggestion to permit some sort of privacy non-profit focused class action is a good one, and might work to formalize and expand the capacity of privacy groups to push for stronger changes within companies. I also think the idea of individualized recourse is a good one -- and really we should be speaking more about "recourse" than "enforcement" generally.
The issue is that industry voices have been largely silent. For the better part of the past year, I've raised at every salon/panel/working group meeting with industry reps what sort of non-PRA type recourse they'd be open to, and this has been largely met with confused silence. I'd like to see more industry voices signal they'd be open to discussing these sorts of ideas.
Now perhaps this is a form of negotiating chicken, but I've seen no evidence that Republican lawmakers are getting any sort of message that there's room for discussing recourse mechanisms beyond the FTC and state AGs. But I haven't seen anything on paper from anyone.
I appreciate that you were responding to my two provocations, but I think it's worth addressing Cam Kerry's proposal around private rights of action as well (https://www.brookings.edu/research/bridging-the-gaps-a-path-forward-to-federal-privacy-legislation/). I'm not sure he's addressed your concerns in full, but it is an attempt to cabin the most trolling of privacy litigation.
(untitled comment)
Very much appreciate the comments!
Mike, I agree that any private enforcement right needs to be scoped such that it isn't overly abused. My solution is to go provision-by-provision in the privacy bill of your choice and have it out as to (1) who should have standing to sue; (2) which party should bear the cost of litigation; (3) what relief could be available; and (4) what the liability rules and burden of proof should be. Cam Kerry and John Morris at Brookings have done yeoman's work on private enforcement in a recent Brookings report: https://www.brookings.edu/research/bridging-the-gaps-a-path-forward-to-federal-privacy-legislation/
Jim, first, I'm entirely to blame for not giving Techdirt some better options for a title. I'm not sure "enforcement" is the best term for what I want more discussion about. The better term might be user recourse. So much of the privacy debate has turned to corporate accountability, but I'd rather know what companies will do when I get harmed.
As you note, that leads to a discussion of what's the "real harm" here, and we'll likely need to agree to disagree. I think there are real dignity interests that have been violated by data-hungry companies, and I'd love to go back to Helen Nissenbaum's notion of contextual integrity. Before that concept was reduced to a sort of disclosure regime, actually having a public discussion about what sorts of data practices should (or should not) be in bounds would have been very useful. I think the whole point of a good privacy law is that lawmakers have to make some value judgments about where the harms lie.
The big problem has been that privacy bills that focus on additional transparency requirements avoid that tough conversation. I would start with secondary uses of sensitive information. Most of the biggest privacy snafus are there. To the extent we're arguing over ad tracking disclosures, we're missing the mark.
(untitled comment)
With all the respect in the world to Kate, my major critique of all pieces like this is that they never describe the sort of privacy regulation that would actually be suitable. It ends with a call for Congress to pass "one set of strong, sensible, and straightforward privacy protections," but the piece has already crapped on both the GDPR and the CCPA as places to start. So what then?
Both of Mike's questions get at the fact that industry groups, even as they call for strong federal privacy laws, don't really want anything that would change the status quo by placing limits around secondary uses of information or limiting monetization avenues. Companies cannot abide that, which is why they cannot get behind a national proposal that would seriously move the ball forward. (Compare, for example, ITIF's proposal for a "grand privacy bargain" with Brookings' recent paper on bridging privacy divides. On the surface, both papers are addressing the same issues and it looks like compromise might be possible, but they're a million miles apart on the stuff that would actually improve privacy.)
While it is unfair to shower the start-up community with huge privacy compliance costs, data-driven start-ups are often the biggest privacy problem children. Clearview says what?