Posted on Techdirt Greenhouse - 15 June 2020 @ 12:00pm

Can You Build A Privacy Law That Doesn't Create Privacy Trolls?

from the finding-the-balance dept

It’s safe to assume that most people recognize patent trolling as a problem, one that arises from our uniquely inefficient legal system. If we were given the opportunity to redesign patent litigation rules from scratch, why would anyone intentionally create a system so costly and uncertain that bad-faith actors have a financial incentive to file weak lawsuits against cash-strapped startups, for whom paying an extortive settlement is cheaper than actually winning the case? And yet, some privacy advocates are openly calling for policymakers to replicate this disastrous state of affairs through a new regime of private enforcement for proposed state and federal comprehensive consumer privacy laws.

In two recent posts on the Greenhouse, Joe Jerome and Ernesto Falcon outlined the debate about how best to enforce privacy laws, arguing that private rights of action in some form are necessary to make companies take consumer privacy seriously and that industry’s preference for public agency enforcement reflects a desire to make privacy rules functionally toothless. While I disagree with their ultimate conclusion, I understand the skepticism about leaving enforcement to chronically underfunded regulatory agencies with a subpar track record of curtailing privacy violations. I also agree with Jerome’s conclusion that a strict private vs. public enforcement dichotomy has created a “ridiculous impasse.” As an advocate for startups, I’m just the sort of industry representative positioned to answer his call for a more nuanced conversation around privacy enforcement:

If industry is serious about working toward clear privacy rules, business interests have two obligations: (1) they should offer up some new ideas to boost enforcement and address legitimate concerns about regulatory limitations and capture; and (2) they need to explain why private rights of action should be a non-starter in areas where businesses already are misbehaving.

I’ll address the second request first.

Simply put, comprehensive consumer privacy laws of the kind passed in recent years at the state level and currently under consideration at the federal level are uniquely susceptible to abusive private litigation. These laws feature all the hallmarks of legal regimes with trolling problems: damages wholly unrelated to any actual harm suffered, ambiguous liability standards that preclude early dismissal of weak claims, and high discovery costs that fall disproportionately on defendants.

Privacy harms are under-enforced in large part because, generally speaking, the U.S. legal system allows plaintiffs to recover only to the extent they’ve suffered economically quantifiable injuries. Privacy injuries, on the other hand, tend to be intangible. It may be morally reprehensible for a company to disseminate a user’s browsing history, but it’s unlikely to have caused the user any direct economic injury. Since people are usually somewhat economically rational, they don’t often file lawsuits that are likely to cost significantly more than they can recover. Consequently, the efficacy of any private enforcement regime for intangible privacy harms hinges on the availability of statutory damages. Indeed, the California Consumer Privacy Act and other proposed state laws offer statutory damages to plaintiffs: up to $750 per aggrieved user in data breach cases under the CCPA, regardless of whether those users suffered any economic harm at all. As anyone familiar with copyright law knows, high statutory damages awards make litigation incredibly lucrative for unscrupulous plaintiffs. And since privacy laws tend to support class litigation, private actions for privacy harms have the potential to be enormously high-stakes, incentivizing plaintiffs to bring whatever claims they can, regardless of how substantively weak they may be.
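To make the stakes concrete, here’s a back-of-the-envelope sketch in Python. The breach size is hypothetical; the $100–$750 per-user band is the CCPA’s statutory damages range for data breach claims, which applies whether or not class members suffered any actual economic loss.

```python
# Illustrative sketch only: potential statutory damages exposure under a
# CCPA-style regime. The class size is hypothetical; the per-user range is
# the CCPA's ($100-$750 per consumer per incident, no actual harm required).
PER_USER_MIN = 100
PER_USER_MAX = 750

def exposure(class_size: int) -> tuple[int, int]:
    """Statutory damages range for a breach affecting class_size users."""
    return class_size * PER_USER_MIN, class_size * PER_USER_MAX

low, high = exposure(100_000)  # a modest breach by modern standards
print(f"Potential exposure: ${low:,} to ${high:,}")
# Potential exposure: $10,000,000 to $75,000,000
```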

Of course, the potential for massive judgments alone doesn’t create a trolling problem. If a plaintiff bringing a meritless claim faces possible sanctions or high litigation costs, the expected value of a weak lawsuit decreases alongside its likelihood of success. But under the American legal system, each party pays its own litigation costs, and sanctions for bad behavior are vanishingly rare. As such, a defendant is better off paying a settlement at any value lower than the cost of defense, even if the lawsuit is effectively meritless.
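That settlement logic is easy to put into numbers. The sketch below uses entirely hypothetical figures to show why a defendant facing a claim that is overwhelmingly likely to fail can still rationally pay a seven-figure settlement: the rational ceiling is the cost of defense plus the small chance of a catastrophic class-wide judgment.

```python
# Illustrative sketch only: the settle-vs-defend decision under the American
# rule, where each side pays its own costs win or lose. All figures are
# hypothetical.
defense_cost = 500_000          # fees and discovery to litigate to victory
win_probability = 0.95          # the claim is substantively weak
exposure_if_loss = 75_000_000   # class-wide statutory damages if it loses

# Fighting costs the defense fees no matter the outcome, plus a small chance
# of a catastrophic judgment.
expected_cost_of_fighting = defense_cost + (1 - win_probability) * exposure_if_loss

# Any settlement demand below this figure is the cheaper, rational choice.
print(f"Rational settlement ceiling: ${expected_cost_of_fighting:,.0f}")
# Rational settlement ceiling: $4,250,000
```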

Fortunately for plaintiffs’ attorneys, privacy litigation is likely to be incredibly expensive for defendants, making extortive settlements a lucrative business model. New comprehensive consumer privacy laws are, as the name suggests, expansive. In order to cover as many business models, technologies, and unforeseen situations as possible, these laws are typically written with very general liability standards. Under the CCPA, for example, a defendant company is liable for a data breach if it failed to implement “reasonable security procedures.” What sorts of security practices are reasonable? The law doesn’t provide a definition, so the only way a company can ever know whether its security practices are legally reasonable is to have a judge or jury so declare at the end of a lawsuit. Thus, even if a company has the best possible security measures in place but nevertheless suffers a data breach, it will have to spend massive sums of money to get rid of a lawsuit alleging that its security was insufficient. Vexatious plaintiffs will quickly figure this out and file lawsuits any time a company suffers a data breach, without regard to whether users suffered injury or whether the company did anything wrong.

The same pattern plays out again and again whenever high litigation costs and ambiguous liability standards prevent early dismissal of bad-faith lawsuits: stock-drop lawsuits, patent trolling, TCPA abuse, FCRA litigation; the list goes on. Given the expansive scope of comprehensive privacy laws, a private right of action in this context could create a trolling problem that dwarfs anything we’ve seen before. Patent trolling is limited in some sense by the number of patents available, whereas virtually any activity involving consumer information could spark a privacy trolling lawsuit. In the first year of the GDPR, European data regulators received more than 200,000 complaints, nearly 65,000 of which involved data breaches. Under U.S. rules, enterprising plaintiffs’ attorneys would have every incentive to turn all of these into lawsuits, regardless of their merits.

There are all sorts of theoretical reasons why centralizing enforcement power in a well-functioning expert regulatory agency is the optimal way to enforce privacy law effectively. For one, privacy law involves highly technical issues beyond the expertise of most judges. Similarly, since privacy harms like data breaches typically impact a large, geographically dispersed population without direct access to the underlying facts, a central governmental regulator is better positioned to gather the information necessary to bring successful enforcement actions. But I take the point that leaving enforcement solely to federal and state regulators is likely to result in some under-enforcement, even with the budget increases that virtually everyone supports.

Thankfully, designing an optimal privacy enforcement regime doesn’t come down to a binary choice between relying exclusively on underfunded (and potentially captured) central regulators and creating a free-for-all of dubious private litigation, which brings me to my response to Jerome’s first request of industry.

To expand enforcement capacity beyond federal and state agencies while preventing litigation abuse, we propose a multi-layered scheme of public and private enforcement: empower non-financially motivated private attorneys general to bring class action lawsuits, allow injured individuals to obtain redress from companies, and create a direct private right of action for monetary damages against companies that flout the law.

First, to supplement FTC and state attorney general enforcement, Congress should take a page from the GDPR playbook and allow privacy-focused non-profits to bring class actions seeking injunctive relief on behalf of aggrieved users. Limiting class actions to non-profits seeking injunctive relief forecloses the possibility of financially motivated lawsuits and nuisance-value settlements while increasing the number of entities with privacy expertise available to enforce the law. We would also support giving the FTC the authority to levy fines in the first instance, allowing for financial penalties against companies subject to injunctions arising from non-profit lawsuits.

Second, we recognize that some privacy harms are too individualized and small for class enforcement, so we propose allowing individual users to bring direct claims against companies for violations where injunctive relief or specific performance largely rectifies the harm. For example, most comprehensive privacy proposals give users the right to request deletion of their personal information. If a company refuses to comply with such a request, it’s unlikely that the FTC or a non-profit will bring a lawsuit to force the company to comply with a single ignored deletion request. Without a private right of action in this circumstance, a user will have no recourse unless and until the company ignores enough user deletion requests to draw regulator scrutiny. Here, the appropriate remedy would be an order forcing the company to comply with the deletion request. Since simply responding to such a lawsuit would cost a company far more than following through with the request, these types of lawsuits are unlikely to be prevalent.

Third, anyone injured by a violation of a previously issued injunction mandating compliance with a comprehensive federal consumer privacy law should have the right to bring a lawsuit, individually or as part of a class, for monetary damages. Basically, if a company violates the law, gets hit with an injunction, and continues to commit the same violations, aggrieved users should be able to sue. Critically, while injured users should be able to seek actual economic damages, we also propose allowing users to obtain monetary damages for intangible injuries if the injunction at issue establishes a damages schedule for future violations. Giving the FTC or non-profit litigants the ability to seek injunctions that specify a form of liquidated damages for future violations deters those violations and creates far more flexibility in appropriately compensating users for intangible privacy harms than a fixed statutory damages calculation would afford. The FTC could determine that a narrow, technical violation is too minor to warrant a high damages award for future violations and tailor the consent decree to reflect the seriousness of the offense and the desired level of deterrence.

This system would satisfy the concerns of privacy advocates and industry alike. It ensures that enforcement power isn’t vested solely in an overwhelmed agency and allows individuals to hold companies directly accountable, while building checkpoints into the system that remove the financial incentive to bring meritless claims.

In the end, there’s a lot of common ground between consumer advocates and tech companies on what a federal comprehensive consumer privacy law should look like, but the window for reaching agreement is closing fast. To move beyond the stalemate over how such a law should be enforced, we need to learn the lessons of other areas of the law and avoid creating a wave of bad-faith litigation that would disproportionately hurt smaller platforms and cement the power of the largest companies.
