Can You Build A Privacy Law That Doesn't Create Privacy Trolls?
from the finding-the-balance dept
It’s safe to assume that most people recognize patent trolling as a problem, one that arises from our uniquely inefficient legal system. If we were given the opportunity to redesign patent litigation rules from scratch, why would anyone intentionally create a system so costly and uncertain that bad faith actors have a financial incentive to file weak lawsuits against cash-strapped startups that are better off paying extortive settlements than actually winning the case? And yet, some privacy advocates are openly calling for policymakers to replicate this disastrous state of affairs through a new regime of private enforcement for proposed state and federal comprehensive consumer privacy laws.
In two recent posts on the Greenhouse, Joe Jerome and Ernesto Falcon outlined the debate about how to optimally enforce privacy laws, arguing that private rights of action in some form are necessary to make companies take consumer privacy seriously and claiming that industry’s preference for public agency enforcement reflects a desire to make privacy rules functionally toothless. While I disagree with their ultimate conclusion, I understand the skepticism of leaving enforcement up to chronically underfunded regulatory agencies with a subpar track record of curtailing privacy violations. I also agree with Jerome’s conclusion that a strict private vs. public enforcement dichotomy has created a “ridiculous impasse.” As an advocate for startups, I’m just the sort of industry representative positioned to answer his call for a more nuanced conversation around privacy enforcement:
If industry is serious about working toward clear privacy rules, business interests have two obligations: (1) they should offer up some new ideas to boost enforcement and address legitimate concerns about regulatory limitations and capture; and (2) they need to explain why private rights of action should be a non-starter in areas where businesses already are misbehaving.
I’ll address the second request first.
Simply put, comprehensive consumer privacy laws of the kind passed in recent years at the state level and currently under consideration at the federal level are uniquely susceptible to abusive private litigation. These laws feature all the hallmarks of legal regimes with trolling problems: damages wholly unrelated to any actual harm suffered, ambiguous liability standards that preclude early dismissal of weak claims, and high discovery costs that fall disproportionately on defendants.
Privacy harms are under-enforced in large part because, generally speaking, the U.S. legal system allows plaintiffs to recover only to the extent they’ve suffered economically quantifiable injuries. Privacy injuries, on the other hand, tend to be intangible. It may be morally reprehensible for a company to disseminate a user’s browsing history, but it’s unlikely to have caused the user any direct economic injury. Since people are usually somewhat economically rational, they don’t often file lawsuits that are likely to cost significantly more than they can recover. Consequently, the efficacy of any private enforcement regime for intangible privacy harms hinges on the availability of statutory damages. Indeed, the California Consumer Privacy Act and other proposed state laws offer statutory damages to plaintiffs: up to $750 per aggrieved user in data breach cases under the CCPA, regardless of whether those users suffered any economic harm at all. As anyone familiar with copyright law knows, high statutory damages awards make litigation incredibly lucrative for unscrupulous plaintiffs. And since privacy laws tend to support class litigation, those awards aggregate quickly: at $750 per user, a breach affecting a million users translates into $750 million in potential exposure. Private actions for privacy harms thus have the potential to be incredibly high-stakes, incentivizing plaintiffs to bring whatever claims they can, regardless of how substantively weak they may be.
Of course, the potential for massive judgments alone doesn’t create a trolling problem. If a plaintiff bringing a meritless claim faces possible sanctions or high litigation costs, the expected value of bringing a weak lawsuit decreases alongside its likelihood of success. But under the American legal system, each party has to pay its own litigation costs, and sanctions for bad behavior are vanishingly rare. As such, a defendant is better off paying a settlement of any value lower than the cost of defense, even if the lawsuit is effectively meritless.
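To make that asymmetry concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it is a hypothetical assumption (the only number taken from this post is the CCPA’s $750-per-user statutory damages ceiling): even a claim with a tiny chance of success can carry a positive expected value for a plaintiff’s firm once statutory damages are aggregated across a class, while the defendant’s rational settlement ceiling is set by its cost of defense rather than by the claim’s merit.

```python
# Back-of-the-envelope model of nuisance-suit economics. All probabilities and
# dollar figures below are hypothetical illustrations, not data from the post.

def plaintiff_expected_value(win_probability: float,
                             statutory_damages_per_user: float,
                             class_size: int,
                             litigation_cost: float) -> float:
    """Expected payoff of filing, assuming each side bears its own costs
    (the 'American rule') and sanctions are effectively never imposed."""
    potential_award = statutory_damages_per_user * class_size
    return win_probability * potential_award - litigation_cost


def defendant_settlement_ceiling(expected_defense_cost: float) -> float:
    """A rational defendant will pay any settlement below its expected cost
    of defense, regardless of the claim's merit."""
    return expected_defense_cost


if __name__ == "__main__":
    # Hypothetical weak claim: 2% chance of success, $750 per user (the CCPA
    # data-breach figure), a 100,000-member class, $50,000 to litigate.
    ev = plaintiff_expected_value(0.02, 750.0, 100_000, 50_000.0)
    ceiling = defendant_settlement_ceiling(500_000.0)  # assumed defense cost

    print(f"Plaintiff's expected value of filing:    ${ev:,.0f}")       # $1,450,000
    print(f"Defendant's rational settlement ceiling: ${ceiling:,.0f}")  # $500,000
```

Under those assumed numbers, a claim with only a 2% chance of success is still well worth filing, and the defendant is better off writing a six-figure settlement check than litigating to a win it would almost certainly get.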
Fortunately for plaintiffs’ attorneys, privacy litigation is likely to be incredibly expensive for defendants, making extortive settlements a lucrative business model. New comprehensive consumer privacy laws are, as the name suggests, expansive. In order to cover as many business models, technologies, and unforeseen situations as possible, these laws are typically written with very general liability standards. Under the CCPA, for example, a defendant company is liable for a data breach if it failed to implement “reasonable security procedures.” What sorts of security practices are reasonable? The law doesn’t provide a definition, so the only way a company can ever know whether its security practices are legally reasonable is to have a judge or jury so declare at the end of a lawsuit. Thus, even if a company has the best possible security measures in place but nevertheless suffers a data breach, it will have to spend massive sums of money to dispose of a lawsuit alleging that its security was insufficient. Vexatious plaintiffs will quickly figure this out and file lawsuits any time a company suffers a data breach, without regard to whether users suffered injury or whether the company did anything wrong.
The same pattern plays out again and again whenever there are high litigation costs and ambiguous liability standards that prevent early dismissal of bad-faith lawsuits. Stock-drop lawsuits, patent trolling, TCPA abuse, FCRA litigation, the list goes on and on. Given the expansive scope of comprehensive privacy laws, a private right of action in this context could create a trolling problem that dwarfs anything we’ve seen before. Patent trolling is limited in some sense by the number of patents available, whereas virtually any activity involving consumer information could spark a privacy trolling lawsuit. In the first year of GDPR, European data regulators received more than 200,000 complaints, nearly 65,000 of which involved data breaches. Under U.S. rules, enterprising plaintiffs’ attorneys would have every incentive to turn all of these into lawsuits, regardless of their merits.
There are all sorts of theoretical reasons why centralizing enforcement power in a well-functioning expert regulatory agency is the optimal way to effectively enforce privacy law. For one, privacy law involves highly technical issues beyond the expertise of most judges. Similarly, since privacy harms like data breaches typically impact a large, geographically dispersed population without direct access to the underlying facts, a central governmental regulator is better positioned to gather the information necessary to bring successful enforcement actions. But I take the point that leaving enforcement solely up to federal and state regulators is likely to result in some under-enforcement, even with the budget increases that virtually everyone supports.
Thankfully, designing an optimal privacy enforcement regime doesn’t come down to a binary choice between relying exclusively on underfunded (and potentially captured) central regulators and creating a free-for-all of dubious private litigation, which brings me to my response to Jerome’s first demand to industry.
To expand enforcement capacity beyond federal and state agencies while preventing litigation abuse, we propose a multi-layered scheme of public and private enforcement that will empower non-financially motivated private attorneys general to bring class action lawsuits, allow injured individuals to obtain redress from companies, and create a direct private right of action for monetary damages against companies that flout the law.
First, to supplement FTC and state attorney general enforcement, Congress should take a page from the GDPR playbook and allow privacy-focused non-profits to bring class actions seeking injunctive relief on behalf of aggrieved users. Limiting class actions to non-profits seeking injunctive relief forecloses the possibility of financially motivated lawsuits and nuisance-value settlements while increasing the number of entities with privacy expertise available to enforce the law. We would also support giving the FTC the authority to levy fines in the first instance, allowing for financial penalties against companies subject to injunctions arising from non-profit lawsuits.
Second, we recognize that some privacy harms are too individualized and small for class enforcement, so we propose allowing individual users to bring direct claims against companies for violations where injunctive relief / specific performance largely rectifies the harm. For example, most comprehensive privacy proposals give users the right to request deletion of their personal information. If a company refuses to comply with such a request, it’s unlikely that the FTC or a non-profit will bring a lawsuit to force the company to comply with a single ignored deletion request. Without a private right of action in this circumstance, a user will have no recourse unless and until the company ignores enough user deletion requests to draw regulator scrutiny. In this case, the appropriate remedy would be an order forcing the company to comply with the deletion request. Simply responding to a lawsuit to enforce a user deletion request would cost a company far more than following through with the request, so these types of lawsuits are unlikely to be prevalent.
Third, anyone injured by violation of a previously issued injunction mandating compliance with a comprehensive federal consumer privacy law should have the right to bring a lawsuit—individually or as a part of a class—for monetary damages. Basically, if a company violates the law, gets hit with an injunction and continues to commit the same violations, aggrieved users should be able to sue. Critically, while injured users should be able to seek actual economic damages, we also propose allowing users to obtain monetary damages for intangible injuries if the injunction at issue establishes a damages schedule for future violations. Giving the FTC or non-profit litigants the ability to seek injunctions that specify a form of liquidated damages for future violations of the injunction deters future violations and creates far more flexibility in appropriately compensating users for intangible privacy harms than would be afforded through a fixed statutory damages calculation. The FTC could determine that a narrow, technical violation is too minor to warrant a high damages award for future violations and tailor the consent decree to reflect the seriousness of the offense and desired level of deterrence.
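As a purely illustrative sketch (the post doesn’t prescribe any particular figures; the violation categories and dollar amounts below are invented assumptions), a consent decree’s damages schedule could be as simple as a lookup from violation category to a per-user figure, tailored to severity:

```python
# Hypothetical liquidated-damages schedule attached to a consent decree.
# Violation categories and dollar amounts are invented for illustration only.
DAMAGES_SCHEDULE = {
    "technical_notice_defect": 5.00,     # narrow, minor violation: low deterrence needed
    "ignored_deletion_request": 100.00,  # individualized harm, moderate deterrence
    "repeat_data_breach": 750.00,        # mirrors the CCPA's statutory ceiling
}

def damages_for_repeat_violation(category: str, affected_users: int) -> float:
    """Damages owed for violating the injunction again, per the schedule."""
    return DAMAGES_SCHEDULE[category] * affected_users

# e.g., a repeat breach affecting 10,000 users would owe $7,500,000
print(damages_for_repeat_violation("repeat_data_breach", 10_000))
```

The point of the sketch is only that, unlike a one-size-fits-all statutory figure, the number attached to each category can be set, and later revisited, to match the seriousness of the underlying violation.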
This system would satisfy the concerns of privacy advocates and industry alike. It ensures that enforcement power isn’t vested solely in an overwhelmed agency, allows individuals to hold companies directly accountable, and prevents abusive litigation by building checkpoints into the system that eliminate the financial incentive to bring meritless claims.
In the end, there’s a lot of common ground between consumer advocates and tech companies on what a federal comprehensive consumer privacy law should look like, but the window for reaching agreement is closing fast. To move beyond the stalemate over how such a law should be enforced, we need to learn from the lessons of other areas of the law and avoid creating a wave of bad faith litigation that would disproportionately hurt smaller platforms and cement the power of the larger companies.
Filed Under: frivolous lawsuits, ftc, innovation, privacy, privacy trolls, private right of action, startups, torts
Reader Comments
Always had this thought..
That going to court COSTS MONEY.
A regular person can't go to court on his own education and get much of anything done.
Part of the system has gotten a bit stupid, in that everything must cost money.
Then there is the idea of fighting the state, and almost every judge must be on the state's side, not unbiased. There have been tons of people who have gone to state courts, been denied, and then above that or on appeal finally gotten the proper judgments. Or even the reverse, as a corp goes up the ladder after the state decides against it and gets a judgment reversed.
There are parts in this that could stop some of this, but for some odd reason it costs MORE to pursue than to just take it in the shorts. The idea of the loser paying the winner's fees, especially when it's the consumer getting it in the shorts. Or at the very least that the citizen, the winner, can then fight back to get things straightened out, FULLY. Not a slap on the wrist for the corps we are fighting.
There are some laws about false accusations for the corps, but when it comes down to the citizens we have little to no recourse that wouldn't cost us MORE or take so much time out of our lives to process that the monetary damages wouldn't even cover the time we've spent, nor the legal representation we now pay for.
A Cycle
In my experience, which touches on TCPA abuse, vexatious troll lawyers try to grab money from the lowest-hanging fruit. That causes an industry response, which forces the troll lawyers to work harder, and they start presenting more bizarre legal theories in more desperate court cases. Certainly, there is an up-front litigation cost.
But, if the industry dots its I's and crosses its T's, and the courts shoot down the crazy legal theories and impose court costs, the troll lawyers go away. Unfortunately, sometimes a bad court ruling opens a window for a flood of new litigious lawyers. But if, after a few rounds of troll litigation, the claimants lose so badly that they end up paying the court costs, then the troll industry goes away. The cost to industry shifts from litigation to ongoing compliance.
Possibly we could have a private enforcement mechanism IF courts and legislators are responsive enough. But how can you ensure that there is a speedy resolution when problems emerge? Can there be an ongoing legislative committee ready to get involved in disputes and update the law to clearly and properly define technical terms? Require some kind of technical competency of judges before they make a decision? I don't have a good answer, but any system is probably only as good as the people that make up the system. I'm willing to give the private enforcement system and statutory damages a shot, especially if we can somehow mandate refinement into the process. I've seen some adjacent TCPA businesses come close, even without much help from judges or lawmakers.
Disbursement
I would add a caveat for those class actions, whoever is prosecuting them: that the attorneys receive minimal compensation and that the class members receive the vast majority of any awards. This should not be yet another vector for the lawyers to win much more than the harmed.
This part of the enforcement framework seems promising:
First violation -> injunction / specific performance with penalties for second violation to be determined by severity of first violation. Second violation -> enforce previously determined penalties.
I am not so sure about this. Unfortunately, non-profit does not necessarily mean non-financially motivated. A lot of non-profits have many very highly paid administrators and employees, all of whom are highly motivated to maintain an income stream to support their salaries, expenses, perks, pensions, etc.
And, while this enforcement framework seems to be headed in the right direction, the devil is always in the details, which in this case might be the actual privacy law itself. Who is covered or exempted? What is covered or exempted? Can you be required to waive some or all rights to use a product or service? Those are three very simple questions that do not have any simple answers. But they are topics for future discussions.
Honestly, I could go for a decade of potential privacy trolling litigation before the laws are reformed reasonably to cover whatever language was abused. (We really need more evidence-based legislation subject to evidence-based review after a number of years, period.)
There are very good points and ideas in this article, and in the other privacy-related articles I've read in the Greenhouse. And while I am by no means a member of the "do something" (i.e., anything that passes, even if stupid and counterproductive) crowd, I am honestly not so worried about these companies, startups or incumbents, with their awful-from-the-get-go business models. Anything that goes wrong with initially well-thought-out legislation can be fixed in post. In fact, anyone advocating for any particular legislation should not stop once legislation has passed, but should keep advocating for changes that improve such legislation against bad consequences (foreseen and unforeseen) and increase the good consequences.
Obviously bad legislation should be stopped at the door.
Third Party Data
In the interest of privacy, it should be made very clear that data collected by third parties belongs to the individual, not the third party, and that access to that data requires a warrant with sufficient specificity to name exactly the who, what, when, where, why, and how of what they are looking for. The purpose should be to confirm already-known information, rather than to find leads. No fishing expeditions by law enforcement.
Re:
The limitation to injunctive relief minimises that, since they can't get a giant cash award and spend it all on lawyer's fees.
[ link to this | view in thread ]
Re: A Cycle
"But, if the industry dots its I's and crosses its T's, and the courts shoot down the crazy legal theories and impose court costs, the troll lawyers go away. Unfortunately, sometimes a bad court ruling opens a window for a flood of news litigious lawyers."
In other words, if pigs had wings, and hell freezes over then troll lawyers will go away?
Here's the problem. The industry has no incentive at all to do better. Even if courts wanted to change the way they operate that's a fundamental legislative issue...and the legislators are, from both sides of the aisle, completely disinclined to change the status quo.
In fact it's the other way around, as Evan says in the OP; "...why would anyone intentionally create a system so costly and uncertain that bad faith actors have a financial incentive to file weak lawsuits against cash-strapped startups that are better off paying extortive settlements than actually winning the case?"
Drink that in for a while - the US patent system is either built explicitly to incentivize bad behavior or the bug in the original system has long since become a feature which both parties consider useful.
It's not hard to see why, either. It's a steady stream of campaign funding to any candidate up for election who agrees to not reform the obviously broken system.
"Reasonable" security
A simple way to resolve the vagueness problems would be to specify that certain types of information be treated according to the federal requirements for various grades of classified data.
Re: Disbursement
The only problem I can see with that is that it limits the pool of attorneys to those willing to take the case for minimal compensation. I don't know that much about the profession, but I guess that would be civic-minded or non-profit lawyers (EFF, ACLU, and the like) and lawyers who aren't able to charge more, whether due to skill, experience, connections, or what have you. Hiring a high-priced, top-notch private attorney to represent the class would not be an option. Maybe that would be a problem and maybe not, but I'm not sure it would be worth it.
I can't say I have a magic solution to this, but in general, it simply has to come in the form of making it unprofitable to out-litigate the other party, so that litigation is affordable. Attorneys' fees have gone out of sight. My niece graduated from law school a year ago. She got $500.00/hr as her starting rate. With the coronavirus keeping people working remotely, she's getting about $14k/month to not do a whole lot. I'm thrilled that she's done so well and very proud of her. She's really smart and is a hard worker, but seriously, how can most people afford to pay at least this much for someone who has actually litigated cases?