from the enforcement-matters dept
Privacy laws can have a lot of moving pieces: notices and
disclosures, opt-in and opt-out consent requirements, privacy
defaults and user controls. Over the past few years, there has been
significant
progress
on these issues because privacy advocates, consumer groups, industry
voices, and even lawmakers have been willing to dive into
definitional weeds, put options on the table, and find middle ground.
But this sort of thoughtful debate has not happened when it comes to
how privacy laws should be enforced and what should happen when
companies screw up, families are hurt, and individuals’ privacy
is invaded.
Instead, when it comes to discussing
private rights of action and agency enforcement, rigid red lines have
been drawn. Consumer groups and privacy advocates say let
individuals sue in court -- and call
it a day. Business interests, when they talk about
“strong
enforcement,” often mean letting an underfunded
Federal Trade Commission and equally taxed state Attorneys General
handle everything. Unfortunately, this binary,
absolutist dispute over policing privacy rights
threatens to sink any progress on privacy legislation.
It happened in Washington state, which
failed to enact a comprehensive privacy framework in March because
of a single sentence that could have let some
consumers sue to enforce their rights under the state’s general
Consumer Protection Act. Private rights of action have stymied state
privacy task forces, and the issue is consuming efforts by the
Uniform
Law Commission to craft a model privacy bill. This is
but a microcosm of what we’ve seen at the federal level, where
lawmakers are at “loggerheads”
over private rights of action.
This impasse is ridiculous. Advocacy
groups share some blame here, but industry voices have failed to put any creativity into charting an alternative path forward. Company
after company
and trade
association after trade
association have come out in favor of privacy rules,
but the response to any concern about how to ensure those rules are
followed has been crickets. Few seem to have given much thought to
what enforcement could look like beyond driving a Brinks truck full
of money up to the FTC. That is not good enough. If industry is
serious about working toward clear privacy rules, business interests
have two obligations: (1) they should offer up some new ideas to
boost enforcement and address legitimate concerns about regulatory
limitations and capture; and (2) they need to explain why private
rights of action should be a non-starter in areas where businesses
are already misbehaving.
First, while we can acknowledge the
good work that the FTC (and state Attorneys General) has done, we
should also concede that agencies cannot address every privacy
problem and have competing consumer protection priorities.
Commentators laud
the FTC’s privacy work but have not explained how an FTC with more resources would do anything other than more of what it’s already doing. Those lingering concerns are animating efforts
to create an entirely
new federal privacy agency (and that’s on top of
a proposal in California to set up its own entirely new “Privacy
Protection Agency”). Improving the FTC’s privacy posture
will require more than more money and personnel.
Part of this will be creating
mechanisms that ensure individuals can get redress. One idea would be
to require the FTC to help facilitate complaint resolutions. The
Consumer Financial Protection Bureau already
does this to some extent with respect to financial
products and services. The CFPB welcomes consumer complaints -- and
then works with financial companies to get consumers a direct
response about problems. These complaints also help the CFPB identify
problems and prioritize work, and the CFPB then publishes (privacy-friendly) complaint data. This stands in contrast to the FTC’s
Consumer
Sentinel Network, which is a black box to the public.
Indeed, the FTC’s complaint
system is opaque even to complainants themselves. The black box
nature of the FTC is, fairly or not, a constant criticism by privacy
advocates. A group of advocates began the Trump administration by
calling
for more transparency from the Commission about how it
handles complaints and responds to public input. I can speak to this
issue, having submitted my
own complaint to the FTC about the privacy and
security practices of VPNs in 2017. Months later, the FTC put out a
brief
blog post on the issue, which I took to be the end of
the matter on their end. Some sort of dual-track informal and formal complaint process, like the Federal Communications Commission’s, could be one way to ensure the FTC better
communicates with outsiders raising privacy concerns.
These are mostly tweaks to FTC process,
however, and while they address some specific complaints about
privacy enforcement, they don’t address concerns that
regulators have been missing -- or avoiding -- some of the biggest
privacy problems we face. This is where the rigid opposition to
private rights of action and failure to acknowledge the larger
concern is so frustrating.
Sensitive data types present a good
example. Unrestrained collection and use of biometrics
and geolocation
data have become two of the biggest privacy fights of
the moment. There has been a shocking lack of transparency or
corporate accountability around how companies collect and use this
information. Their use could be the key to combating the ongoing
pandemic; their misuse a tool for discrimination, embarrassment, and
surveillance. If ever there were data practices where more oversight
is needed, it would be these.
Yet, the rapid creep of facial
recognition gives us a real-world test case for how agency
enforcement can be lacking. Companies have been calling for discussions about responsible deployment of facial recognition even as they pitch this technology to every school, hospital, and retailer in the world; meanwhile, Clearview AI just up and ignored existing FTC
guidance and state law. Washington state has an
existing biometric privacy law, which the state Attorney General
admitted has never been the basis of an enforcement action. To my
knowledge, the Texas Attorney General also has never brought a case
under that state’s law. Meanwhile, the Illinois Biometric
Information Privacy Act (BIPA) may
be the
one legal tool that can be used to go after companies
like Clearview.
BIPA’s private right of action
has been a recurring thorn in the sides of major social media
companies and theme parks rolling out biometric technologies, but no
one has really cogently argued that companies aren’t flagrantly
violating the law. Let’s not forget that facial recognition
settings were an
underappreciated part of the FTC’s most recent
settlement with Facebook, too. However, no one can actually discuss
how to tweak or modernize BIPA because industry groups have had a
single-minded focus on stripping the law of all its private
enforcement components.
Industry has acted in lockstep to
insist it is unfair for companies to be subject to limitless
liability by the omnipresent plaintiffs’ bar for every minor or
technical violation of the law. And that’s the rub!
There is no rule that says a private
right of action must encompass the entirety of a privacy law. One of
the compromises that led to the California Consumer Privacy Act was
the inclusion of a private right of action for certain unreasonable
data breaches. Lawmakers can take heed and go provision by provision, specifying exactly what sorts of activities could be subject to
private litigation, what the costs of the litigation might be, and
what remedies can ultimately be obtained.
The U.S. Chamber of Commerce has been
at the forefront of insisting
that private rights of action are poor tools for addressing privacy
issues, because they can “undermine appropriate agency
enforcement” and hamper the ability of “expert regulators
to shape and balance policy and protections.” But what, then, is the objection in areas where that’s not true?
The sharing and selling of geolocation
information has become especially pernicious, letting companies infer
sensitive health conditions and facilitating stalking. Can any
industry voice argue that companies have been well-behaved when it
comes to how they use location information? The FTC clearly stated in
2012 that precise geolocation data was sensitive information
warranting extra protections. Flash forward to 2018
and 2019,
when The New York Times published annual exposés
on the wild west of apps and services buying and selling “anonymous”
location data. Meanwhile, the Communications Act requires carriers to
protect geolocation data, and yet in February of this year the FCC fined all four major wireless carriers a combined $200 million for sharing their subscribers’ geolocation data with bounty hunters and stalkers.
Businesses do not need regulatory
clarity when it comes to location data -- companies need to be put in a penalty box for an extended timeout. Giving individuals the ability to seek private injunctive relief seems hardly objectionable given this
track record. Permitting class actions for intentional violations of
individuals’ geolocation privacy should be on the table, as
well.
There should be more to discuss than a
universe where trial attorneys sue every company for every privacy
violation or a world where lawmakers hand the FTC a blank check.
Unfortunately, no one has yet put forward a vision for what the
optimum level of privacy enforcement should be. Privacy researchers,
advocates, and vulnerable communities have forcefully said the status
quo is not sufficient. If industry claims it understands the
importance of protecting privacy but just needs more clarity about
what the rules are, companies should begin by putting forward some
plans for how they will help individuals, families, and communities
when they themselves fall short.
Joseph Jerome, CIPP/US, is a privacy and cybersecurity attorney based in Washington, D.C. He currently is the Director of Multistate Policy for Common Sense Media.
Filed Under: enforcement, ftc, privacy, privacy laws, private rights of action