Many Think Internet Privacy Is Lost, But That's Because You Can't Sue Anyone Who Violates It
from the trust,-privacy,-and-liability dept
Over 90% of Americans feel like they have no control over their online privacy. It is not hard to understand why so many of us feel so powerless when it comes to using the Internet, nor is the solution to such a pervasive feeling all that complicated.
It just boils down to rules and liability—or, in other words, making sure that if a company violates your privacy under the law, there is an inescapable penalty. The clearer and more direct the path to holding a company accountable for violating your privacy—much like the paths for protecting your physical health, your property rights, your emotional wellbeing, and other interests the law protects—the more confidence will return to the Internet marketplace.
But today's American consumer privacy legal system provides no such clear, enforceable rights for the vast majority of Internet privacy-related activity. In fact, when the next Google or Facebook privacy scandal rolls around, think back to the last one—likely just a few months old—and ask how much the company paid in damages and whether it had to compensate the individual people whose privacy was violated.
In many cases, the answer will be that there was no penalty at all, which feeds users' sense of powerlessness. But the fact that companies often pay no penalty, and the fact that we do not have laws in place to remedy these privacy harms, are the result of choices we have made. They are not the natural order of things, and they are not inevitable.
We have, as a society, made decisions under our intellectual property laws to allow absolutely no liability in certain situations in order to promote another value that has nothing to do with profit: our freedom of expression. For example, criticizing a film on YouTube while playing portions of it in the background is considered a fair use. This means that, despite copyright holders having exclusive rights over the public performance of their works, we have decided to extinguish liability when the use is an expression of criticism.
In the absence of fair use, both the critic using the film and YouTube would be directly liable for substantial damages for playing portions of it. Instead, we counterbalance and limit the economic rights of the filmmaker in order to promote free speech values through fair use. In essence, we keep a liability-free zone for criticism, and that is generally seen as a net positive for users. It also promotes the creation of open platforms, allowing those speakers to discover audiences and build engagement.
But in consumer privacy we have not seen anything like the same benefit returned to consumers in exchange for the mostly liability-free zone. There is no race to the top in guarding consumers' personal information, because the profit-maximizing strategy is not to strengthen our privacy but to tear it down as much as possible. This is why we keep getting these privacy scandals. There is no need to bring morality into the analysis, as often happens when people observe corporate behavior; the simple question is whether profit maximization (which corporations have to pursue under the law) is being countered by law in a way that reflects our expectations.
When we look at the problem of consumer privacy from this angle, it becomes fairly clear that private rights of action for consumer personal privacy would be transformative. No longer would a corporation view experiments with handling personal information as a generally risk-free, profit-making proposition if financial damages and a loss of profit were on the line.
Industry Wants a Consumer Privacy Law—Just So Long As You Can’t Sue Them
The long road of industry opposition—and the extreme hypocrisy of now pretending to endorse passage of a comprehensive consumer privacy law—is worth reflecting on in order to understand why in fact we have no law today.
If we go back a little over a decade to a privacy scandal that launched a series of congressional hearings, we find a little company called NebuAd that specialized in deep packet inspection.
NebuAd’s premise was scary: it proposed to let your ISP record everything you do online and then monetize that data with advertisers. I was a staffer on Capitol Hill when NebuAd came to Congress to explain its product, and I still remember the general shock at the idea being proposed. In fact, the idea was so offensive that it garnered bipartisan opposition from House leaders and ultimately led to the demise of NebuAd.
The legislative hearings that followed the growing understanding of “deep packet inspection” led to discussions of a comprehensive privacy bill in 2009. But despite widespread concern about industry practices as the technology evolved, we never got anywhere, out of concern for the freemium model of Internet products. It is hard to remember this time, but back then the Internet industry was still a fairly new thing to the public and to Congress.
The iPhone had launched just two years earlier, and the public was still transitioning from flip phones to smartphones. Facebook had become available to the general public only three years prior. Google had only a small handful of vertical products, the newest being Google Voice—which allowed people to text for free at a time when carriers charged for every text you sent.
All of these things were seen as net positives for users, yet all hinged on the monetization of personal information being relatively liability free. So for years policymakers searched for a means to balance privacy with innovation, including an all-out effort by the White House in 2012. The companies generally known as “big tech” today were still very sympathetic entities, because the innovations they kept producing were seen as both novel and useful to people. The White House therefore actively solicited their involvement in jointly drafting a framework that would promote privacy while allowing the industry to flourish.
Ultimately, it was a wasted effort, because what industry actually wanted was the liability-free zone baked into law, with little regard for the increasing degradation of user privacy. At the time, most Internet companies still competed with one another, which forced them to try to be more attractive to users with stronger privacy settings. Even Google Search was facing a direct assault from Microsoft’s fairly new Bing product.
As efforts to figure out a privacy regime for Internet applications and services were being stalled by the Internet companies, progress was being made with the substantially more mature and already regulated Internet Service Provider (ISP) industry.
Congress had already passed a set of privacy laws for communications companies under Section 222 of the Communications Act, so a great many ISPs, being former telephone companies, had a comprehensive set of privacy laws applicable to them (including private rights of action). But their transition into broadband companies began to muddy the waters, particularly as the Federal Communications Commission started to say in 2005 that broadband was magically different and therefore should be quasi-regulated.
Having learned nothing from the NebuAd fiasco, which nearly got “deep packet inspection” banned for ISPs outright, the broadband industry kept rolling out other privacy-invasive ideas. “Search hijacking”—where your search queries were monitored and rerouted—became common. AT&T began forcibly injecting ads into WiFi hotspots at airports, wireless ISPs preinstalled “Carrier IQ” on phones to track everything you did (a practice that ended when people sued them directly in a class action lawsuit), and Verizon invented the “super-cookie,” prompting a privacy enforcement response from the FCC in 2014.
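To make concrete why that last practice was so hard for users to escape, here is a minimal, hypothetical sketch (a generic Python WSGI app, not any actual company's code) of how any website receiving a subscriber's unencrypted HTTP traffic could read an ISP-injected identifier like Verizon's X-UIDH header and treat it as a tracking ID the user could never clear:

```python
# Minimal sketch, for illustration only: a bare WSGI application showing why
# an ISP-injected tracking header like Verizon's X-UIDH "super-cookie" was so
# invasive. Any site receiving the unencrypted request could read it, and no
# browser setting or cookie-clearing would remove it.
from wsgiref.simple_server import make_server

def app(environ, start_response):
    # WSGI exposes request headers as HTTP_* keys; an identifier injected
    # in transit by the ISP shows up here on every request.
    supercookie = environ.get("HTTP_X_UIDH")
    if supercookie:
        body = f"Persistent ISP identifier seen: {supercookie}\n"
    else:
        body = "No ISP tracking header on this request.\n"
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [body.encode("utf-8")]

if __name__ == "__main__":
    # Serve locally for demonstration; in practice the header only appeared
    # on traffic that crossed the carrier's network unencrypted.
    make_server("", 8000, app).serve_forever()
```

The point of the sketch is simply that the identifier rode along with every request across the open web, so the only real defenses were encryption the user did not control or legal liability that did not yet exist.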
Even after the FCC stopped treating broadband as uniquely different from other communications access technology in 2015, the industry continued to push the limits. In that same year, telecom carriers partnered with SAP Consumer Insight 365 to “ingest” data from 20 to 25 million mobile subscribers close to 300 times every day (we do not know which mobile telephone companies participated, as that information is kept secret). The data is used to inform retailers about customers’ browsing, geolocation, and demographics.
So unsurprisingly, the FCC came out with strong, clear ISP privacy rules in 2016 that continued the long tradition of privacy protections for our communication networks.
However, the heavily captured Congress, which had not taken a major pro-privacy vote on Internet policy in close to a decade, quickly took action to repeal the widely supported FCC privacy rules on behalf of AT&T and Comcast. Ironically, the FCC was only able to create ISP privacy rules because Congress had enacted a series of privacy laws, including private rights of action, for various aspects of our communications industry more than a decade prior.
While many of the leaders of the ISP privacy repeal effort claim to be foes of big tech, they have done next to nothing to move a consumer privacy law. In fact, all they did was solidify the capture of Congress by giving AT&T and Comcast a reason to team up with Google and Facebook in opposing real privacy reform.
EFF witnessed this joint industry opposition firsthand as we attempted to rectify the damage Congress did to broadband privacy with a state law in California. In fact, between the ISPs and big tech, not a single state passed a new privacy law in 2017 in response to Congress repealing the ISP privacy rules.
But despite industry’s arrogant belief that it could sustain perpetual capture at the legislative level, along came an individual named Alastair Mactaggart, who personally financed a ballot initiative on personal privacy that later became the California Consumer Privacy Act (CCPA).
While industry could “convince” a legislator of the righteousness of its cause with political contributions, it had no real means to convince individual voters that the status quo was good. After Cambridge Analytica, and after wireless carriers were caught selling geolocation data to a black market used by bounty hunters, virtually no one thinks this industry should be unregulated on privacy.
So rather than continue to publicly oppose real privacy protections, the industry has opted to pretend it supports a law, just so long as that law gets rid of state laws (including state private rights of action) and puts all our eggs in the basket of a captured regulator. In other words, they will only support a federal privacy law if it further erodes our personal privacy rather than enhancing it.
This opening offer from industry is a wild departure from other privacy statutes, which have all included an individual right to sue, whether they cover wiretaps, stored electronic communications, video rentals, driver’s licenses, credit reporting, or cable subscriptions. Not to miss their marching orders, industry-friendly legislators quickly put together a legislative hearing on consumer privacy that literally had no one representing consumers.
But this game, in which industry holds and finances enough legislators to prevent any real law from passing, will only last so long. After all, their effort to ban states from passing privacy laws was effectively dead once the Speaker of the House, who represents California, made it clear she would not undermine her own state’s law on behalf of industry.
Furthermore, Senator Cantwell, a leader on the Senate Commerce Committee, has introduced comprehensive legislation that includes a private right of action, and more than a dozen Senators, led by Senator Schatz, have endorsed the EFF-supported concept of creating an information fiduciary. As more and more legislators publicly make clear the parameters of what they consider a good law, it becomes harder for industry to sustain its behind-the-scenes opposition. But we are still far from the end, which means more has to be done in the states until enough of Congress can break free of the industry shell game.
If We Do Not Restore Trust in Internet Products, People Will Make Less Use of the Internet and That Comes with Serious Consequences
As we wrestle with containing COVID-19, a solution proposed by Apple and Google in the form of contact tracing is facing a serious hurdle. A majority of Americans do not want to use health data applications and services from these companies because they do not trust what the companies will do with their information. Since they cannot directly punish these companies for abusing their personal health data, they are exercising the only real choice they have left: not using them at all.
Numerous studies from federal agencies such as the Department of Commerce, the Federal Trade Commission, and the FCC all point to the same end result if we do not put real privacy protections in place for Internet activity: people will simply refrain from using applications and services that involve sensitive matters such as healthcare or finances. In fact, lack of trust in how our personal information is handled has a detrimental impact on broadband adoption in general, meaning a growing number of people will not use the Internet at all in order to keep their personal information to themselves.
Given the systemic powerlessness users feel about their personal information when they use the Internet, the dampening effect that has on fully utilizing the Internet, and the resulting loss of broadband adoption, it is fairly conclusive that the near liability-free zone is an overall net negative as public policy. Congress should be working to actively give users back their control, instead of letting the companies with the worst privacy track records dictate users’ legal rights. Any new federal data privacy law must not preempt stronger state data privacy rules, and it must contain a private right of action.
While special tailoring has to be done for startups and new entrants with limited finances, to ensure they can enter the market under the same conditions under which Google and Facebook launched, no such accommodation is needed for big tech.
Once clear rules and lines of liability are established for major corporate entities, efforts to launch the next privacy-invasive technology will be scrutinized by corporate counsel eager to shield the company from legal trouble. That, ultimately, is the point of having a private right of action in law: not to flood companies with lawsuits, but to get companies to operate in a manner that avoids lawsuits.
As users come to understand that they have an inalienable legal right to privacy when they use the Internet, they will begin to trust products with greater and more sensitive uses that benefit them. This will open new lines of commerce as a growing number of users willingly engage in deeply personal interactions with next-generation applications and services. For all the complaints industry has about consumer privacy laws, the one thing it never takes into account is the importance of trust. Without it, we start to lose the full potential of what the 21st century Internet can bring.
Ernesto Falcon is Senior Legislative Counsel at the Electronic Frontier Foundation with a primary focus on intellectual property, open Internet issues, broadband access, and competition policy.
Filed Under: greenhouse, internet, liability, privacy, trust