California Bill Could Introduce A Constitutionally Questionable 'Right To Be Forgotten' In The US
from the well-meaning-but-poorly-thought-out dept
As we've pointed out concerning the General Data Protection Regulation (GDPR) in the EU, the thinking behind the regulation is certainly well-meaning and important. Giving end users more control over their own data and increasing privacy controls is, generally speaking, a good idea. However, the problem is in the drafting of the GDPR, which is done in a manner that will lead to widespread censorship. A key part of the problem is that when you think solely in terms of "privacy" or "data protection" you sometimes forget about speech rights. I have no issue with giving more control over actually private information to the individuals whose information is at stake. But the GDPR and other such efforts take a much more expansive view of what information can be controlled, including public information about a person. That's why we've been troubled by the GDPR codifying a "right to be forgotten." We've already seen how the RTBF is leading to censorship, and doing more of that is not a good idea.
But now the idea is spreading. Right here in California, Assemblymember Marc Levine has introduced a local version of the GDPR, which would create a California Data Protection Authority and includes two key components: a form of a right to be forgotten and a plan for regulations "to prohibit edge provider Internet Web sites from conducting potentially harmful experiments on nonconsenting users." If you're just looking from the outside, both of these might sound good on a first pass. Giving end users more control over their data? Sounds good. Preventing evil websites from conducting "potentially harmful experiments"? Uh, yeah, sounds good.
But, the reality is that both of these ideas, as written, seem incredibly broad and could create all sorts of new problems. First, on the right to be forgotten aspect, the language is painfully vague:
It is the intent of the Legislature to ensure that personal information can be removed from the database of an edge provider, defined as any individual or entity in California that provides any content, application, or service over the Internet, and any individual or entity in California that provides a device used for accessing any content, application, or service over the Internet, when a user chooses not to continue to be a customer of that edge provider.
Any content? Any application? At least the bill does confine "personal information" to a limited category of topics, so we're not just talking about "embarrassing" information, à la the EU's interpretation of the right to be forgotten. But "personal information" is still somewhat vague. It does include "medical information," which is further defined as "any individually identifiable information, in electronic or physical form, regarding the individual’s medical history or medical treatment or diagnosis by a health care professional." So, would that mean that if we wrote about SF Giants pitcher Madison Bumgarner, and the fact that his broken pinky required pins and he won't be able to pitch for a few weeks... we'd be required to take that information down if he requested it? That seems like a pretty serious First Amendment problem.
This is the problem with writing broad legislation that doesn't take into account the reality that sometimes this kind of information is made public for perfectly good reasons.
Similarly, consider the prohibition on "potentially harmful experiments." How does one define "potentially harmful"? Websites are in a never-ending state of experimentation. That's how they work. Everyone gets a different view on sites like Amazon and Netflix and Facebook and Google, because they're all trying to customize how they look for you. Is that "potentially harmful"? Maybe? It's also potentially very, very helpful. Before just throwing out the ability of websites to try to build better products, it seems like we should have a much deeper exploration of the issue than simply declaring that nothing "potentially harmful" is allowed. Because almost anything can be "potentially harmful."
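To make concrete what this routine "experimentation" usually looks like in practice, here is a minimal, purely hypothetical Python sketch of the kind of A/B test the bill's broad language would appear to cover: a site deterministically buckets each visitor into one of two layouts and shows them that variant. The function, variant names, and experiment name are all invented for illustration; nothing here is drawn from the bill or from any particular site.

import hashlib

# Two page layouts under comparison; the names are made up for this sketch.
VARIANTS = ["current_layout", "experimental_layout"]

def assign_variant(user_id: str, experiment_name: str = "homepage_test") -> str:
    """Deterministically bucket a visitor into one layout variant."""
    digest = hashlib.sha256(f"{experiment_name}:{user_id}".encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

# The same visitor gets the same variant on every visit, so the site can
# later compare engagement between the two groups.
print(assign_variant("reader-12345"))

Under a "potentially harmful" standard with no further definition, it is genuinely unclear whether even this sort of benign layout test would be permitted without explicit consent.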
Again, I'm quite sure that Levine's intentions here are perfectly good. There are very good reasons (obviously!) why so many people are concerned about the data that companies like Facebook, Google, Amazon and others are collecting on people. And these are big companies with a lot of power. But these rules seem vague and "potentially harmful" themselves. Beyond blocking perfectly natural "experimenting" in terms of how websites are run, these rules won't just impact those giants, but every website, including small ones like, say, this blog. Can we experiment in how we display our information? Or is that "potentially harmful" in that it might upset some of our regulars? That may sound silly, but under this law, it's not at all clear what is meant by "potentially harmful."
There are important discussions to be had about protecting individuals' privacy, and about experiments done by large companies with lots of data. But the approach in this bill seems to just rush into the fray without bothering to consider the actual consequences of these kinds of broad regulations.
Filed Under: california, gdpr, privacy, right to be forgotten
Reader Comments
(I'll bet that the former gets a specific exemption written into the law, but the latter does not.)
I don't know how daycare centers and others do background checks on the mental health of prospective employees, but I'll bet that in at least some cases private databases are involved. Now with handy web-based lookups.
Re:
After a certain number of years, a credit reporting company is actually required to remove negative information. For example, they can only keep a bankruptcy on your report for 10 years.
Winston Churchill just rolled over in his grave
Re: Winston Churchill just rolled over in his grave
And that's being optimistic. The potential for misuse of this law has me more worried about a variation of the saying:
"Those who would repeat the past must control the teaching of history."
- Frank Herbert, Dune
"Right to be forgotten" laws are how you would do it.
Re: Winston Churchill just rolled over in his grave
Isn't that the way it always works?
Someone does something awful in a small village and it becomes public; said person gets his/her punishment and lives on normally. After a few decades that knowledge is still held by some of the people, but newcomers and some kids don't know what happened. They then ask the older ones about that first someone and are told the story about the awful thing that person did, so the knowledge keeps being relived somehow.
The question: how is this different from searching for a name on the internet? Are we going to hit people on the head to make them forget some stupid things someone did just because it might soil their reputation?
These right to be forgotten laws are trying to solve a more fundamental social/human issue. We tend not to believe people can change and redeem themselves. We tend to attach guilt for life and punish them for life. See how people who spent some time in jail simply can't get decent jobs after they are released back into society. People make mistakes and get punished for them. We should only punish them again if they make other mistakes. Maybe then you won't need to erase your past, but keep it there as a reminder of how you improved.
Potentially harmful?
If you want an example of how they'd probably interpret that, just look at California's Prop 65 cancer warnings, now extending to things like coffee.
Matter of Public Significance
“None of these opinions directly controls this case; however, all suggest strongly that if a newspaper lawfully obtains truthful information about a matter of public significance then state officials may not constitutionally punish publication of the information, absent a need to further a state interest of the highest order.”
——Smith v. Daily Mail (1979)
“Our refusal to construe the issue presented more broadly is consistent with this Court's repeated refusal to answer categorically whether truthful publication may ever be punished consistent with the First Amendment.”
——Bartnicki v. Vopper (2001)
A few questions…
This isn't the Right to be Forgotten TD's been complaining about
The censorship-promoting Right to Be Forgotten that TD's been (quite correctly) complaining about for a couple years is entirely different.
I'm not saying that this isn't a bad law (it's certainly sloppily written), but this simply isn't the same RTBF that's been causing all the issues with web search and press censorship. I think it's important to keep the two separate, as they refer to very different bodies of law.
Fun consequences
Working from the bit quoted, it reads like you could also demand that any entity which received "personal information" cease storing it (regardless of how it was obtained), even for legitimate business purposes, so long as those purposes involve using the Internet for transmission. Critically, the quoted piece does not permit the edge provider to claim an exemption based on not publicly posting the information, so you can demand that they discard all record of the relevant information. This could be a major blow to the pending debacle of Internet-accessible Electronic Health Records.
Look - it's no fun parading around showing off your new ill-gotten gains when everyone knows you stole them. How can you lord it over the minions? The angst!
What are the good reasons? This debate is going to heat up quickly. I have tended to see TD as being on the right side of history on most issues in the past.
But when it comes to surveillance capitalism and the infiltration of corporate America (and the government, via the Patriot Act, etc.) into our homes and lives through their increasingly aggressive data... exfiltration practices, I am not sure that I agree. My perception is that TD tends to reflexively side with the big Silicon Valley players' interests and not properly consider the legitimate opposing concerns.
TD should consider doing a series of interviews with prominent thought leaders in this area to show the range of views. This would help to cut through some of the noise and disinformation and foster a truly informed, evidence-based debate. It could include the more libertarian perspectives but also the views of privacy proponents and FOSS enthusiasts. Maybe start with Richard Stallman and Edward Snowden.
Prominent Thought Leaders [was Re: ]
“Prominent thought leaders” is a phrase which lends itself well to ridicule.
You got names?