Yes, Privacy Is Important, But California's New Privacy Bill Is An Unmitigated Disaster In The Making
from the not-how-to-do-it dept
We've talked a little about the rush job to pass a California privacy bill -- the California Consumer Privacy Act of 2018 (CCPA) -- and a little about how California's silly ballot initiative process forced this mad dash. But a few people have asked us about the law itself and whether or not it's any good. Indeed, some people have assumed that the fact that so many lobbyists are freaking out about the bill is actually a good sign. But that is not the case. The bill is a disaster, and it's unclear if the fixes expected over the next year and a half will be able to do much to improve it.
First, let's state the obvious: protecting our privacy is important. But that does not mean that any random "privacy regulation" will be good. In a future post, I'll discuss why "regulating privacy" is a difficult task to tackle without massive negative consequences. Hell, over in the EU, they spent years debating the GDPR, and it's still been a disaster that will have a huge negative impact for years to come. But in California they rushed through a massive bill in seven days. A big part of the problem is that people don't really know what "privacy" is. What exactly do we need to keep private? Some stuff may be obvious, but much of it actually depends quite heavily on context.
But the CCPA takes an insanely broad view of what counts as covered "personal info." Section 1798.140(o)(1) defines "personal information" to mean... almost anything:
“Personal information” means information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. Personal information includes, but is not limited to, the following:
(A) Identifiers such as a real name, alias, postal address, unique personal identifier, online identifier Internet Protocol address, email address, account name, social security number, driver’s license number, passport number, or other similar identifiers.
(B) Any categories of personal information described in subdivision (e) of Section 1798.80.
(C) Characteristics of protected classifications under California or federal law.
(D) Commercial information, including records of personal property, products or services purchased, obtained, or considered, or other purchasing or consuming histories or tendencies.
(E) Biometric information.
(F) Internet or other electronic network activity information, including, but not limited to, browsing history, search history, and information regarding a consumer’s interaction with an Internet Web site, application, or advertisement.
(G) Geolocation data.
(H) Audio, electronic, visual, thermal, olfactory, or similar information.
(I) Professional or employment-related information.
(J) Education information, defined as information that is not publicly available personally identifiable information as defined in the Family Educational Rights and Privacy Act (20 U.S.C. section 1232g, 34 C.F.R. Part 99).
(K) Inferences drawn from any of the information identified in this subdivision to create a profile about a consumer reflecting the consumer’s preferences, characteristics, psychological trends, preferences, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.
So, first off, note that the definition covers information associated not just with an individual, but also with a "household." And, again, in the list above, there are plenty of items and situations where it makes total sense to treat that information as "private." But... there are other cases where that is not so obvious. Let's take "information regarding a consumer's interaction with an internet web site." Okay. Yes, you can see that there are reasonable privacy concerns about a company tracking everything you do on a website. But... that's also generally useful information for any website to have just to improve the user experience -- and basically every website has tended to do some form of user tracking. It's not a privacy violation -- it's just understanding how people use your website. So if I'm tracking how many people flow from the front page to an article page... suddenly that's information covered by this law. Perhaps the law is intended to target tracking people on other websites through beacons and such... but it appears to make no distinction between tracking people on your own website (a feature that's built into basically every webserver) and tracking them elsewhere.
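To make that concrete, here's a quick sketch (in Python, against the standard Apache/Nginx "combined" access log format, with a made-up example.com domain -- not anything we actually run) of exactly that kind of front-page-to-article counting. Even this bare-bones stat ends up handling IP addresses, which the definition above explicitly lists as "personal information":

import re
from collections import defaultdict

# Matches the default "combined" log format used by Apache and Nginx:
# client IP, timestamp, request line, status, size, referrer, user agent.
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<method>\S+) (?P<path>\S+)[^"]*" '
    r'\d+ \S+ "(?P<referrer>[^"]*)"'
)

def frontpage_to_article_readers(log_path, site="example.com"):
    """Count unique visitors who went from the front page to each article."""
    readers = defaultdict(set)
    with open(log_path) as log:
        for line in log:
            match = LOG_LINE.match(line)
            if not match:
                continue
            # A referrer ending in the bare domain means "came from the front page"...
            if match.group("referrer").rstrip("/").endswith(site):
                # ...and de-duplicating readers means handling their IP addresses.
                readers[match.group("path")].add(match.group("ip"))
    return {path: len(ips) for path, ips in readers.items()}

And that's before any analytics package is involved; this is just the raw access log that basically every webserver writes by default.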
Similarly, "preferences" is private information? Sure, on some sites and for some reasons that makes sense. But, I mean, we do things like let users set in their preferences whether or not they want to see ads on our site at all (in case you don't know, you can turn off ads on this site in the preferences, no questions asked). But... in order to make sure we don't show you ads, we kinda have to keep track of those preferences. Now, I think many of you will recognize that in removing ads, we're actually helping you protect your privacy. But under this law, we're now incentivized not to keep such preferences because doing so is now its own legal liability.
And that leads into the "costs" vs. the "benefits" of such a law. Again, let's be clear: many internet companies have been ridiculously lax in how they treat user data. That's a real problem that we don't, in any way, mean to diminish. But the costs of this law seem very, very, very likely to outweigh a fairly minimal set of benefits. On the benefits side: yes, a few companies that have abused your data will face some pretty hefty fines for continuing to do so. That's potentially good. But the costs are going to be massive. For this part, I'll borrow from Eric Goldman's analysis of the bill, which is well worth reading. It's long, but he outlines just some of the likely "costs":
How Much Will This Cost? (part 1) Regulated companies–i.e., virtually every business in California–will need to spend money on compliance, including building new processes to deal with the various consumer requests/demands. Adding up all of the expenditures across the California economy, how much will this cost our society? It’s not like these expenditures come from some magic pot of money; the costs will be indirectly passed to consumers. Are consumers getting a good deal for these required expenditures?
How Much Will This Cost? (part 2) Lengthy statutes seem like they are detailed enough to eliminate ambiguity, but it actually works in reverse. The longer the statutes, the more words for litigators to fight over. This law would give us 10,000 different bases for lawsuits. One of the current tussles between the initiative and the bill is whether there is a private right of action. Right now, the bill attempts to limit the private causes of action to certain data breaches. If the private right of action expands beyond that, SEND YOUR KIDS TO LAW SCHOOL.
How Much Will This Cost? (part 3) The bill would create a new “Consumer Privacy Fund,” funded by a 20% take on data breach enforcement awards, to offset enforcement costs and judiciary costs. Yay for the bill drafters recognizing the government administration costs of a major new law like this. Usually, bill drafters assume a new law’s enforcement costs can be buried in existing budgets, but here, the bill drafters are (likely correctly) gearing up for litigation fiestas. But exactly how much will these administration costs be, and will this new fund be sufficient or have we written a blank check from the government coffers to fund enforcement? Most likely, I expect the Consumer Privacy Fund will spur enforcement, i.e., enforcement actions will be brought to replenish the fund to ensure it has enough money to pay the enforcers’ salaries–a perpetual motion machine.
Let's dig into that first part, because it's important to remind people that this bill is not an "internet" privacy bill. It's an everyone privacy bill. More or less any business that does business in California is impacted (there are some limitations, but for a variety of reasons -- including vague terms in the drafting -- those limitations may be effectively non-existent). So now, in order to comply, any company, including (for example!) a small blog like ours, will have to go through a fairly onerous process to even attempt to be in compliance (though, as point two in Eric's list above shows, even then we'll likely have no idea if we really are).
An analysis by Lothar Determann breaks out some of what we'd need to do to comply, including setting up entirely new processes to handle data access requests: a system to verify the identity and authorization of individuals, ways of tracking requests, and a means of making sure that anyone who opts out of certain practices is not then offered a chance to opt back in. So... to use the example above, if someone sets a preference on our site not to see ads, but then makes a data privacy request that we not track that data, we would likely need to first verify that the person is who they claim to be and is actually the one making the request, and then we'd need to set up a system to (get this!) make sure we somehow track them well enough that they... um... can't "opt in" to request that we no longer show them ads again.
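In code, the paradox looks something like this (all names invented; this is just an illustration of the requirement as described, not anything we've built): honoring a verified "stop keeping my preference" request means keeping yet another record about that person, solely so we know not to offer them the choice again.

import hashlib

# A "suppression list" of people who made a verified privacy request.
# We store only a hash of the verified email, but it is still a record
# whose sole purpose is to keep recognizing that person.
_suppressed = set()

def _key(verified_email):
    return hashlib.sha256(verified_email.strip().lower().encode()).hexdigest()

def record_privacy_request(verified_email):
    # Called only after the identity-verification step the law requires.
    _suppressed.add(_key(verified_email))

def may_offer_ad_opt_out(verified_email):
    # Don't even show the "turn off ads" preference to someone who asked
    # us to stop keeping that preference.
    return _key(verified_email) not in _suppressed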
Think about that. In order to honor someone's request that we stop keeping their "no ads" preference "for privacy purposes," we'd have to set up a system to track them just to make sure they aren't even offered the option of opting out of ads again. It's mind-boggling. Also, this:
Consider alternative business models and web/mobile presences, including California-only sites and offerings, as suggested in Cal. Civ. Code §1798.135(b) and charges for formerly free services to address the complex and seemingly self-contradictory restrictions set forth in Cal. Civ. Code §1798.125 on a company's ability to impose service charges on California residents who object to alternate forms of data monetization.
Okay, so I'm all for businesses exploring alternative business models. It's what we talk about here all the time. But... requiring that by law? And, even requiring that we offer a special "California-only" site that we charge for?
I'm having difficulty seeing how that helps anyone's privacy. Instead, it seems like it's going to cost a ton... for limited, or even negative, benefit in many cases. Just trying to figure out what this would cost us would probably mean letting go of multiple writers and spending that money on lawyers instead.
And that leaves out the cost to innovation in general. Again, this is not to slight the fact that abusive data practices are a real problem. But, under this law, it looks like internet sites that want to do any customization for users at all -- especially data-driven customization -- are going to be in trouble. And sure, some customization is annoying or creepy. But an awful lot of it is actually pretty damn useful.
An even larger fear: this could completely cut off more interesting services and business models coming down the road that actually would serve to give end users more control over their own data.
Also, as with the GDPR, there are serious First Amendment questions surrounding the CCPA. A number of people have pointed out that the Supreme Court's ruling in Sorrell v. IMS Health suggests some pretty serious constitutional defects in the CCPA. In Sorrell, the Supreme Court struck down, as a violation of the First Amendment, a Vermont law that restricted the sale and use of records about doctors' prescription practices. It's possible that the CCPA faces very similar problems. In his analysis, professor Jeff Kosseff explains how the CCPA may run afoul of the Sorrell ruling:
CCPA is more expansive than the Vermont law in Sorrell, covering personal information across industries. Among its many requirements, CCPA requires companies to notify consumers of the sale of their personal information to third parties, and to opt out of the sale. However, CCPA exempts “third parties” from coverage if they agree in a contract to process the personal information only for the purposes specified by the company and do not sell the information. Although CCPA restricts a wider range of third-party activities than the Vermont statute, it still leaves the door open for some third parties to be excluded from the disclosure restrictions, provided that their contracts with companies are broadly written and they do not sell the data. For instance, imagine a contract that allows a data recipient to conduct a wide range of “analytics.” Because the recipient is not selling the data, the company might be able to disclose personal information to that recipient without honoring an opt-out request.
Under Sorrell, such distinctions might lead a court to conclude that CCPA imposes a content-based restriction on speech. Moreover, the findings and declarations section of the bill cites the revelations about Cambridge Analytica’s use of Facebook user data, and states “[a]s a result, our desire for privacy controls and transparency in data practices is heightened.” This could cause a court to conclude that the legislature was targeting a particular type of business arrangement when it passed CCPA.
I highly recommend reading the rest of Kosseff's analysis as well. He notes that he's generally in favor of many internet regulations -- he has advocated for cybersecurity regulations and didn't think amending CDA 230 would be that bad (I think he's wrong on that... but...). And yet even he found the CCPA so flawed that his initial reaction was that it could never become law:
[M]y initial reaction was that nothing this unclear, burdensome, and constitutionally problematic could ever become law.
He then goes on to detail 10 separate serious issues under the law -- and notes that those are just his top 10 concerns.
While it's nice to see lots of people suddenly interested in privacy issues these days, the mad dash to "deal" with privacy through poorly thought-out legislation, where the target and intent are unclear beyond "OHMYGOD INTERNET COMPANIES ARE DOING BAD STUFF!!!", isn't going to do any good at all. There is little in here that suggests real protection of privacy -- but plenty of costs that could significantly change the internet many of you know and love, in large part by making it nearly impossible for small businesses to operate online. And, frankly, ceding the internet to the largest providers, who can deal with this, doesn't exactly seem like a way to encourage smaller operations that actually are concerned about your privacy.
Filed Under: california, ccpa, free speech, gdpr, privacy, regulations
Reader Comments
Yes, Privacy Is Important, But GOOGLE vastly more important.
But we don't have to DO that, Google-boy! Those can be broken up until small enough to not have the power to sway regulators and legislators. It's easy.
Along with that is TAX THE HELL OUT OF THE RICH because they can ONLY do evil with money and power.
You are utterly an inside-the-box Establishment thinker. -- And since supposedly for social justice, why are your notions never actually for empowering individuals, only for The Rich and their corporations?
Wasn't very long ago that "libertarians", Democrats, and socialists were ALL pretty uniformly of the view that corporations were bad; NOW those supposedly for "social justice" rarely mention individual rights, only worry about corporations.
You have been lured and subsumed by The Establishment, Masnick. Not surprising since born into it and don't actually know anyone outside of Ivy League weenies who believe all is essentially right with the world and why can't the peasants see how wise and enlightened we are?
The problem isn't what "internet companies" do, but what "normal companies" can do with the stuff gathered by legal or illegal means.
I'm not sure how it is in the US, but nowadays you can't be sure what an unknown phone number (that is, one that identifies itself, but that you don't recognize) might be calling about.
Many are scams or spam calls; they call you repeatedly to sell you whatever shit. That's when they aren't trying to scam you by telling you that you owe them a debt, giving you no real documentation of it, and harassing you until you pay them.
Then there are those calls that (and I'm speculating here) are silent, but seem to be used to gather data, or set up a connection, or subscribe you to some paid service without your consent.
Still, I'm not sure if the regulation is good or not, but the ones breaking the fucking internet are those bad actors themselves, not the regulators.
Sure, a good approach would be fine, but to be honest, some practices and data gathering should be banned outright.
No opt-ins, no opt-outs and shit like that, because there will always be someone who finds a loophole to fuck people over with it.
PS: and no, anti-spam lists are the reactive type. That is, they act once a number has been identified as spam, and then the spammers switch to another one. They've got a ton of those.
In the end, the ones left unable to answer the phone, just in case, are the users.
foreclosures
Re: Yes, Privacy Is Important, But incomprehensible
I am afraid that I don't see how your commentary contributes to the discussion, but I think that perhaps you have made your point that you believe Google is somehow responsible for some ill that has befallen you? Please clarify.
Re: Yes, Privacy Is Important, But GOOGLE vastly more important.
Re:
It is entirely possible, in fact likely, for regulators to break it more than the bad actors did when their attempt to fix it through regulation is done hastily.
The thing to realize about bad actors, spammers, and especially scammers, is that law and regulation in and of itself is not a fix. Scammers are already participating in illegal activity - fraud. That it is illegal does not stop them.
Making more things illegal will not stop the activities of those who are already breaking the law. It may stop the activities of those who are not currently breaking any law. This latter bit may or may not be good, depending on what the regulated activity is - and if what is currently legal is beneficial to the public as a whole, then regulating it away is not going to be a good move.
The big thing about regulation is that it needs to be well thought out, it needs to avoid being overly broad, and it needs to be held up against rigorous criteria that ensure it will be effective, able to be implemented, and actually beneficial to the public.
If you are not certain if this regulation will do that, I'd recommend re-reading the article, reading the information linked to by the article, and possibly doing some independent research for arguments that are in favor of CCPA.
I can say that from what I've read of it so far, it definitely looks to me to be the kind of kneejerk regulation that will harm more than help - but that's just my take on it.
Re: foreclosures
Re: Yes, Privacy Is Important, But GOOGLE vastly more important.
You do not get it. Complex laws like this require a company to spend time and money on complying with the law. A large corporation can find the time and money to do so, while a company with very few people has to employ an extra person if there is full-time work involved, or add to the workload of the people it already has.
That is why complex laws and regulations give an advantage to large corporations: compliance costs them relatively little, while a one-man business now has to work an extra hour a day, or spend an hour less making money, just to cope with the extra administration.
The Internet: It wasn't broken, but they fixed it anyway
Mike, I don't think that you really understand the power of big data and machine learning.
It doesn't matter what your intent is regarding the collection of personal data.
All personal data can be repurposed and combined with other datasets and then combined with other datasets.
Before you know it you have a list of 200 million citizens with data points on every intimate aspect of their lives.
You have enough to rig elections or run pre-crime initiatives and potentially much, much worse.
I agree that a happy balance is required. Big data and machine learning have the potential to improve our lives in countless, immeasurable ways.
They should not be undermined. But, similarly, they are not free, and Trump and Brexit and a potential dystopian nightmare future are not a price I am willing to pay.
They don't need to break innovation and personalization. They just need to break on-selling and fundamental repurposing of personal data. This includes targeted ads.
ISeeOpportunity
This is also an opportunity for startups and others that can make products to fill the void.
Think tax software: before software-based tax packages, people cried in emotional pain every tax season. Since tax software arrived, others have stepped up with solutions, including the old-school tax preparers, who filled the void by branding themselves into retail stores every season.
In the process of coming up with those solutions, there will be plenty of opportunity for folks with well-meaning intentions to revamp parts of the legislation. Until then, this privacy bill will be a much-needed stopgap and a warning to companies that create the rules as they go and ignore their own policies, and it will create public awareness around this topic.
When enough people are up to speed, and the litany of lawsuits has worked through the court systems, something better can be crafted at the national level. It's not the end times, just opportunity and new space for new and established businesses to grow, just like with tax software.
Think of small businesses and the small-business applications that are available with or without consulting. Here's another add-on opportunity for those products: I see an opportunity for companies to fill the void and handle the paperwork nightmare through features added to small-business applications.
There's also an opportunity for groups like the E.F.F. to fill the void and become a brand, in the sense of a certification like you might see with Underwriters Laboratories (the UL listing) or Consumer Reports, who could rate compliance with privacy provisions. Just hope it's not Google or Facebook that creates this certification; they are the problem and should not be part of the solution.
Re:
Mike, I don't think that you really understand the power of big data and machine learning.
I think you're wrong about that, but I'm not sure it much matters.
All personal data can be repurposed and combined with other datasets and then combined with other datasets.
Yes. Sure.
Before you know it you have a list of 200 million citizens with data points on every intimate aspect of their lives.
Yes. And if this bill targeted THAT activity you'd have a point. It does not.
I agree that a happy balance is required. Big data and machine learning has the potential to improve our lives in countless, immeasurable ways.
Sure.
It should not be undermined. But, similarly, it is not free, and Trump and Brexit and a potential dystopian nightmare future is not a price I am willing to pay.
I'm... at a total loss as to what you think Trump and Brexit have to do with the CCPA?
They don't need to break innovation and personalization. They just need to break on-selling and fundamental repurposing of personal data. This includes targeted ads.
Only thing...
This seems to be a way to HIDE those in the corporations more than anything else...
Glad you finally picked a side.
Mike, you know enough about information theory to know that digital speech is a quantifiable construct. While the law chooses to use ridiculous levels of abstraction, they are hardly necessary in the current day and age.
"I'll discuss why "regulating privacy" is a difficult task to tackle without massive negative consequences."
Who says there shouldn't be massive negative consequences?
Privacy is not difficult to describe when it comes to digital communications. Digital speech is everything from the first bit that is flipped for the purpose of relating an idea to the last bit that is flipped for that purpose. Interception of those bits without consent is a violation of privacy. The only exception required is for the minimal amount of bits required to facilitate common carrier delivery. The only argument remaining is what that minimum amount of bits is. And that is really quite simple.
If what you're doing when you're switching customer datagrams devalues the data transmitted; and if privacy has value, then the removal of it constitutes such a devaluation; then it isn't improving service. Therefore it isn't "network management". Which is to say that all of the "value add" services that the ISPs have been using to screw people are not network management, because they were a violation of rights prior to the addition of any subsequent capability.
What you're looking at in CA is legislators trying to negotiate a price for their constituents' rights, after their donors are already balls deep in the federal and state constitutions' ass. There is no purpose served in trying to limit the damage. It has already been done.
This is really about playing hot potato to see who is going to pay the bill when the citizens demand their rights back. And that will probably end how it usually does: with one banker/would-be executive going to jail, and the rest of them revising their quarterly reports to account for half a trillion dollars' worth of vaporized book value. And that loss is going to manifest because they spent money building infrastructure that would have been completely unnecessary had they shown some basic human decency. The "negative effects" you're talking about have already happened. They just haven't been revealed to the market yet.
Certainly this subject needs to be brought before SCOTUS, and the only way that is going to happen is if there are some lawsuits. If CA has to be a bull in a china shop to create the lawsuits that get us there, then so be it.
Re:
Re: Glad you finally picked a side.
You weren't around for 2008, were you?
Redundancy
Re: Yes, Privacy Is Important, But GOOGLE vastly more important.
Show us on the doll where Google hurt you...
do as we say, not as we do ...
https://www.nraila.org/articles/20180707/california-massive-data-breach-and-significant-registration-problems-with-ca-doj-s-assault-weapon-registration-system
Re: Glad you finally picked a side.
If this is the legislation I think it is, my understanding is that the reason the California legislature even considered this bill at all - never mind passed it, never mind passed it so quickly - is because there's an impending "citizens' ballot initiative" referendum which would do pretty much all the bad things this bill does only worse.
What's more, if the initiative passes by referendum, it can only be repealed or modified by another referendum; the legislature can't amend it to fix problems which get identified later, either by analysis (such as this article covers) or in practice.
If the legislature passes this bill, and the initiative doesn't go on the ballot because it's seen as redundant (which I understand its sponsor has indicated he hopes for, because he only has so much money to fund the campaign), and problems are discovered later, in theory the legislature could then amend or repeal the problematic law.
But if the legislature doesn't pass something close enough to the initiative to satisfy its supporters, and the initiative winds up being adopted at the ballot box, and problems are discovered later, we're stuck - short of another multimillionaire funding another ballot initiative, and getting people to support it over the same forces that are pushing this one.
So people in the CA legislature who don't even like the idea at all have still worked to pass a bill like this one, because the alternative outcome is likely to be significantly worse.
Good article but...
Thanks for writing this article. I think it goes a long way to explaining some of the issues that CCPA will bring up. I would definitely encourage more articles like this in the future. I think I can say that all (most) of us here are fans of Techdirt and we would love to see how CCPA affects the site. Feel free to complain in great detail. People should know about the consequences of stuff like this.
(Now here comes the but)
With all of that in mind, this reaction sounds a little bit like cries that the sky is falling. This article doesn't get into very many specific problems that will arise. It also glosses over some of the benefits that will probably be very popular. Instead of bemoaning hypotheticals, wouldn't it be more productive to point to specific changes that could be made to improve the law? Taken straight from the bill, here is what it is trying to accomplish:
"(1) The right of Californians to know what personal information is being collected about them.
(2) The right of Californians to know whether their personal information is sold or disclosed and to whom.
(3) The right of Californians to say no to the sale of personal information.
(4) The right of Californians to access their personal information.
(5) The right of Californians to equal service and price, even if they exercise their privacy rights."
The only one of these that I have an issue with is #5. I'm not sure that it is "fair" to the service provider. I could see arguments either way. It's definitely something that should be discussed. It seems like your issue is not with what the bill is trying to accomplish but with how it is implemented. Correct?
Your ad/tracking example is a little thin. I don't work with HTML, but I'm fairly certain this could be accomplished with simple cookies (see the sketch below). Even if it couldn't, there's a point where I feel it's okay to tell the user that they can't have their cake and eat it too. You (and a lot of other people) seem to be implying that the bill allows users to demand a service provider stop tracking the user. As far as I can tell, this is not the case. It allows a user to delete all personal information. So Techdirt can do all the internal tracking it wants as long as it deletes it when I ask. If this is wrong, please point to that section of the bill.
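Something like this rough Flask-style sketch, say (framework and names picked purely for illustration, obviously not what Techdirt actually runs), keeps the "hide ads" choice entirely in the reader's browser, so the site stores nothing about the reader server-side:

from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/set-ad-pref/<choice>")
def set_ad_pref(choice):
    # The preference lives in a cookie in the reader's browser only;
    # nothing is written to a server-side database or log here.
    resp = make_response("preference saved in your browser")
    resp.set_cookie("hide_ads", "1" if choice == "hide" else "0",
                    max_age=60 * 60 * 24 * 365)
    return resp

@app.route("/article")
def article():
    # Read the choice back from the cookie on each request.
    ads_off = request.cookies.get("hide_ads") == "1"
    return "article without ads" if ads_off else "article with ads"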
I've only been able to skim Jeff Kosseff's article, but based on my initial read it's not very good. Is he a friend of yours? You usually make much better references. He (like everyone else, it seems) speaks with a lot of hyperbole. For the sake of argument, I wanted to address some of his concerns.
1) Companies just spent two years – and billions of dollars – on GDPR
Yeah? That's how laws work. I don't want to be dismissive, but we could go on for days about GDPR. Either way I'm not sure GDPR has shown itself to be the death of the internet.
2) Many small companies must comply with CCPA.
Yup. This is kind of like proper network security. It costs money to have, and it costs a whole lot more to not have.
3) Many companies’ systems are not set up for CCPA
If they have $25 million in revenue, get 50% of their revenue from selling personal data, or know that 50,000 customers live in CA, they should probably have this.
4) CCPA fails to clearly define the roles of controllers and processors
There actually is something to this argument but again it's very hypothetical in nature. I imagine that most processors keep data in a bit of a walled garden. I would hope that processors are not selling data that is held for another company.
5) CCPA’s sales restrictions raise First Amendment concerns
Yes, they do. This will be interesting. I can't wait to see how this plays out.
6) CCPA opens the door to 49 more overlapping (and conflicting) state data protection laws, raising Dormant Commerce Clause concerns
Yes this stuff should be fixed. Please go into specifics about this rather than say the whole system is broken without putting in any effort to fix the situation.
7) CCPA provides another unclear cause of action for data breach victims
I would say CCPA does the opposite of this but what do I know.
8) Where are the benefits?
Do I even need to dignify this with a response?
9) CCPA will affect all industries, not just “the Internet”
Feature not a bug. Will not fix.
10) The ‘wait and see’ approach is impractical
THAT'S WHAT WE'VE BEEN DOING! It didn't work.
All of this sort of ignores the fact that the easiest way to get around all of this is to simply not store personal information, which, as you pointed out, the bill very clearly defines. Here's how I see most companies dealing with this: John Smith calls Umbrella Corp for all the personal information that they have on him. He uses their site (Deathmachines.com, a site to keep up on the latest bioengineered peacekeeping devices) all the time, but has never made an account under his real name. Umbrella Corp, which knows his secret addiction to zombie porn but never bothered to get his real name, says that they have no personal information under that name. The end.
You're selling me on this privacy bill
Sounds good to me! Maybe you lot should try a privacy-first model and not have the firehose open by default. Have opt-out by default rather than assume you have permission to track every move a user makes.
I get that it's more work, but hey, you have to earn users' trust before they'll volunteer their info. Check out this article by thoughtworks about the German principle of capturing only the data you need: https://www.thoughtworks.com/insights/blog/datensparsamkeit
False choice
This strikes me as a typical false choice from a business perspective. "What you propose is not a perfect solution for ALL scenarios and ALL actors AND with zero cost (or less). That's why I oppose your solution to this important and pressing issue. For now we're just stuck with the wild west and businesses should just continue to do whatever they want. (WOOHOO!)"
As a consumer I'll take a good solution until an even better one comes along. Or maybe we can start with this solution and just make it better and better over time.
Also, the issues affecting businesses tend to create new opportunities. If small businesses need a low cost solution to manage their website preferences within the legal framework, somebody on the internet is going to offer to manage that for them at a low, low, low, low monthly cost. There's no reason to anticipate 60,000 new unique solutions for every individual small business in California. By February 2020, WordPress is going to have a plugin like Everest Forms or Jetpack to cover most of these concerns.