Re:
GDPR isn't about privacy; it is about processing. Even if the pictures are public, that does not authorise anyone to process them.
The regulation does allow some cases where processing is permitted without agreement from the subject: specifically for domestic purposes (so you can keep an address book for your own family use with names, addresses and photos scraped from the Internet if you wish).
Any other processing requires approval from the subject.
In effect, GDPR makes the question not one about ownership but about purpose and intention.
Moral rights (a concept not used in UK law, as far as I know, but certainly valid in some European legal codes) are more about ownership, I believe.
GDPR
Whether or not it violates the Illinois law, this clearly violates GDPR and also moral rights.
My publishing my photo in public does not give you the right to process it. Personal data is owned by the person it relates to, and permission must be granted to process it.
(untitled comment)
That may be the approach in your country -- I am happy to believe you. But here in Europe the approach is that we own data about ourselves. If Facebook, or anyone else, finds out or even creates some data about us, it cannot monetise that data without our permission, because we own it, not them.
Just as when I take my car to be serviced, the garage can't monetise my car while they are servicing it -- I still own it.
It's called "stealing"
"This was a company that recognized it had valuable information and was trying to figure out the best way to monetize it. There isn't much of a scandal there, though some people seem to think there is."
The "scandal" is because Facebook didn't own the information. Just because it has the information doesn't mean it has any right to monetize it. If it had been figuring out how it could offer to help the users monetize their information (and take a cut), that would have been fine -- and we would all have got some insight into what our information was worth, and been able to decide whether we found the offer acceptable.
Not time to consider pre-emption
It would certainly be a disaster if the bill were to add pre-emption of states' laws at this time. The time to consider pre-emption is when there has been experience (good and bad) of various attempts. That should be after a few years of experience with GDPR, plus some insight into the effects of different states' approaches. Then Congress can consider pre-empting states in a bill built on experience.
Re: Re: Re: News reporting
The easiest way is to allow the person concerned to issue a form of "takedown" notice to Google. Yes, they would need to pay people, possibly a lot of people, to assess these manually -- just as their competitors in the personal-dossier business today have to. In fact, Google's AI is almost certainly already good enough to apply most of these legal rules fairly automatically.
P.S. Actually, in the '80s, I did start creating an Internet search engine myself. It was for Gopher sites and some other pre-WWW technologies. I stopped fairly quickly because it was hard, and other people in my company told me they were way ahead of me in developing something similar -- which eventually became AltaVista.
Re: News reporting
Sorry, Mike, you are wrong to say this has anything to do with news reporting. Google is not a news reporter; it is an indexer. If the suggestion were that news reports should be taken down, that would be completely different.
There are many commercial companies that create dossiers on people (credit reference agencies, journalists, recruiters, private investigators, etc.). They are regulated by law and, in some countries, there are laws limiting what data can be included in their dossiers (such as old credit data, expired prison sentences, etc.). You can argue about whether those laws should exist, but in many countries they do exist.
Google is a commercial company which competes against those companies by creating a dossier of mentions on the Internet when you give it a name. As such, it should be bound by the same laws as its competitors, whether those laws are right or wrong.
The issue is that Internet search for people is a very different thing from Internet search for other things. Searching for people is, and should be, restricted by various privacy and personal data laws (different in different countries).
I am not arguing for RTBF laws. But I am arguing that where such laws exist they apply to Google searches just as much as to any other form of personal information.
However, they should not apply to searches made in other countries with different laws: the laws of the searcher should apply, not the laws of the subject of the search.
Changing the narrative
"No matter how it's pitched in the future, it's important to remember law enforcement isn't somehow 'owed' every tech advance that comes its way."
This is a key point. Just because we can do something doesn't mean we should.
Law enforcement have never had it so good. There is so much more evidence available to them now (photos and videos, criminal plans discussed in emails, drug dealer contacts stored in phones, ...).
As we all know, policing is a difficult job. Unfortunately for them, we need it to remain so in order to protect our civil liberties (such as trade unions, effective protest, and campaigns for major societal change). Some of the simplifications that the digital world has introduced to their job need to be rolled back to protect civil liberties.
"Supporters will say facial recognition tech makes both officers and the public safer."
As you say, this may be true. But so would doing away with trials and imprisoning anyone merely on a police officer's suspicion.
Just because we can do something doesn't mean we should.
How do we fight this?
Two possible routes (I am sure there are others):
The law. Allow for data to be marked, by the owner or source, as "anonymised" (whether any technical steps are taken or not) and make it a criminal offence either (i) to attempt to de-anonymise it, or (ii) to correlate such data with any other data. This should be enough to prevent (for example) insurance companies using such data to set premiums, and it might even be enough to prevent major commercial data brokers from using the data (although steps would have to be taken to make sure investigation and penalties are severe enough to prevent data-washing, possibly abroad). Of course, it has no effect on governments, nor on commercial deals where the source is not willing to mark the data as "anonymised".
Publish standards (NIST?) for anonymisation. Maybe not so much specific algorithms as principles. For example, if identifiers are to be replaced by meaningless numbers, the identifier-to-number mapping must change more frequently than an adversary is likely to be able to gather enough data to de-anonymise. These would have to be based on research. For example, based on the research in the article, a database of tweets might need to change the mapping of the profile name every 29 tweets, or something. Or a database of ANPR data showing traffic movements might have to change the vehicle pseudo-identity every hour. (A rough sketch of this rotating mapping follows below.)
These two steps would also have to be accompanied by greater public awareness of de-anonymisation. The legal route is particularly important in making sure that companies cannot claim something is "anonymised" unless there are ways for the data subjects to actually enforce it.
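As a purely illustrative sketch of the rotating-mapping principle described above (the class name, the token scheme and the reuse of the 29-tweet threshold are assumptions for illustration, not part of any standard):

```python
import secrets
from collections import defaultdict

# Sketch: map each real identifier to a meaningless token, and discard the
# mapping after a fixed number of records so that no single pseudonym
# accumulates enough linked records to be de-anonymised.
ROTATE_AFTER = 29  # hypothetical threshold, taken from the tweet example

class RotatingPseudonymiser:
    def __init__(self, rotate_after=ROTATE_AFTER):
        self.rotate_after = rotate_after
        self.mapping = {}               # identifier -> current pseudonym
        self.counts = defaultdict(int)  # records seen under current pseudonym

    def pseudonym_for(self, identifier):
        # Rotate: generate a fresh token on first sight of an identifier,
        # or once the current token has been used for rotate_after records.
        if identifier not in self.mapping or self.counts[identifier] >= self.rotate_after:
            self.mapping[identifier] = secrets.token_hex(8)
            self.counts[identifier] = 0
        self.counts[identifier] += 1
        return self.mapping[identifier]

# Usage: pseudonymise a stream of (user, tweet) records before release.
p = RotatingPseudonymiser()
for i in range(60):
    print(p.pseudonym_for("alice"), f"tweet {i}")
# "alice" appears under three different pseudonyms across the 60 tweets,
# limiting how much data an adversary can link to any one pseudonym.
```

The same idea would apply to the ANPR example, except that the rotation would be keyed on elapsed time (for example, one hour) rather than on a record count.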
Techdirt has not posted any stories submitted by Graham Cobb.