Another Day, Another Company Scraping Photos To Train Facial Recognition AI

from the ALL-YOUR-FACE-ARE-BELONG-TO-US dept

If your face can be found online, chances are it's now part of a facial recognition database. These aren't the databases used by law enforcement, although those are bad enough: they're littered with millions of noncriminals, all part of a system that works worse than advertised 100% of the time.

The faces aren't in those databases (yet!), but they're being used to train facial recognition AI with an eye toward selling it to law enforcement and other government agencies. Another photo storage company has been caught using users' photos to fine-tune facial recognition software… all without obtaining consent from those whose faces became fodder for the tech mill.

“Make memories”: That’s the slogan on the website for the photo storage app Ever, accompanied by a cursive logo and an example album titled “Weekend with Grandpa.”

Everything about Ever’s branding is warm and fuzzy, about sharing your “best moments” while freeing up space on your phone.

What isn’t obvious on Ever’s website or app — except for a brief reference that was added to the privacy policy after NBC News reached out to the company in April — is that the photos people share are used to train the company’s facial recognition system, and that Ever then offers to sell that technology to private companies, law enforcement and the military.

This has been the running theme of the first five months of 2019. Users of popular photo apps and services are being notified belatedly -- and not by the companies performing the harvesting -- that their faces are an integral part of law enforcement machinery and/or the military-industrial complex.

Ever's oh-shit-we-got-caught statement doesn't do much to mollify anyone.

Doug Aley, Ever’s CEO, told NBC News that Ever AI does not share the photos or any identifying information about users with its facial recognition customers.

Lots of people would rather not be participants in creating surveillance tech. Most never seek employment at companies crafting products for law enforcement, intelligence agencies, and the US military. But without their knowledge, the photos they thought they were harmlessly sharing with family and friends have been used to make surveillance easier and more pervasive, if not actually any better.

Ever is just the latest. Prior to this, Flickr photos were swept up in a facial recognition data set compiled by IBM.

The photo is undeniably cute: a mom and a dad — he with a stubbly beard and rimless glasses, she with choppy brown hair and a wide grin — goofing around and eating ice cream with their two toddler daughters.

The picture, which was uploaded to photo-sharing site Flickr in 2013, isn't just adorable; with a bunch of different faces in various positions, it's also useful for training facial-recognition systems, which use artificial intelligence to identify people in photos and videos. It was among a million images that IBM harnessed for a new project that aims to help researchers study fairness and accuracy in facial recognition, called Diversity in Faces.

IBM also apologized for using people's photos for its data set without their permission. It said users were welcome to opt out at any time, but did not give users tools to find out whether their photos had been used. Nor is there any expeditious way to remove photos that do turn up, other than handing over your Flickr ID to IBM.

And if it's not a tech company harvesting photos to run AI tests, it's random internet users showing just how easy it is to compile a data set using other people's photos.

Tinder users have many motives for uploading their likeness to the dating app. But contributing a facial biometric to a downloadable data set for training convolutional neural networks probably wasn’t top of their list when they signed up to swipe.

A user of Kaggle, a platform for machine learning and data science competitions which was recently acquired by Google, has uploaded a facial data set he says was created by exploiting Tinder’s API to scrape 40,000 profile photos from Bay Area users of the dating app — 20,000 apiece from profiles of each gender.

The data set, called People of Tinder, consists of six downloadable zip files, with four containing around 10,000 profile photos each and two files with sample sets of around 500 images per gender.

Tinder's reaction was to call this a violation of its Terms of Service. But that determination doesn't undo the damage, nor does it make it impossible for someone else to do the same thing. Tinder users who spoke to TechCrunch weren't happy their photos -- some of which had never been uploaded anywhere outside of the app -- were being used by a person they don't know to perform research.
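To appreciate why a dump like this is so useful, it helps to see how little code stands between a folder of scraped photos and a trained model. Here's a minimal sketch in PyTorch -- the folder layout, file names, and training setup below are our own illustrative assumptions, not anything taken from the actual People of Tinder release:

    # A sketch of loading a folder-per-label photo dump and fine-tuning a
    # convolutional network on it. Everything here (paths, labels, model
    # choice) is a hypothetical stand-in, not the Kaggle uploader's code.
    import torch
    from torch import nn
    from torchvision import datasets, models, transforms

    # Standard preprocessing for an ImageNet-pretrained backbone.
    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
    ])

    # ImageFolder treats each subdirectory (e.g. photos/female, photos/male)
    # as a class label, so an unpacked scrape needs no annotation work at all.
    dataset = datasets.ImageFolder("photos/", transform=preprocess)
    loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

    # Swap the final layer of a pretrained CNN to match the scraped labels.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, len(dataset.classes))

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    criterion = nn.CrossEntropyLoss()

    # One training pass is enough to make the point: the hard part was
    # never the machine learning -- it was getting the photos.
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

That's the entire pipeline, which is exactly why the only meaningful checkpoint is consent at the point of collection.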

It's not that there are no legitimate uses for publicly-available photos. But transparency is key, and no one harvesting photos to train AI systems or perform research seems too concerned about being upfront with the people whose photos they're using. It's even worse in the case of Ever, where the company behind the app is the one developing facial recognition software on the side -- something that should make users question its intent. Did it really want to offer another photo service, or was the app just a way to gather faces for its real moneymaker?


Filed Under: ai, facial recognition, photos, social media
Companies: ever


Reader Comments



  1. Crinisen, 14 May 2019 @ 12:16pm

    IBM also apologized for using people's photos for its data set without their permission. It said
    users were welcome to opt out at any time, but did not give users tools to find out whether their
    photos had been used.

    Obviously they need a way to figure out if you are the person in the photo, so please turn on your web cam and slowly turn your head left, right, up, and down as far as you can...


  2. discordian_eris, 14 May 2019 @ 12:16pm

    I misread the title of this article, and seem to have found the truth instead. Instead of seeing it as facial recognition, I saw it as 'farcical recognition'. Now, suddenly, the whole thing makes sense.


  3. Anonymous Coward, 14 May 2019 @ 2:26pm

    With video surveillance cameras just about everywhere, it's a wonder that they're not feeding into facial recognition databases yet ... or are they?


  4. Anonymous Coward, 14 May 2019 @ 3:31pm

    Is it illegal to wear a mask yet?
    Other than in a bank, of course.


  5. Anonymous Coward, 14 May 2019 @ 4:50pm

    Re:

    The especially fun one was when Facebook rolled out a 'confirm you're human' log-in check that required tagged faces in photos uploaded by your friends. So even if you purposefully mis-tag your own photos to opt out, you're going in the system.


  6. Anonymous Coward, 15 May 2019 @ 7:18am

    Employers

    Employers are getting in on this act too. Large employers such as mine take photos of all employees and then do with them as they like. Of course, you can opt out -- by becoming unemployed. Yay.


  7. Bamboo Harvester, 15 May 2019 @ 8:19am

    Re:

    I recall a NYC case where the defense argued that wearing a ski mask, as the defendants did during a robbery, was "part of their culture."

    Didn't fly...


