Surprising, But Important: Facebook Sorta Shuts Down Its Face Recognition System

from the good-to-see dept

A month ago, I highlighted how Facebook seemed uniquely bad at taking a long-term view and publicly committing to doing things that are good for the world, but bad for Facebook in the short run. So it was a bit surprising earlier this week to see Facebook (no, I'm not calling it Meta, stop it) announce that it was shutting down its Face Recognition system and (importantly) deleting over a billion "face prints" that it had stored.

The company's announcement on this was (surprisingly!) open about the various trade-offs, both for society and for Facebook, though (somewhat amusingly) throughout the announcement Facebook repeatedly highlights the supposed societal benefits of its facial recognition system.

Making this change required careful consideration, because we have seen a number of places where face recognition can be highly valued by people using platforms. For example, our award-winning automatic alt text system, that uses advanced AI to generate descriptions of images for people who are blind and visually impaired, uses the Face Recognition system to tell them when they or one of their friends is in an image.

[....]

But the many specific instances where facial recognition can be helpful need to be weighed against growing concerns about the use of this technology as a whole. There are many concerns about the place of facial recognition technology in society, and regulators are still in the process of providing a clear set of rules governing its use. Amid this ongoing uncertainty, we believe that limiting the use of facial recognition to a narrow set of use cases is appropriate.

One interesting tidbit buried in this is that only about 1/3 of Facebook users opted in to use Facebook's facial recognition tool (despite the company pushing it heavily on users). At the very least, it showed that a large number of users weren't comfortable with the technology.

There's also the issue that, while they're turning off the tool and deleting the facial prints, the NY Times notes they're hanging on to the algorithm that was built on all those faces:

Although Facebook plans to delete more than one billion facial recognition templates, which are digital scans of facial features, by December, it will not eliminate the software that powers the system, which is an advanced algorithm called DeepFace. The company has also not ruled out incorporating facial recognition technology into future products, Mr. Grosse said.

That's resulted in some (expected) amount of cynicism from Facebook's critics that Facebook "got what it wanted" and is now moving on. However, I think that's a bit silly. Facebook could have easily kept the facial recognition program going. Of all the regulatory pressures the company is facing, this was way down the list and barely on the radar.

And, to make a bigger point, here's a case where the company is actually doing the right thing: turning off a questionable product and deleting a ton of data it collected. And we should at least encourage both Facebook and other companies to be willing to make that decision based on recognizing the societal risks, and without waiting around until they're forced to do so.


Filed Under: ai, data, deepface, facial recognition, privacy, society, trade-offs
Companies: facebook


Reader Comments




  1. Koby (profile), 3 Nov 2021 @ 12:36pm

    Rarely Is It Not About Money

    "only about 1/3 of Facebook users opted in to use Facebook's facial recognition tool"

    I'll betcha that another 1/3 of Facebook users found the tool to be really creepy and interacted with FB less because of it. And I think we all know what happens when FB sees its metrics declining.


  2. Anonymous Coward, 3 Nov 2021 @ 1:06pm

    Sure they are, let's start trusting suck after he's lied all this time.


  3. Mike Masnick (profile), 3 Nov 2021 @ 1:35pm

    Re:

    Who said anything about trusting him?


  4. Mike Masnick (profile), 3 Nov 2021 @ 1:36pm

    Re: Rarely Is It Not About Money

    "I'll betcha that another 1/3 of Facebook users found the tool to be really creepy and interacted with FB less because of it."

    Uh, no way that actually happened. Why are you making shit up?


  5. Anonymous Coward, 3 Nov 2021 @ 1:40pm

    Re: Rarely Is It Not About Money

    Don't you have summer school work to do?


  6. Anonymous Coward, 3 Nov 2021 @ 2:35pm

    Faces have too few bits to be pseudounique at scale

    I suspect that they found that, at large numbers, facial recognition had far too many collisions, even if they resigned themselves to accepting identical twins as an edge case. If there just aren't enough distinguishing bits in a "noisy" real-life environment, they would find their data turning to shit for identification purposes. Unlike police forces and the facial recognition merchants everywhere, they have no incentive to stick their head in the sand about it.

    Now, what doesn't have that issue is procedural face generation. You don't need to map real faces to identities, just generate convincing ones. Once the model is trained, they can dump their putrefying dataset.

    That is my suspicion anyway: that they decided to use their algorithm for what it turned out to be good at. That has happened several times in Silicon Valley; one of Google's attempts at making a street-number reader more tolerant of real-world distortions didn't work so well for that purpose, but turned out to break letter-style CAPTCHAs.
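
    To make the "too few bits" point concrete, here is a minimal back-of-the-envelope sketch using the standard birthday-problem approximation. The effective-entropy figures are purely illustrative assumptions (neither the comment nor Facebook gives real numbers), and real face matchers compare embeddings against a distance threshold rather than testing exact template equality, so this only captures the scaling intuition:

    # Birthday-problem estimate of face-template collisions among n people.
    # The entropy values below are illustrative assumptions, not published figures.
    def expected_collisions(n_people: int, effective_bits: int) -> float:
        """Expected colliding pairs when n_people map into a space of
        2**effective_bits distinguishable templates (~ n^2 / 2^(bits+1))."""
        return n_people * (n_people - 1) / (2.0 * 2.0 ** effective_bits)

    n = 1_000_000_000  # roughly the "over a billion face prints" from the article
    for bits in (30, 40, 50, 60):  # hypothetical effective entropy per template
        print(f"{bits} bits -> ~{expected_collisions(n, bits):,.0f} colliding pairs")

    Even at a fairly generous 50 effective bits, a billion enrolled faces would still be expected to yield a few hundred indistinguishable pairs; at 30 bits it is hundreds of millions.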


  7. Koby (profile), 3 Nov 2021 @ 3:35pm

    Re: Re: Rarely Is It Not About Money

    I'm confident that FB didn't secretly divulge to you the metrics comparing the folks that did opt in versus the folks that didn't. For a corporation that attempts to track everything, and then bases its decisions upon that data, I would find it surprising if FB didn't conduct a study on its users, or conducted one which showed that it wasn't losing engagement and yet decided to abandon its facial recognition program out of the goodness of its heart. And then privately messaged you with the data just in case someone doubted their sincerity. You don't have to shill for them so hard, Maz, geez!


  8. Capt ICE Enforcer, 3 Nov 2021 @ 3:48pm

    Of course Facebook is going to shut down the facial recognition. They sold it to another company named Meta. Also, they already have everyone's faces for each moment in the past 30 years. Not to mention all the kids not born yet.


  9. PaulT (profile), 4 Nov 2021 @ 12:47am

    Re: Rarely Is It Not About Money

    I'll take the bet if you can provide evidence that comes from somewhere other than your rear end. Given your constant inability to deal with reality, especially on the subject of sites that you've been bitter about for years since they kicked your klan buddies off their property, I don't think we should take your guesswork seriously.


  10. PaulT (profile), 4 Nov 2021 @ 12:50am

    Re: Re: Re: Rarely Is It Not About Money

    "I'm confident that FB didn't secretly divulge to you the metrics between the folks that did opt in, versus the folks that didn't"

    So, you were talking out of your ass when you claimed that over 1 billion people used FB less because of this single issue, because nobody has the data, including you? Gotcha.

    Whenever you have evidence to back up your claims, we're all ears, but until then stop guessing; you're usually wrong even on issues where evidence does exist.


  11. PaulT (profile), 4 Nov 2021 @ 12:52am

    Re:

    "They sold it to another company named Meta"

    No, they didn't, but the details of what actually happened seem to have been caught up in a misinformation campaign.


  12. Strawb (profile), 4 Nov 2021 @ 12:57am

    Re: Re: Re: Rarely Is It Not About Money

    "I'm confident that FB didn't secretly divulge to you the metrics between the folks that did opt in, versus the folks that didn't."

    They didn't divulge them to you, either, so please stop making shit up.


  13. Anonymous Coward, 4 Nov 2021 @ 6:17am

    "There's also the issue that, while they're turning off the tool and deleting the facial prints, the NY Times notes they're hanging on to the algorithm"

    I mean, developers never toss out old code. You never know when you'll be looking at some other problem, and go "wait, how did we fix that last time... ah, here it is".


  14. nasch (profile), 4 Nov 2021 @ 7:57am

    Re:

    And even if they "deleted" it, to actually get rid of it they would have to go into source control history and remove it (considered bad practice and only to be done in extreme cases), and also delete it from all their backups (probably easier said than done).


  15. Anonymous Coward, 4 Nov 2021 @ 11:53am

    Re: Re:

    Yeah, there are basically two main reasons why something gets outright deleted from version control:

    1. Credentials or other sensitive information accidentally included.
    2. Legal obligations, be it a lawsuit or because some dumbass sicko stuck outright child pornography in it.


  16. Anonymous Coward, 4 Nov 2021 @ 12:38pm

    Re: Rarely Is It Not About Money

    For a boy looking down the barrel of a full class-load of summer school you sure got a lot of time to spare.


