Facial Recognition Company's Employees Abused Tech To Sexually Harass Coworkers

from the just-amazing-stuff-they're-doing-these-days-with-AI dept

Someone wants to out-evil Clearview, apparently. Now, there are a lot of questionable facial recognition tech vendors, but most have decided to cede the "most hated" title to Clearview. Another startup, though, seems determined to generate as much ill will as possible before it has even secured a prominent place in the market.

The unanswered question, I guess, is whether to direct your AI-enabled sociopathy at random individuals or your own coworkers. While Clearview has allowed police officers, private companies, authoritarian regimes, billionaires, and politicians to "run wild" with baseless searches of its 3 billion scraped images, a Silicon Valley company has taken a hands-off approach to internal "testing" of its tech. Joseph Cox has more details for Motherboard.

Verkada, a fast-growing Silicon Valley surveillance startup, equips its offices in downtown San Mateo, California, with its own state-of-the-art security cameras.

Last year, a sales director on the company's sales team abused their access to these cameras to take and post photos of colleagues in a Slack channel called #RawVerkadawgz where they made sexually explicit jokes about women who worked at the company, according to a report in IPVM, which Motherboard independently verified and obtained more information about.

"Face match… find me a squirt," the sales director wrote in the company Slack channel in August 2019, according to one screenshot obtained by Motherboard.

Charming. And one more reason why everyone should be extremely wary of companies promising to do only good things with their surveillance products. If employees can abuse the tech, anyone can. Handing it over to entities with the power to take away people's freedom just compounds the problems. If the tech has already been abused before it even ships, are we supposed to believe customers won't do the same horrible things once they have it in their possession?

Verkada's proprietary "face search" prompted a cascade of abusive behavior by Verkada employees. The tech can pinpoint a single face in a crowd. A group of male employees used it to sexually harass female employees, turning images captured by the company's own surveillance cameras into their personal degradation playground.

According to three sources who worked at Verkada at the time, the group of men posted sexually graphic content about multiple female employees in similar Slack messages.

This may say more about people than about the tech, but the refusal to believe shitty people like this exist everywhere -- even at government agencies -- should be treated as a fatal flaw. It's impossible to prevent all abuse of tech like this, but Verkada's weak response suggests it won't be dumping customers who engage in similar behavior.

After the Slack channel was reported to the company's Human Resources team in February, Verkada's CEO Filip Kaliszan announced in a company all-hands meeting that an undisclosed number of employees active in the Slack channel were given the choice between leaving the company or having their share of stock reduced. All of them chose the latter option, and the Slack channel was removed, according to four employees who worked at Verkada during the time. The person who posted the screenshot still works at Verkada.

Verkada's killer app is its "face search." Rather than making users manually scrub through footage for an individual, its algorithm scans recordings for any face a user uploads and returns every match, letting users track a person across multiple cameras and recordings. Its roster of clients includes law enforcement agencies.
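
Verkada hasn't published how its matching works under the hood, but the general technique is no secret. Here's a minimal sketch of a "face search" using the open-source face_recognition library -- the target.jpg photo, the frames/ directory of exported stills, and the 0.6 match cutoff are illustrative assumptions on our part, not details of Verkada's product.

    # Toy "face search": find frames that appear to contain an uploaded face.
    # Requires: pip install face_recognition (which pulls in dlib and numpy).
    from pathlib import Path

    import face_recognition

    # Encode the uploaded target face as a 128-dimensional vector.
    # (Assumes target.jpg contains exactly one visible face.)
    target_image = face_recognition.load_image_file("target.jpg")
    target_encoding = face_recognition.face_encodings(target_image)[0]

    # Scan still frames exported from camera recordings for similar faces.
    for frame_path in sorted(Path("frames").glob("*.jpg")):
        frame = face_recognition.load_image_file(str(frame_path))
        for encoding in face_recognition.face_encodings(frame):
            # Smaller distance = more similar; 0.6 is the library's default cutoff.
            distance = face_recognition.face_distance([target_encoding], encoding)[0]
            if distance <= 0.6:
                print(f"Possible match: {frame_path} (distance {distance:.2f})")

The unsettling part is how little code this takes: once faces are boiled down to vectors, "find this person everywhere" is just a nearest-neighbor search over footage.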

The company's first comment stated that it "does not tolerate sexual harassment." But giving harassers a choice between quitting and taking a theoretical hit to their stock isn't exactly a model of intolerance. Fortunately, the company revised its position once its employees' abuse of the tech was exposed to a national audience.

Upon a further review of this incident and feedback from several employees about how it was initially addressed, we have terminated the three individuals who instigated this incident, engaged in egregious behavior targeting coworkers, or neglected to report the behavior despite their obligations as managers.

All well and good, but this incident happened in 2019, and Verkada management was aware of it as early as August of last year. Even after the report to HR, the only punishment handed out was the quit-or-lose-some-stock choice described above. When the incident was raised again at an "all hands" meeting in March of this year, the management team did nothing further. It wasn't until Motherboard asked Verkada for comment prior to publication that the company finally fired anyone over it.

If the product is being abused by the company that makes it, it will be abused by its customers. This fact cannot be argued. Combine this tech with the databases full of personal info government agencies have access to and you've got dozens of potential misuse cases on your hands. Verkada employees can't be trusted with their own tech. No one else should be trusted with it either.

Filed Under: creep, facial recognition, privacy, sexual harassment, surveillance
Companies: verkada


Reader Comments

  • Upstream (profile), 9 Nov 2020 @ 2:53pm

    Trying not to be repetitious, but...

    see comment under It Took Just 5 Minutes Of Movement Data To Identify 'Anonymous' VR Users

  • Uriel-238 (profile), 9 Nov 2020 @ 3:56pm

    I remember this being a problem with Google

    A few(?) technicians at Google were caught using their access to stalk colleagues, ex-lovers and such, which was one of the first alarm bells that Google, in order to "don't be evil," needed to do more to prevent people from doing evil with Google's technology.

    I'd like to imagine it's possible to make a system in which a person's private data is accessible only to them, and otherwise flows only through robust anonymization to trend-analysis computers (say, if personal data were intermixed with that of five hundred other persons before being passed to the statistic counters). As I Am Not A Security Expert, I don't know if that would work.
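
    A toy sketch of that batching idea, assuming a hypothetical list of (user_id, category) records and the five-hundred-person pool mentioned above; real anonymization schemes (e.g. differential privacy) are far more careful than this:

        # Toy aggregate-before-counting sketch: release category counts only
        # once enough users are pooled, plus Laplace noise so the output
        # can't reveal whether any single person is in the data.
        from collections import Counter

        import numpy as np

        def anonymized_counts(records, pool_size=500, noise_scale=2.0):
            # records: (user_id, category) pairs pooled from many users.
            if len({uid for uid, _ in records}) < pool_size:
                return {}  # not enough people to hide among; release nothing
            counts = Counter(category for _, category in records)
            rng = np.random.default_rng()
            return {c: n + rng.laplace(0, noise_scale) for c, n in counts.items()}

    Even this toy version shows the tradeoff Upstream describes downthread: the pooling and noise that protect individuals also blur the statistics.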

    • Anonymous Coward, 9 Nov 2020 @ 4:10pm

      Re: I remember this being a problem with Google

      robust anonymization = unicorn

      • Scary Devil Monastery (profile), 10 Nov 2020 @ 12:18am

        Re: Re: I remember this being a problem with Google

        "robust anonymization = unicorn"

        Not quite. It's possible, but the good old security triangle still applies: out of security, convenience, and economy, you can have any two, never all three.

        So you can have good security and convenience, but it will cost you. Or you can have cheap, inconvenient security which is decent. Or you can have that third choice, which the average user opts for: shit security which is convenient to use and cheap.

        Anonymization is easy if all you need to do is be a random IP number accessing a webpage: use a VPN and have a scriptblocker and an anti-fingerprinting add-on installed.
        But the very second you actually log on anywhere, all the security of your information rests with the party keeping your login identity. This is the part where Apple and Google become a risk, because on a smartphone anything you do goes through your account with them.

        • Upstream (profile), 10 Nov 2020 @ 2:38am

          Re: Re: Re: I remember this being a problem with Google

          You are correct about some forms of anonymization. VPNs with appropriate browser security / anonymization extensions and Tor both can be fairly effective. I was thinking more in terms of the comment's subject matter: anonymization of large amounts of personally identifiable data that has already been collected while retaining the usefulness of said data for trend and other analysis. From what I understand, to effectively anonymize such data you must also destroy most, if not all, of the data's usefulness for analysis.

    • Anonymous Coward, 9 Nov 2020 @ 6:52pm

      Re: I remember this being a problem with Google

      I, too, remember when Google was trying to don't be evil. Those were the days.

  • Uriel-238 (profile), 9 Nov 2020 @ 4:04pm

    Killing face recognition abuses

    Is it possible to make face recognition useless to large companies and law enforcement? Say, by having its use in detection render any downstream evidence inadmissible?

    Granted, we still have parallel construction / evidence laundering, which is its own problem.

    Maybe abolish the justice system since we can't seem to trust human beings to hold authority over others without abusing the power?

    • Scary Devil Monastery (profile), 10 Nov 2020 @ 12:25am

      Re: Killing face recognition abuses

      "Is it possible to make face-recognition useless to large companies and law enforcement?"

      Tricky. The tech is out there and we'll have to live with it existing. Much like encryption allows bad people to frustrate police, cars allow bad people to move vast distances quickly, and basic chemistry gives bad people the knowledge to make explosives and toxic gas from the average inventory of a well-stocked larder and cleaning supply closet.

      "Say by making its use in detection rendering any evidence downstream inadmissible?"

      Privacy law with teeth is the way forward, yes. I'm drawing a simile to the concept of "presumed innocence" here: no matter how convenient it would be for law enforcement to just arrest everyone and waterboard them until the real criminal confessed, we don't do that in a civilized nation. If collecting and using personal data is outright illegal except through very stringently held exceptions, this is a technology which might not be quite so tempting to abuse.

      You'd think the historical examples of the Reich, Soviet Russia, the DDR, and China would suffice to show people the hazards of allowing governments -- or corporations -- to register and tally too much personal information. The cost always ends up being too damn high, and it is always paid by innocents.

  • fairuse (profile), 9 Nov 2020 @ 4:42pm

    Crazy will find a way

    It's a human nature problem. A guy with a crazy "get that b!†©h" problem will find a way to stalk, harass, and injure.

    The "boys will be boys" approach to the company's internal problem highlights this: "The facial recognition system will be abused by anyone, anywhere, for any reason, even if it is illegal and/or immoral." - signed me

    Somebody needs HR to grow a pair. Oops, I forgot: HR has the crazy job of protecting the company, not the employee. Mission-critical guys got a ruler slap on the hands for getting caught. I've seen this my entire tech life. I've gone fishing.

  • Kriegerbot, 9 Nov 2020 @ 7:47pm

    Guess we are not bringing them on

    Wow, I have been evaluating these guys as an option for replacing an IP camera vendor we dropped.

    Not anymore. Jesus.

  • That One Guy (profile), 9 Nov 2020 @ 8:07pm

    'Oh if we HAVE to...'

    The fact that it took significant public backlash for them to hand out an even kinda-serious punishment is pretty damning of the company. It suggests that those in charge of punishments didn't see anything wrong with what had been done, and were merely going through the motions to make it look like they cared until it reached the point where they had to do something more.

    • Valley Vet, 10 Nov 2020 @ 2:50am

      Re: 'Oh if we HAVE to...'

      This is inaccurate reporting, though. As has been reported elsewhere, before the media became aware of it, Verkada had already stripped these guys of their options. This was done, it seems, as soon as senior management became aware (which was a few months after the incident itself). The general consensus is that the punishment was mild, but think about it: at least two of them, we know, were senior director level, and they were within the first couple of hundred hires. Those shares will likely, given Verkada's market valuation, be worth hundreds of thousands of dollars when they IPO. That sounds like a pretty major sanction to me. On top of that, due to the subsequent media backlash, they've also lost their jobs.

      • That One Guy (profile), 11 Nov 2020 @ 11:41am

        Re: Re: 'Oh if we HAVE to...'

        Those shares will likely, given Verkada's market valuation, be worth hundreds of thousands of dollars when they IPO. That sounds like a pretty major sanction to me.

        Can't say I'm terribly impressed, honestly. Taking away something that might be worth a bunch of money down the line may sting, but it doesn't really impact them right then and there. And if they're still employed, the message it sends is that such behavior might be frowned upon, but not enough to be a dealbreaker for the company.

        On top of that, due to the subsequent media backlash, they've also lost their jobs.

        Which was kinda my point: it took outside pressure for them to get rid of the guilty men. Until that point they merely offered to let them quit or lose something that might be valuable in the future, which does not make for a good look for the company.

        • Anonymous Coward, 2 Dec 2020 @ 2:52pm

          Re: Re: Re: 'Oh if we HAVE to...'

          The previous replier obviously works at Verkada and was friends with the people in question -- not surprising, considering it's all a big clique. As an internal person who was genuinely offended by what happened, it's a big punch in the face that upper management tried to brush it under the rug for so long, and so aggressively. It was totally mishandled, and that is a testament to the leadership there. Unfortunately, the shady tactics spill over to not just the culture, but the way the product is sold and represented, as well as how the company recruits. If it is Verkada, don't trust it.

    • Uriel-238 (profile), 10 Nov 2020 @ 12:17pm

      Perks of the surveillance economy

      Edward Snowden talked about how cheesecake pics and other intimate uncoverings were considered an unspoken perk within the NSA offices, and notable examples were passed around the office for everyone's enjoyment.

      This is also consistent with the TSA's nude scans, which were later (allegedly) corrected by having the machine read the scans and replace private parts with "nothing to see here" icons. I suspect that if the airport security office has the option of turning those off, they're disabled as soon as the scanners are installed.

      Private use of surveillance facilities has always been a part of the culture, so long as the additional processing isn't too expensive and isn't moving agents around or turning satellites.

      It's going to be a challenge to make respect for privacy the new normal; either that, or we're going to have to find a way for machines, not human beings, to process the take.
