AT&T, Verizon Feign Ethical Outrage, Pile On Google's 'Extremist' Ad Woes

from the manufactured-outrage dept

So you may have noticed that Google has been caught up in a bit of a stink in the UK over the company's YouTube ads being presented near "extremist" content. The fracas began after a report by the Times pointed out that advertisements for a rotating crop of brands were appearing next to videos uploaded to YouTube by a variety of hateful extremists. It didn't take long for the UK government -- and a number of companies including McDonald's, BBC, Channel 4, and Lloyd's -- to engage in some extended pearl-clutching, proclaiming they'd be suspending their ad buys until Google resolved the issue.

Of course, much like the conversation surrounding "fake news," most of the news coverage was bizarrely superficial and frequently teetered toward the naive. Most outlets were quick to malign Google for purposely letting extremist content get posted, ignoring the fact that the sheer volume of video content uploaded to YouTube on a daily basis makes hateful-idiot policing a Sisyphean task. Most of the reports also severely understated the complexity of modern internet advertising, where real-time bidding and programmatic placement mean companies may not always know which brand ads show up where, or when.

Regardless, Google wound up issuing a mea culpa stating they'd try to do a better job of keeping ads for the McRib sandwich far away from hateful idiocy:

"We know advertisers don't want their ads next to content that doesn’t align with their values. So starting today, we’re taking a tougher stance on hateful, offensive and derogatory content. This includes removing ads more effectively from content that is attacking or harassing people based on their race, religion, gender or similar categories. This change will enable us to take action, where appropriate, on a larger set of ads and sites."

As we've noted countless times, policing hate speech is a complicated subject, where the well-intentioned often stumble down the rabbit hole into hysteria and overreach. Amusingly though, AT&T and Verizon -- two U.S. brands not exactly synonymous with ethical behavior -- were quick to take advantage of the situation, issuing statements that they, too, were simply outraged -- and would be pulling their advertising from some Google properties posthaste. This resulted in a large number of websites regurgitating said outrage with a decidedly straight face:

"We are deeply concerned that our ads may have appeared alongside YouTube content promoting terrorism and hate," an AT&T spokesperson told Business Insider in a written statement. "Until Google can ensure this won’t happen again, we are removing our ads from Google’s non-search platforms."

"Once we were notified that our ads were appearing on non-sanctioned websites, we took immediate action to suspend this type of ad placement and launched an investigation," a Verizon spokesperson told Business Insider. "We are working with all of our digital advertising partners to understand the weak links so we can prevent this from happening in the future."

Of course, if you know the history of either company, you should find this pearl-clutching a little entertaining. In just the last few years, AT&T has been busted for turning a blind eye to drug dealer directory assistance scams, ripping off programs for the hearing impaired, defrauding government programs designed to shore up low-income connectivity, and actively helping "crammers" by making scam charges on consumer bills harder to detect. Verizon, recently busted for covertly spying on internet users and defrauding cities via bogus broadband deployment promises, isn't a whole lot better.

That's not to say that all of the companies involved in the Google fracas are engaged in superficial posturing for competitive advantage. Nor is it to say that Google can't do more to police the global hatred brigades. But as somebody who has spent twenty years writing about these two companies specifically, I find the idea that either gives much of a shit about its ads showing up next to hateful ignoramuses laughable. And it was bizarre to see an ocean of news outlets just skip over the fact that both companies are pushing hard into advertising themselves, with completed or looming acquisitions of Time Warner, AOL and Yahoo.

Again, policing hateful idiocy is absolutely important. But overreach historically doesn't serve anybody. And neither does pretentious face-fanning by companies looking to use the resulting hysteria for competitive advantage.


Filed Under: ads, extremism, moral panic, pearl clutching, search, video
Companies: at&t, google, verizon, youtube


Reader Comments



  1. Ninja (profile), 24 Mar 2017 @ 4:33am

    Not to mention Google at the very least issued a mea culpa. The other two tried to spin things as awesome and positive, or simply pretended it didn't exist when caught with their pants down. Having ads beside extremist content will be the least problematic issue for companies that are OK with destroying your privacy via stealth supercookies and the recording and selling of browsing habits, all without any way to opt out.

  2. Anonymous Coward, 24 Mar 2017 @ 6:07am

    "non-search platforms"
    read: platforms we are not in open competition with.

  3. Anonymous Coward, 24 Mar 2017 @ 6:08am

    Re:

    *are in open competition with

  4. Anonymous Coward, 24 Mar 2017 @ 6:26am

    How much pressure will Google stand before they switch to being a gatekeeper rather than an open publishing platform?

  5. Anonymous Coward, 24 Mar 2017 @ 6:34am

    Re:

    You have a strange definition of open.

  6. Anonymous Coward, 24 Mar 2017 @ 6:41am

    Re: Re:

    Open, as in open for anyone to publish their content on.

  7. Anonymous Coward, 24 Mar 2017 @ 7:03am

    Relevant XKCD
    https://xkcd.com/1425/


    "We want you to flag every video that has our intellectual property in it."
    Google: "Sure, give us a few hours."
    "Now we want you to flag every video that is evil."

  8. TheResidentSkeptic (profile), 24 Mar 2017 @ 7:17am

    Look on the bright side

    ... we don't have to suffer through their horrible ads for a while.

  9. Anonymous Coward, 24 Mar 2017 @ 7:32am

    Re: Look on the bright side

    Youtube has ads?

  10. anonymous, 24 Mar 2017 @ 7:52am

    most effective ad deterrent

    This means that all we have to do in order to avoid seeing obnoxious ads is make some tongue-in-cheek "hate" video, share it around with our friends to make sure it's in our feed, and voila, no more big media ads. We just have to find the keywords that Google uses to categorize them.

  11. Anonymous Coward, 24 Mar 2017 @ 8:16am

    Any excuse to have a go at Google! And let's face it, whilst Google is under the spotlight, the true asshole companies -- AT&T, Comcast, Verizon, etc. -- are being left alone. Their mistake is to be the ones throwing the stones! They would actually be better off keeping shtum!

  12. Anonymous Coward, 24 Mar 2017 @ 8:32am

    Re:

    Wait, I thought Mike was a Google shill?

  13. Richard (profile), 24 Mar 2017 @ 8:34am

    McRib

    Regardless, Google wound up issuing a mea culpa stating they'd try to do a better job of keeping ads for the McRib sandwich far away from hateful idiocy:

    I am sure that McD is quite happy to sell (halal) burgers to jihadis - just like Lenin's capitalist who would source his own noose.

    Of course they are presumably bothered about being labelled "Islamophobic" for advertising the McRib (a pork product, in case anyone hadn't noticed) to Muslims.

  14. This comment has been flagged by the community.
    Anonymous Coward, 24 Mar 2017 @ 8:42am

    Missing the problem

    Of course the real problem isn't Islam related "hate" videos. Because both the Jihadi ones and the anti-islam ones have the virtue of broadly telling the truth about Islam and Jihad. It is the likes of Bush, Obama, Cameron, Khan and May that lie about it. Those lies are the problem because they spread complacency about the issue.

    Don't listen to May and her crowd. Listen to ex-muslims and non-muslims who live in muslim majority countries - they know the truth.

  15. UniKyrn, 24 Mar 2017 @ 8:47am

    McAT&GoApple want to be the only company left standing.

    I keep thinking about that silly line from "Demolition Man": "But all restaurants are Taco Bell, they won the franchise wars". Yeah, it becomes less funny every year.

  16. Anonymous Coward, 24 Mar 2017 @ 10:14am

    Perhaps the biggest problem is that the establishment treats left-wing extremists and right-wing extremists very differently for doing exactly the same thing.

    Twitter is notorious for enforcing this double-standard, with Facebook not far behind, and now there is increasing pressure on Youtube to enact similar left-leaning censorship.

    The DMCA has been the most potent weapon for censoring YouTube content. Right-wing polemicists (YouTube has an awful lot of them) have learned to be very careful about using short video clips as "fair use" discussion material, as YouTube will reflexively remove such videos instantly (and slap the uploader with a copyright "strike") upon receiving a DMCA claim, putting the burden on the uploader to embark on the long, slow process of getting the video restored by arguing fair use.

  17. orbitalinsertion (profile), 24 Mar 2017 @ 12:30pm

    This is like IBM complaining in the 40s that some Klan parade notice was placed too close to one of their ads.

  18. OGquaker, 24 Mar 2017 @ 3:37pm

    Re: McRib

    The decades-long assault against Americans who shun beef and pork products (Oprah Winfrey?) continues; yesterday Carl's Jr. refused to sell us their 'Meatless Berger', which we have bought from them for years.
    Pushing cloven-hoofed animal meat in your face has been as hate-filled as the usual profit over prophet.
    As a religious Green, I find the meat industry in this country disrespectful and dangerous to our collective future.

    'It is difficult to get a man to understand something, when his salary depends on his not understanding it.' -Upton Sinclair

  19. Anonymous Coward, 24 Mar 2017 @ 6:49pm

    Google makes billions in profits. They can most certainly afford to hire a room full of people to review videos, at least on a superficial level.

    If they cannot, perhaps their business model is broken.

  20. That One Guy (profile), 24 Mar 2017 @ 8:11pm

    Everything is easy and cheap when you don't have to do it

    Along those lines, the recording and movie industries also make billions in profits, which means they too can certainly afford to 'hire a room full of people' to review DMCA claims before sending them out to make sure that they don't flag something erroneously.

    If they can't manage that, then perhaps their business models are broken.

    Tell you what: if you think that it's really that easy to review (superficially or otherwise) at least 400 hours' worth of video uploaded per minute, why don't you give Google a call? I'm sure they'd love to offer you the job.

  21. Anonymous Coward, 24 Mar 2017 @ 10:28pm

    Re: Everything is easy and cheap when you don't have to do it

    It's actually pretty simple. First thing is to scan the title of the posting. They could ding a bunch right away.

    Second, run the video past the Google voice speech-capture system. Scan the results for keywords and kick out any video that fails for manual review.

    Third, check the first 30 seconds and the description text. Timeline-jump to a few places to spot-check.

    Already they will have gone a long way to spotting troubles before they get posted.
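
    A minimal sketch of that triage idea in Python (purely illustrative: the keyword list, the Video fields, and the pretend speech-capture transcript are all hypothetical stand-ins, not anything YouTube actually runs):

    from dataclasses import dataclass

    # Hypothetical keyword list -- not anyone's real term set.
    FLAG_KEYWORDS = {"beheading", "martyrdom"}

    @dataclass
    class Video:
        title: str
        description: str
        transcript_first_30s: str  # pretend a speech-capture pass produced this

    def needs_manual_review(video: Video) -> bool:
        # Scan the title, the speech-captured opening audio, and the
        # description text; kick the video out for human review on any hit.
        text = " ".join(
            [video.title, video.transcript_first_30s, video.description]
        ).lower()
        return any(keyword in text for keyword in FLAG_KEYWORDS)

    print(needs_manual_review(Video("Cute cats", "just cats", "meow")))  # False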

  22. That One Guy (profile), 24 Mar 2017 @ 11:03pm

    Re: Re: Everything is easy and cheap when you don't have to do it

    Once again: 400+ hours' worth of video per minute.

    What may be 'simple' for a few videos is anything but when you scale it up to that level, so you're not talking about 'a room full of people' but a massive system requiring various levels of review of enormous amounts of content.

    There's also the problem of false positives, something that already plagues ContentID, a black-or-white 'Does X match Y?' system. Make the question a subjective one -- 'Does X count as extremist content?' -- and things would be even more insane.

  23. Anonymous Coward, 25 Mar 2017 @ 1:00am

    Re: Re: Re: Everything is easy and cheap when you don't have to do it

    Scale is a business model problem. It should never be an excuse. Certainly not from a fat-cat company like Google.

  24. Anonymous Coward, 25 Mar 2017 @ 1:48am

    Re:

    Your concept of scale is badly broken. YouTube has hundreds of hours a minute uploaded to it, so your room full would in reality be a large building full, staffed 24/7.

    YouTube's business model is not broken; it's just that they are not gatekeepers, but rather facilitators that allow anybody to publish without seeking any form of permission or review.

  25. Anonymous Coward, 25 Mar 2017 @ 6:03am

    Re: Re: Everything is easy and cheap when you don't have to do it

    That keyword-based filtering model would be an incredibly bad idea. I thought we had all learned about the "Scunthorpe problem" by now. Blind word-list filtering results in stupidity like requiring people to enter their real location and then banning them for giving it as "Fort Gay". It was a bad idea then and it is a bad idea now.

    Not to mention that meanings aren't always clear just from the words used. Under the type of filtering logic you want, we'd see British content flagged as hardcore gay child pornography because someone asked if they could "bum a couple of fags for the boys".
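
    To make the failure mode concrete, blind substring matching in a few lines of Python (block list illustrative only):

    BLOCKLIST = ["cunt", "gay"]  # illustrative only

    def blind_filter(text: str) -> bool:
        # Flags any text containing a listed substring, regardless of context.
        return any(bad in text.lower() for bad in BLOCKLIST)

    print(blind_filter("Scunthorpe United fixtures"))  # True -- false positive
    print(blind_filter("I live in Fort Gay, WV"))      # True -- false positive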

  26. That One Guy (profile), 25 Mar 2017 @ 11:12am

    Re: Re: Re: Re: Everything is easy and cheap when you don't have to do it

    Only if what's being scaled up is part of the business model being used, and not something they're being slapped with after the fact.

    Were Google/YT in the 'pre-screening video content' business then yes, they would be to blame if they set things up such that they couldn't handle the increased load of what they had to go through, but since they're not, it's not a 'business model problem' at all. YouTube hosts videos; that's its business model. Saying they should have to pre-screen everything first isn't a matter of scaling up something they've always had to do; it's adding something new on top of what they already do, something that the scale of the problem would make insanely expensive and that would bring the service to a crawling halt if they were required to do it, contrary to your claims otherwise.

    On a semi-related tangent, your mention of how Google is big so it's not a problem has me again wondering: do you hold others to that same standard? Do you think that the movie and recording industries should likewise hire 'a room full of people' to personally vet every DMCA claim they send out to avoid false positives? They make billions too, after all; surely it would be just as easy if not easier for them to pre-screen DMCA claims as it would be for Google/YT to pre-screen videos. So does that standard of yours apply to everyone, or just Google?

  27. Anonymous Coward, 25 Mar 2017 @ 11:52pm

    Re: Re: Re: Re: Re: Everything is easy and cheap when you don't have to do it

    Youtube has a business model of posting user videos. They make piles of money doing so.

    The entertainment industry doesn't make money from DMCA notices.

    The difference here is huge.

  28. Anonymous Coward, 25 Mar 2017 @ 11:59pm

    Re: Re: Re: Re: Re: Everything is easy and cheap when you don't have to do it

    For what it's worth, I am not suggesting that Google have people watch every second of every video. It's about coming up with ways to pick out uploads that are potential problems and review them.

    Google has incredibly powerful tools to index content online and to extract semantic information. You don't think they could apply this to videos and their comments to determine which videos need review?

    I also don't think they should delete videos; it would be good enough to flag them for adults and remove ads from them. Deleting a video should be saved for the most egregious situations.

    That would go a very long way towards resolving the issues at hand.

  29. Anonymous Coward, 26 Mar 2017 @ 3:22am

    Re: Re: Re: Re: Re: Re: Everything is easy and cheap when you don't have to do it

    Indeed the difference is huge: the MPAA and RIAA members combined publish fewer hours of content in a year than is posted to YouTube in a few minutes. Add a requirement for pre-screening, which those organizations want Google to take on, and they eliminate most of the content that is competing with theirs for customer attention, because Google could not keep up with what is being posted.

    As for the practicality of pre-screening, that would require at least 24,000 people actively screening content at any given moment, so keeping that up 24/7 would require at least 100,000 people (allowing for holidays, sickness, meal breaks, etc., along with the necessary managers and HR personnel). Then you run into the problem that those people do not know every existing work, who owns the copyright, what licenses have been granted, or whether the poster works for the company they claim to and has the authority to post the work.

    The only people who can reliably identify a work as belonging to them or their organization are the producers (not necessarily the creators) of the work. And only they know, or have access to, the information needed to determine whether or not it has been licensed to the poster.

    There is no magic crystal ball that will identify infringing works. Indeed, because of the lack of a database of all works, there is no way of identifying the copyright holders of any particular work, or of verifying that the claimant is actually the copyright holder or licensed to use the work.
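
    For anyone who wants to check the arithmetic, a back-of-the-envelope sketch in Python (the 400 hours/minute upload rate is the commonly cited figure from around this time; the 40-hour work week is an assumption):

    UPLOAD_HOURS_PER_MINUTE = 400
    # Hours of video arriving per hour equals screeners needed at any
    # instant, assuming real-time viewing.
    concurrent_screeners = UPLOAD_HOURS_PER_MINUTE * 60   # 24,000
    # Covering all 168 hours of the week with 40-hour work weeks:
    headcount = concurrent_screeners * (24 * 7) / 40      # 100,800
    print(concurrent_screeners, round(headcount))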

  30. That One Guy (profile), 26 Mar 2017 @ 10:42am

    Re: Re: Re: Re: Re: Re: Everything is easy and cheap when you don't have to do it

    Hosting user-submitted videos, not posting. The distinction is significant, as it means that, thanks to 230/common-sense protections in the US at least, they aren't held responsible for what their users post and as such have no requirement to pre-screen for CYOA reasons. With no requirement to pre-screen, more videos isn't a scaling problem getting out of control, because it was never a problem in the first place.

    (Imagine, if you will, a donation-based library, where all the books are donated by others. Their 'business model' is to make sure that they have enough shelves to hold all the books and that people can find what they want. So long as they can manage those two, no matter how many books are donated or how fast, they're doing fine and their 'business model' is fine. Now imagine someone comes in and demands that they check every book for 'offensive' content before people can check it out. Now how much is donated is a problem, but that problem has nothing to do with their 'business model' and everything to do with the new requirement that's been dumped in their laps.)

    The entertainment industry does make money from the content that they're filing DMCA claims for (assuming a valid notice, anyway), and unlike the user-submitted content that YT makes money from hosting, the DMCA contains an (effectively theoretical at this point) requirement to swear 'under penalty of perjury', which would require manual review.

    As I noted above, a DMCA claim is also easier to check, as the only subjective part involved is a consideration of fair use, which has a quick and easy 'checklist' attached, quite unlike the subjective 'Is this offensive/extremist?', which, barring extreme cases (and sometimes not even then), can be much harder to decide on. Ask enough people and anything can be seen as 'offensive', so the question becomes 'How many people can we safely offend according to the requirements?'

    There's also the difference in consequence: miss a 'guilty' copyright infringement case and the harm isn't likely to be very bad, whereas if a site is liable for user-submitted content and lets an 'offensive/extremist' post through, it's likely to be facing a serious penalty, which means it's much more likely to block even fringe stuff 'just in case', leading to large amounts of content and/or speech being blocked.

    In both cases a faulty claim means legitimate/legal content and/or speech being removed, and while services like YT don't have a requirement to screen content, those sending out copyright claims (theoretically) do. So why is it you think that only the former group should be required to pre-screen?

  31. That One Guy (profile), 26 Mar 2017 @ 11:13am

    Re: Re: Re: Re: Re: Re: Everything is easy and cheap when you don't have to do it

    Even having to pre-screen only 'problematic' videos would be a huge problem, due to how many they'd have to deal with and the massive numbers of false positives they'd be wading through.

    ContentID, something that's based upon a 'Does it contain content X or doesn't it?' test, already has problems aplenty, flagging things for reasons ranging from absurd to downright broken. Now imagine a similar system but for 'offensive' content, and the nightmare that would be.

    If Google wants to manually review videos flagged by users as 'offensive/extremist', which I believe they already do, then I've no problem with that. What I have a problem with is requiring them to do so ahead of time, as it would be insanely expensive, cause significant collateral damage, and make the service vastly less useful as a hosting platform (all of which would be bad enough for a huge company like them, but would be even worse for smaller services trying to break into the market who wouldn't have the same resources that YT/Google does).

  32. My_Name_Here, 26 Mar 2017 @ 1:47pm

    Re: Re: Re: Re: Re: Re: Re: Everything is easy and cheap when you don't have to do it

    "As for the practicality of per-screening, that would require at least 24,00 actively screening content all the time, so to keep that up 24/7 would require at least 100,000 people; (allowing for holidays, sickness, meal breaks etc. along with the necessary managers and HR personnel). Then you run into the problem that those people do not know every existing work, who owns the copyright, and what licenses have been granted, or whether the poster works for the company that they claim, and have the authority to post the work."

    You are assuming that every minute of every video would have to be pre-screened. That is stupid. Nobody needs to watch a whole cat video to know it's a cat video.

    Nobody would have to watch the videos in real time either. Even with your example, run the videos all at double speed and the needed people drops in half. Only watch 10% of the video time at double speed, and suddenly the need is down to 5% of the people you suggested. So 5% of 100,000 people would be...5000 workers. Suddenly it's falling into the realm of possible. Apply a little automation to filter out half of the videos that aren't harmful at all, and boom, you need 2500 people. Getting easier, isn't it?

    It's easy to blow it off as impossible. It's not. It's pretty simple stuff. The anonymous coward has it closer to right that anyone would like.

  33. Anonymous Coward, 26 Mar 2017 @ 7:28pm

    Re: Re: Re: Re: Re: Re: Re: Everything is easy and cheap when you don't have to do it

    As soon as YouTube agreed to block copyrighted content and enforce DMCA notices, they tipped their hand. They are not a mere host but rather a publishing company. Hosting alone would not include a YouTube-mandated web page or related-video links and such. Your uploaded video likely does not have ads embedded; those are added by the publishing company.

  34. That One Guy (profile), 27 Mar 2017 @ 9:51am

    Re: Re: Re: Re: Re: Re: Re: Re: Everything is easy and cheap when you don't have to do it

    Uh, no. That was the entire point of codifying the idea of safe harbors into law: that sites can take steps to moderate content without suddenly becoming liable for content posted by users. Were what you are saying true, then YouTube would have been better off not implementing ContentID and ignoring DMCA claims, and I rather doubt that's what you meant to imply.

    'Voluntarily' implementing a (lousy) filter for one type of content and complying with the law does not magically change their status such that they are responsible for what's posted by others using their service, whether that content be copyright related or otherwise.

  35. btr1701 (profile), 27 Mar 2017 @ 3:19pm

    Re: Extremist

    The other problem, not talked about in any of these mainstream articles on the topic, is that "extremist" is being defined as everything from jihadi beheading videos to conservative political commentary. Basically, anything that challenges politically correct orthodoxy is now labeled "extremist" by YouTube and demonetized.

  36. Wendy Cockcroft, 28 Mar 2017 @ 5:45am

    Re: Re: Extremist

    Actually, I'd say that political correctness is extremist. Safe space, anybody?
