Techdirt's think tank, the Copia Institute, is working with the Trust & Safety Professional Association and its sister organization, the Trust & Safety Foundation, to produce an ongoing series of case studies about content moderation decisions. These case studies are presented in a neutral fashion, not aiming to criticize or applaud any particular decision, but to highlight the many different challenges that content moderators face and the tradeoffs that result. Find more case studies here on Techdirt and on the TSF website.

Content Moderation Case Study: Dealing With Podcast Copycats (2020)

from the podcast-yourself,-but-not-someone-else dept

Summary: Since the term was first coined in 2004, podcasts have obviously taken off, with reports saying that around 55% of Americans have listened to a podcast as of early 2021. Estimates on a total number of podcasts vary, but some sites estimate the total at 1.75 million podcasts, with about 850,000 of them described as “active.” Still, for many years, actually hosting a podcast remained somewhat complicated.

A few services have been created to try to make it easier, and one of the biggest names was Anchor.fm, which tried to make it extremely easy to create and host a podcast -- including the ability to add in an advertising-based monetization component. In early 2019, as part of its aggressive expansion into podcasts, Spotify purchased Anchor for $150 million.

However, in the summer of 2020, podcasters began calling out Anchor for allowing others to re-upload copies of someone else's podcasts, claim them as their own, and monetize them. Erika Nardini of Barstool Sports raised the issue on Twitter after seeing a variety of Barstool podcasts show up on Anchor despite not having been uploaded there by Barstool.

The issue got a lot more attention a month later when podcaster Aaron Mahnke wrote a thread detailing how a variety of popular podcasts were being reuploaded to Anchor and monetized by whoever was uploading them.

After that thread started to go viral, Anchor promised to crack down on copied/re-uploaded podcasts. The company claimed that it had an existing system in place to detect duplicates, but that those doing the uploading had found a workaround: manually uploading the podcasts rather than automating the effort:

The copycats, Mignano says, found a workaround in Anchor’s detection system. “This is definitely a new type of attack for Anchor,” he says. The people who uploaded these copycat shows downloaded the audio from another source, manually reuploaded it to Anchor, and filled in the metadata, essentially making it appear to be a new podcast.

This manual process, he says, makes uploading copycat shows more time-intensive and therefore less appealing and only achievable on a small scale. He says the company found “a few dozen” examples out of the more than 650,000 shows uploaded to Anchor this year. (In contrast, people can also upload shows more automatically by pasting an RSS feed link into Anchor, but the company would seemingly detect if someone tried to upload a popular show’s feed and pass it off as their own.)

“The good news is that so many creators are using Anchor, and that growth has been far more than I think we projected, which is great, but I think the downside in this case is that, with any rapidly growing platform, that has brought on some growing pains and we need to do a better job of anticipating things like this,” he says. “We’re working right now to ensure that our copycat detection and creator outreach continues to improve to keep pace.”
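Anchor has not published details of its detection system, but the tradeoff Mignano describes (feed-based copies are easy to catch; manually re-uploaded audio with fresh metadata is not) can be illustrated with a toy sketch. Everything below is hypothetical: a real system would rely on perceptual audio fingerprints that survive re-encoding, not byte-exact hashes.

```python
import hashlib

class DuplicateDetector:
    """Toy exact-duplicate check: index known episodes by a hash of
    their raw audio bytes, then flag uploads whose bytes match.
    (Hypothetical sketch only; production systems use robust audio
    fingerprints, not byte-exact hashes.)"""

    def __init__(self):
        self._index = {}  # hash of audio bytes -> known episode id

    def register(self, episode_id: str, audio: bytes) -> None:
        """Add a known episode's audio to the index."""
        self._index[hashlib.sha256(audio).hexdigest()] = episode_id

    def check_upload(self, audio: bytes):
        """Return the matching known episode id, or None if unseen."""
        return self._index.get(hashlib.sha256(audio).hexdigest())

detector = DuplicateDetector()
detector.register("barstool-ep-101", b"original audio bytes")

# A byte-identical re-upload is caught immediately...
print(detector.check_upload(b"original audio bytes"))   # barstool-ep-101
# ...but a manually downloaded, re-encoded copy produces different
# bytes and slips through, even though it sounds the same.
print(detector.check_upload(b"re-encoded audio bytes"))  # None
```

Exact matching of this kind lines up with both sides of the story: a byte-identical re-upload (like a reused trailer) can be flagged within hours, while a copy that has been downloaded from another source and re-encoded looks brand new to the system.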

Decisions to be made by Anchor/Spotify:

  • How do you detect which podcasts are copies from elsewhere, especially when the original versions may not have originated on Anchor?

  • People can always work around attempts to block copycats, so what kind of review process can be put in place to prevent abuse?

  • Will being too aggressive at preventing abuse potentially lead to taking down too much? For example, what if one podcast uses clips from another for the purpose of commentary?

  • Should there be extra validation or other hurdles to turn on monetization?

Questions and policy implications to consider:
  • What are the trade-offs in making it especially easy to host, distribute and monetize podcasts? Is it worth making it so easy when that process will likely be abused?

  • Is there a middle ground that allows for the easy creation, distribution and monetization of audio content that won’t be as subject to abuse?

  • Is there a risk that cracking down on copycat content itself could go too far and silence commentary and criticism?

While Anchor/Spotify continued to update their practices, another story came out in the midst of all of this (with less attention) showing how being too aggressive in stopping copycats can also backfire. That story actually appeared a month before Anchor said it would beef up its process for stopping copycats, and involved a podcaster who wanted to test Anchor to see if he should recommend it on his own podcast. That podcaster, Jonathan Mendonsa, posted a test episode to Anchor to see how it worked, only to have his entire account shut down with no explanation or notice.

Mendonsa was surprised to find out that Anchor was comparing audio he uploaded to Anchor to audio uploaded elsewhere, and felt that the decision to completely shut down his account immediately was perhaps a step too far. From the story at PodNews:

Jonathan told Podnews: "I was testing Anchor to see if I would recommend it to my podcast course students. This 'duplicate content' caused them to not only take down the episode but to actually shut down my account entirely with no recourse or notice. What does that mean when a podcaster wants to republish an old episode? Or use a clip from another episode?"

For this show to have been pulled within two hours of posting must mean that Anchor is automatically comparing audio uploaded to their platform with all audio already available on Spotify - since this audio was identical to an episode already there.

“Anchor needs to clarify the tech they are using, and what triggers this,” Jonathan told us. “I never considered that any podcast platform would be looking for duplicate content, so I just used the same trailer. I wouldn’t be mad if it got flagged or the episode got unpublished - but to delete the entire account?”

It’s interesting to note the challenges on both sides of this issue, with some being upset that Anchor makes it too easy to distribute duplicate content, and others being upset at how quickly and aggressively it responds to duplicate content.

Originally posted to the Trust & Safety Foundation website.


Filed Under: content moderation, copycats, duplicates, podcasts
Companies: anchor.fm, spotify


Reader Comments



  • christenson, 9 Apr 2021 @ 4:41pm

    Proof of Masnick's Impossibility Theorem

    Nuking the guy that was duplicating his own content as a test is an excellent demonstration of why moderation at scale is impossible -- the correctness of the decision to suppress or not suppress depended entirely on things outside the content itself.

    Context is everything!


    • Anonymous Coward, 10 Apr 2021 @ 5:48am

      Re: Proof of Masnick's Impossibility Theorem

      The correctness of the decision depended on the copyright holder. Who wasn't even informed the system existed.


      • christenson, 12 Apr 2021 @ 9:58pm

        Re: Re: Proof of Masnick's Impossibility Theorem

Just curious... how are you planning on figuring out
a) Who is the copyright holder? (Even this post is copyrighted, according to the opt-out-only copyright policy currently in the US) and,
b) That this copyright holder is the same person on a completely different website?

There's a reason the cops set up in-person meetings when they bust people for pedophilia on-line.

Oh, and c) what about fair use? Uncopyrightable material, like NASA broadcasts?


  • This comment has been flagged by the community.
    Mort Adella says this is BALONEY!, 9 Apr 2021 @ 7:35pm

    Proof of Focus on Edge Cases instead of the 99%.

    The key failure here is obviously not providing an appeal system at all!

    Sites esp Google / Facebook / Twitter want to maximize shareholder value, enjoy controlling "users", and do so little serving of The Public as optimizes both those real goals.


    • This comment has been flagged by the community.
      Mort Adella says this is BALONEY!, 9 Apr 2021 @ 7:35pm

      Re: Proof of Focus on Edge Cases instead of the 99%.

      Yet Maz doesn't list an Appeal System as either Decision or Question! I think because:

      A) simply doesn't occur to his money-mad brain that the most common cases of bad results are easily fixed by having a reasonable person able to decide -- by common sense and common law that almost any bartender does without having to write down rules!

      The min wage / furriners sites hire at present must try to figure which written rule to apply in a culture they don't know intuitively, and most likely just go with least trouble to the system which is to support the AI flag. Users are MERE users, and when available by the millions, no one is valuable enough to merit another second of considering. -- Yes, know Maz mentions this, but that's pro forma, so he can then ignore it! He ALWAYS asserts that sites have "1A" right to kick anyone off arbitrarily, no appeal, just FUCK YOU GET LOST. Maz can't ever reach the basic question of whether "social media" is workable, because MONEY and the next two items are his major motives:

      B) Maz is openly of late advocating sites be social control systems and he ACTIVELY doesn't wish to please users who don't fit his leftist template

      C) Maz assumes and hopes AI will soon take over all surveillance and decisions, because he's a surveillance enthusiast first, and it's also cheaper.


      • Rocky, 9 Apr 2021 @ 8:04pm

        Troll descends into pile of manure

It's always fun when you display your stupidity for everyone to see. Let's see if you actually manage to realize the mistake you based your scatological post on.


    • Anonymous Coward, 11 Apr 2021 @ 5:40pm

      Re:

      The key failure here is obviously not providing an appeal system at all!

      That's what copyright maximalists like you have been demanding since forever, you numpty. You play stupid games, you get all the stupid prizes.


