Content Moderation Case Study: Automated Copyright Takedown Bot Goes Haywire (2018)
from the take-it-all-down dept
Summary: For years, Google and YouTube have run a trusted flagger program through which certain entities that have shown they “are particularly effective at notifying YouTube” of content violations are given more powerful tools to do so.
This is used often in the copyright context, and companies with a good history may be given access to things like bulk flagging tools and priority review of flagged content. One such trusted flagger for copyright was a company called Topple Track, which offered an automated service for musicians, searching the internet for infringing works and dashing off automated DMCA notices.
In May of 2015, digital music distribution company Symphonic purchased Topple Track, but appeared to keep the service running under the Topple Track brand.
In the summer of 2018, some people noticed that Topple Track’s automated DMCA notices appeared to have gone a bit haywire, sending notices for all kinds of perfectly legitimate content. Among those targeted were the Electronic Frontier Foundation (EFF), the American Bar Association, NYU’s Law Review, the Crunchbase article about the company MP3Tunes, and many, many more -- including many artists’ own web stores. EFF’s summary of the wild takedowns gives a sample:
Among others, these notices improperly target:
- EFF’s case page about EMI v MP3Tunes
- The authorized music store on the official homepage of both Beyonce and Bruno Mars
- A fundraising page on the Minneapolis Foundation’s website
- The Graceland page at Paul Simon’s official website
- A blog post by Professor Eric Goldman about the EMI v MP3Tunes case
- A Citizen Lab report about UC Browser
- A New Yorker article about nationalism and patriotic songs
Other targets include an article about the DMCA in the NYU Law Review, an NBC News article about anti-virus scams, a Variety article about the Drake-Pusha T feud, and the lyrics to ‘Happier’ at Ed Sheeran’s official website. It goes on and on.
EFF published an article about this and noted that it seemed to be yet another example of an automated DMCA reporting bot “running amok.” The group also questioned why such a company was in Google’s “trusted flagger” program.
Decisions to be made by Google / YouTube:
- What qualifications are there for a partner to be considered a “trusted flagger”?
- How often are trusted flaggers reviewed to make sure they still belong in the program?
- What does it take to get a trusted flagger removed from the program?
- With more emphasis on the speed of removals, it is often tempting for regulators to promote “trusted flagging” or “priority” accounts that are able to get content removed at a much quicker pace. What are the benefits and risks of such programs?
- Automated flagging and now AI/machine learning flagging are increasingly a part of the content moderation landscape. How are they calibrated? How frequently are they reviewed?
- What should the response be when an automated bot is flagging many accounts mistakenly?
A few weeks after the article, YouTube told EFF that Topple Track had been removed from its Trusted Flagger program “due to a pattern of problematic notices.” Some time later, Topple Track appears to have disappeared as a standalone brand, with its service and technology apparently subsumed into Symphonic Distribution’s catalog of services.
Originally published on the Trust & Safety Foundation website.
Filed Under: automated takedowns, content moderation, copyright, dmca, trusted flagger
Companies: google, symphonic, topple track, youtube