from the that's-not-how-any-of-this-works dept
Andy Parker has experienced something that no one should ever have to go through: having a child murdered. Even worse, his daughter, Alison, was murdered on live TV, during a news broadcast, when an ex-colleague shot her and the station's cameraman dead. It got a lot of news coverage, and you probably remember the story. Maybe you even watched the video (I avoided it on purpose, as I have no desire to see such a gruesome sight). Almost none of us can even fathom what that experience must be like, and I can completely understand how it has turned Parker into something of an activist. We wrote about him a year ago, when he appeared in a very weird and misleading 60 Minutes story attacking Section 230.
While Parker considers himself an "anti-big tech, anti-Section 230" advocate, we noted that his story actually shows the benefits of Section 230, rather than the problems with it. Parker is (completely understandably!) upset that the video of his daughter's murder is available online. And he wants it gone. As we detailed in our response to the 60 Minutes story, Parker had succeeded in convincing various platforms to quickly remove the video whenever it's uploaded. Something they can do, in part, because Section 230's protections allow them to moderate freely and proactively, without fear of crippling lawsuits and liability.
The 60 Minutes episode was truly bizarre: it explains Parker's tragic situation, notes that YouTube went above and beyond to stop the video from being shared on its platform, and then cuts to Parker saying he "expected them to do the right thing" and calling Google "the personification of evil"... for... doing exactly what he asked?
Parker is now running for Congress as well, and has been spouting a bunch of bizarre things about the internet and content moderation on Twitter. I'd link to some of them, but he blocked me (a feature, again, that is aided by Section 230's existence). But now the Washington Post has a strange article about how Parker... created an NFT of the video as part of his campaign to remove it from the internet.
Now, Andy Parker has transformed the clip of the killings into an NFT, or non-fungible token, in a complex and potentially futile bid to claim ownership over the videos — a tactic to use copyright to force Big Tech’s hand.
So... none of this makes any sense. First of all, Parker doesn't own the copyright, as the article notes (though only many paragraphs later, even though it seems like kind of a key point!):
Parker does not own the copyright to the footage of his daughter’s murder that aired on CBS affiliate WDBJ in 2015.
But it says he's doing this to claim "ownership" of the video, because what appear to be very, very bad lawyers have advised him that by creating an NFT he can "claim ownership" of it, and then use the DMCA's notice-and-takedown provisions to demand removals. Everything about this is wrong.
First, while using copyright to take down things you don't want online is quite common, it's not (at all) what copyright is meant for. And, as much as Parker does not want the video to be available, there is a pretty strong argument that many uses of that video are covered by fair use.
But, again, he doesn't hold the copyright. So, creating an NFT of the video does not magically give him a copyright, nor does it give him any power under the DMCA to demand takedowns. That requires the actual copyright. Which Parker does not have. Even more ridiculously, the TV station that does hold the copyright has apparently offered to help Parker use the copyright to issue DMCA takedowns:
In a statement, Latek said that the company has “repeatedly offered to provide Mr. Parker with the additional copyright license” to call on social media companies to remove the WDBJ footage “if it is being used inappropriately.”
This includes the right to act as their agent with the HONR network, a nonprofit created by Pozner that helps people targeted by online harassment and hate. “By doing so, we enabled the HONR Network to flag the video for removal from platforms like YouTube and Facebook,” Latek said.
So what does the NFT do? Absolutely nothing. Indeed, the NFT is nothing more than basically a signed note, saying "this is a video." And part of the ethos of the NFT space is that people are frequently encouraged to "right click and save" the content, and to share it as well -- because the content and the NFT are separate.
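To make concrete why the NFT accomplishes nothing here, below is a minimal conceptual sketch of what an ERC-721-style token record typically contains. The names and values are illustrative assumptions on my part, not anything from the article or from any real chain: just a token ID, the minter's address, and a pointer to metadata that in turn points at the media.

```python
# A minimal, conceptual sketch (not real blockchain code) of what an
# ERC-721-style NFT record typically contains. All names and values here
# are illustrative assumptions, not details from the article.
from dataclasses import dataclass

@dataclass
class NftRecord:
    token_id: int
    minter_address: str   # whoever minted it -- not necessarily the copyright holder
    metadata_uri: str     # points at a JSON blob, which in turn points at the media file

nft = NftRecord(
    token_id=1,
    minter_address="0xExampleMinter",                     # hypothetical
    metadata_uri="ipfs://example-metadata-pointer.json",  # hypothetical
)

# The video itself is not "in" the token. Anyone can still fetch, copy,
# or re-upload the file the metadata points at ("right click and save"),
# and minting the token creates no copyright interest in that file.
print(nft)
```

Which is the point: the token is a pointer plus a signature, and the underlying file, and the copyright in it, are entirely untouched by minting it.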
Hell, there's an argument (though I'd say it's a weak one -- others disagree) that by creating an NFT of a work he has no copyright over, Parker has actually opened himself up to a copyright infringement claim. Indeed, the TV station is quoted in the article noting that, while it has provided licenses to Parker to help him get the video removed, "those usage licenses do not and never have allowed them to turn our content into NFTs."
I understand that Parker wants the video taken down -- even though there may be non-nefarious, legitimate reasons for the video to remain available in some format. But creating an NFT doesn't give him any copyright interest, or any way to use the DMCA to remove the videos, and whoever told Parker otherwise should be disbarred. They're taking advantage of him and his grief, and giving him very, very bad legal advice.
Meanwhile, all the way at the end of the article, it is noted -- once again -- that the big social media platforms are extremely proactive in trying to remove the video of Alison's murder:
“We remain committed to removing violent footage filmed by Alison Parker’s murderer, and we rigorously enforce our policies using a combination of machine learning technology and human review,” YouTube spokesperson Jack Malon said in a statement.
[...]
Facebook bans any videos that depict the shooting from any angle, with no exceptions, according to Jen Ridings, a spokesperson for parent company Meta.
“We’ve removed thousands of videos depicting this tragedy since 2015, and continue to proactively remove more,” Ridings said in a statement, adding that they “encourage people to continue reporting this content.”
The reporter then notes that he was still able to find the video on Facebook (though all the ones he found were quickly removed).
Which actually highlights the nature of the problem: it is impossible to find and block the video with perfect accuracy. Facebook and YouTube employ some of the most sophisticated tools out there for finding this stuff, but the sheer volume of uploads, combined with the tricks and modifications that uploaders try, means they're never going to be perfect. So even if Parker got the copyright -- which he doesn't have -- it still wouldn't help. Because these sites are already trying to remove the videos.
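For a sense of why re-uploads keep slipping through, here's a toy sketch. This is my own illustration of matching in general, not a description of either company's actual pipeline (YouTube only says it uses "machine learning technology and human review"): an exact fingerprint breaks the moment an uploader tweaks the file even slightly, which is why platforms have to rely on fuzzier, error-prone matching plus human review, and still miss copies.

```python
# A toy illustration (my assumption, not how YouTube or Facebook actually
# work) of why catching every re-upload is hard: an exact hash match fails
# as soon as a single byte changes -- e.g. after re-encoding, cropping, or
# watermarking -- so platforms need fuzzier matching, which makes mistakes.
import hashlib

original = b"...video bytes..."
tweaked = b"...video bytes...!"  # a trivially modified copy of the same clip

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(tweaked).hexdigest())  # completely different digest
```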
Everything about this story is unfortunate. The original tragedy, of course, is heartbreakingly horrific. But Parker's misguided crusade isn't helping, and the whole NFT idea is so backwards that it might leave him facing a copyright claim, rather than wielding one. I feel sorry for Parker, not only because of the tragic loss of his daughter, but because it appears that some very cynical lawyers are taking advantage of his grief to try to drive some sort of policy outcome out of it. He deserves better than to be preyed upon like that.
Filed Under: alison parker, andy parker, content moderation, copyright, dmca, free speech, nfts, section 230, takedowns
Companies: facebook, youtube