Bizarre Magistrate Judge Ruling Says That If Facebook Deletes An Account, It No Longer Needs To Keep Details Private
from the that-doesn't-make-any-sense dept
There have been a bunch of slightly wacky court rulings of late, and this recent one from magistrate judge Zia Faruqui is definitely up there on the list of rulings that make you scratch your head. The case involves the Republic of Gambia seeking information on Facebook accounts that were accused of contributing to the ethnic genocide of the Rohingya in Myanmar. This situation was -- quite obviously -- horrible, and it tends to be the go-to story for anyone who wants to show that Facebook is evil (though I'm often confused by how people seem more focused on blaming Facebook for the situation than the Myanmar government, which carried out the genocide...). Either way, the Republic of Gambia is seeking information from Facebook regarding the accounts that played a role in the genocide, as part of its case at the International Court of Justice.
Facebook, which (way too late in the process) did shut down a bunch of accounts in Myanmar, resisted demands from Gambia to hand over information on those accounts, noting, correctly, that the Stored Communications Act likely forbids it from handing over such private information. The SCA is actually pretty important in protecting the privacy of email and messages, and is one of the rare US laws on the books that is (for the most part) privacy protecting. That's not to say it doesn't have its own issues, but the SCA has been useful in protecting privacy in the past.
The ruling here more or less upends interpretations of the SCA by saying that once an account is deleted, it's no longer covered by the SCA. That's... worrisome. The full ruling is worth a read, and you'll know you're in for something of a journey when it starts out:
I come to praise Facebook, not to bury it.
Not quite what you expect from a judicial order. The order lays out the unfortunately gory details of the genocide in Myanmar, as well as Facebook's role in enabling the Myanmar government to push out propaganda and rally support for its ethnic cleansing. But the real question is how all of this impacts the SCA. As the judge notes, since the SCA was written in 1986, it certainly didn't predict today's modern social media or the questions related to content moderation, making this a new issue for the court to decide. But... still. The court decides that because an account is disabled... the communications are no longer "stored." Because [reasons].
The Problem Of Content Moderation
At the time of enactment, Congress viewed ECS and RCS providers as mail/package delivery services. See Cong. Rsch. Serv., R46662, Social Media: Misinformation and Content Moderation Issues for Congress (2021), https://crsreports.congress.gov/product/pdf/R/R46662. This view failed to consider content moderation; mail/package delivery services have neither the ability nor the responsibility to search the contents of every package. Yet after disinformation on social media has fed a series of catastrophic harms, major providers have responded by taking on the de facto responsibility of content moderation. See id. “The question of how social media platforms can respect the freedom of expression rights of users while also protecting [users] from harm is one of the most pressing challenges of our time.” ...
This Court is the first to consider the question of what happens after a provider acts on its content moderation responsibility. Is content deleted from the platform but retained by the provider in “backup storage?” It is not.
That obviously seems like a stretch to me. If the company still retains the information, then it is clearly in storage. Otherwise, you've just created a massive loophole: any platform can expose someone's private communications simply by disabling their account first.
The court's reasoning, though, gets at the heart of the language of the SCA and how it protects both "any temporary, intermediate storage of a wire or electronic communication incidental to the electronic transmission thereof" and "any storage of such communication by an electronic communication service for purposes of backup protection of such communication." It says the first bit can't apply because these communications had reached their "final destination" and were no longer temporary. And it can't be "backup" since the original content had been deleted, and therefore there couldn't be any "backup."
Congress’s conception of “‘backup’ necessarily presupposes the existence of another copy to which this [backup record] would serve as a substitute or support.” Id. Without an original, there is nothing to back up. Indeed “the lifespan of a backup is necessarily tied to that of the underlying message. Where the underlying message has expired . . . , any copy is no longer performing any backup function. An [ECS] that kept permanent copies of [deleted] messages could not fairly be described as ‘backing up’ those messages.”
But... I think that's just wrong. Facebook retaining this data (while blocking the users from accessing it themselves) is clearly a "backup." It's a backup in case there is a reason why, at some future date, the content needs to be restored. Under the judge's own interpretation, if you back up your hard drive and then the drive crashes, your backup is no longer a backup, because there's no original. But... that's completely nonsensical.
The judge relies on (not surprisingly) a case in which the DOJ twisted and stretched the limits of the SCA to get access to private communications:
Nearly all “backup storage” litigation relates to delivered, undeleted content. That case law informs and supports the Court’s decision here. “Although there is no binding circuit precedent, it appears that a clear majority of courts have held that emails opened by the intended recipient (but kept on a web-based server like Gmail) do not meet the [backup protection] definition of ‘electronic storage.’” Sartori v. Schrodt, 424 F. Supp. 3d 1121, 1132 (N.D. Fla. 2019) (collecting cases). The Department of Justice adopted this view, finding that backup protection “does not include post-transmission storage of communications.” U.S. Dep’t of Just., Searching and Seizing Computers and Obtaining Electronic Evidence in Criminal Investigations, 123 (2009), https://www.justice.gov/sites/default/files/criminal-ccips/legacy/2015/01/14/ssmanual2009.pdf. The Gambia argues for following the majority view’s limited definition of backup storage. See Sartori, 424 F. Supp. 3d at 1132; ECF No. 16 (Pet’r’s Resp. to Surreply) at 5–6. If undeleted content retained by the user is not in backup storage, it would defy logic for deleted content to which the user has no access to be in backup storage.
As for Facebook's argument (which makes sense to me) that the entire reason for retaining the accounts shows that they are a backup, the judge just doesn't buy it.
Facebook argues that because the provider-deleted content remains on Facebook servers in proximity to where active content on the platform is stored, both sets of content should be protected as backup storage. See Conf. Tr. at 76. However, the question is not where the records are stored but why they are stored. See Theofel, 359 F.3d at 1070. Facebook claims it kept the instant records as part of an autopsy of its role in the Rohingya genocide. See Conf. Tr. at 80–81. While admirable, that is storage for self-reflection, not for backup.
The judge also brushes aside the idea that there are serious privacy concerns with this result, mainly because the judge doesn't believe Facebook cares about privacy. That, alone, is kind of a weird way to rule on this issue.
Finally, Facebook advances a policy argument, opining that this Court’s holding will “have sweeping privacy implications—every time a service provider deactivates a user’s account for any reason, the contents of the user’s communications would become available for disclosure to anyone, including the U.S. government.”.... Facebook taking up the mantle of privacy rights is rich with irony. News sites have entire sections dedicated to Facebook’s sordid history of privacy scandals.
So... because Facebook doesn't have a great history regarding the protection of privacy... we can make it easier for Facebook to expose private communications? What? And even if it's true that Facebook has made problematic decisions in the past regarding privacy, that's wholly separate from the question of whether or not it has a legal obligation to protect the privacy of messages now.
Furthermore, the judge insists that even if there are privacy concerns, they are "minimal":
The privacy implications here are minimal given the narrow category of requested content. Content urging the murder of the Rohingya still permeates social media. See Stecklow, supra (documenting “more than 1,000 examples . . . of posts, comments, images and videos attacking the Rohingya or other Myanmar Muslims that were on Facebook” even after Facebook apologized for its services being “used to amplify hate or exacerbate harm against the Rohingya”). Such content, however vile, is protected by the SCA while it remains on the platform. The parade of horribles is limited to a single float: the loss of privacy protections for de-platformed content. And even that could be mitigated by users joining sites that do not de-platform content.
Yes. In this case. But this could set a precedent for accessing a ton of other private communications as well, and that's what's worrying. It's absolutely bizarre and distressing that the judge doesn't bother to think through the implications of this ruling beyond just this one case.
Prof. Orin Kerr, one of the foremost experts on ECPA and the SCA, notes that this is both an "astonishing interpretation" and "stunning."
Also, it's a stunning interpretation in its consequences. Under the op, the most fundamental rule of Internet privacy -- that your e-mails and messages are protected from disclosure -- is largely meaningless. A provider can just delete your account and hand out your messages.
— Orin Kerr (@OrinKerr) September 24, 2021
The entire ruling is concerning -- and feels like yet another situation where someone's general disdain for Facebook and its policies (a totally reasonable position to take!) colored the analysis of the law. And the end result is a lot more dangerous for everyone.
Filed Under: backup, deleted profiles, ecpa, gambia, myanmar, privacy, sca, stored communications act, zia faruqui
Companies: facebook
Reader Comments
Be nice if those appointed to judgeships had a minimal understanding of the tech world and the laws that impact it. Might also help if they had a basic understanding of English vocabulary in a tech context, e.g., the meaning of "stored."
Re: Appearances + friends in positions of authority
Agree. Also, if qualified at all beyond the level of occupying a chair and playing Let’s Pretend cuz I’m de judge sez da man wid da power to say… Just sayin’.
Re:
The thing is, a reasonable person wouldn't care what these definitions are and doesn't need to understand tech at all. They simply understand that their stuff is stored somewhere, and that whether they or the service provider suspends or deletes their account, anything remaining is still their private stuff and should be protected as such.
Isn't it true that Facebook never really deletes an account? If that's the case, then aren't the materials still backups, even under this opinion?
How do you know for sure? How do you really know?
This line caught my eye
For a philosophical discussion, what's the difference between the two -- a backup and its original?
Let's say you have two text or image files, one a backup of the other. How can you tell which one is the original? From the directory timestamp? From the location where each is stored?
If you move both files to another volume - hell, both to the same volume - and reset their directory timestamps, how can you tell which one is the backup and which one is the original?
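As a rough illustration, here is a minimal Python sketch (the filenames are hypothetical) of that point: once two files are byte-for-byte identical and the timestamps have been copied over, nothing about them tells you which one was the "original":

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Hypothetical files: "original.txt" and its byte-for-byte copy "backup.txt".
original = Path("original.txt")
backup = Path("backup.txt")

# Identical contents produce identical hashes -- nothing in the bytes
# distinguishes the "backup" from the "original".
print(sha256(original) == sha256(backup))  # True for a faithful copy

# Copy the timestamps too, and even the metadata distinction disappears.
shutil.copystat(original, backup)  # backup now carries the original's mtime/atime
```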
This isn't an issue in meatspace. Making an exact duplicate of any physical thing is not possible for the foreseeable future (or ever, if you take into account the Observer effect), and that's what the courts are used to dealing with.
That's also why they fail so spectacularly to frame their rulings without putting their collective feet in their collective mouths.
'If you can do it then so can we.'
That argument regarding privacy is beyond absurd. 'Facebook doesn't care about user privacy, so the courts and legal system don't have to either' both condemns Facebook and exonerates it in a single step, by arguing that once privacy has been ignored by one party, the government no longer needs to care either. If anything, that leaves the government wanting Facebook and other companies to show as much contempt for user privacy as possible.
On a more general note, the idea that once an account is deleted any data from it is free to grab is beyond disturbing. It makes the law an absolute joke by punching a massive hole in its protections, and it encourages people to keep accounts on services they might otherwise avoid just to 'protect' their data from being grabbed by any third party that wants it.
SCA?
Why is the SCA being applied to a crime that happened in Myanmar and a law enforcement agency in Gambia?
Re: SCA?
Because Facebook is a US company and still has to follow US law?
Re: Re: SCA?
Maybe it's more like FB doesn't have a presence in Myanmar or Gambia, so there is no entity to sue there. It seems like the law enforcement exception should apply here though.
Assumes facts not in evidence
Facebook isn't necessarily or even likely deleting "the original" data at all, merely making it unavailable for the user or the public to access.
I could tell on you, but then I would have to delete you
My name is Zuckerberg. Mark Zuckerberg.
If you're not on Facebook, don't get on.
If you're on Facebook, spend a year defacing your own page with false information before getting off, so that when you close your account, none of the data has been accurate for a few iterations.
Re: If you're not on Facebook, don't get on.
That's assuming it's accurate in the first place. For all the complaints, I've been friends with people who use pseudonyms, people who use multiple accounts, and even inanimate objects and dogs, for the whole time I've used the service.
The problem with FB isn't so much the way they gather and use data; it's the people who believe Facebook "sources" more than they do reputable sources. I'm not sure I know how to fix that any more than I can fix people who think that Fox, the Daily Mail and The Sun are factual sources - and that's a problem that predates the internet, let alone Facebook.
Odd rulings
A quick skim of Google headlines for judge Zia Faruqui is... interesting.
Makes one wonder if the good judge and his overseers have an account or two, and whether their contents could be made public under this principle.