Facebook Asked To Change Terms Of Service To Protect Journalists
from the a-chance-to-fix-things dept
There are plenty of things to be concerned about regarding Facebook these days, and I'm sure we'll be discussing them for years to come, but the Knight First Amendment Institute is asking Facebook to make a very important change as soon as possible: creating a safe harbor for journalists who are researching public interest stories on the platform. Specifically, the concern is that basic tools used for reporting likely violate Facebook's terms of service, which could let Facebook go after reporters under the CFAA for those violations. From the letter:
Digital journalism and research are crucial to the public’s understanding of Facebook’s platform and its influence on our society. Many of the most important stories written about Facebook and other social media platforms in recent months have relied on basic tools of digital investigation. For example, research published by an analyst with the Tow Center for Digital Journalism, and reported in The Washington Post, uncovered the true reach of the Russian disinformation campaign on Facebook. An investigation by Gizmodo showed how Facebook’s “People You May Know” feature problematically exploits “shadow” profile data in order to recommend friends to users. A story published by ProPublica revealed that Facebook’s self-service ad platform had enabled advertisers of rental housing to discriminate against tenants based on race, disability, gender, and other protected characteristics. And a story published by the New York Times exposed a vast trade in fake Twitter followers, some of which impersonated real users.
Facebook’s terms of service limit this kind of journalism and research because they ban tools that are often necessary to it—specifically, the automated collection of public information and the creation of temporary research accounts. Automated collection allows journalists and researchers to generate statistical insights into patterns, trends, and information flows on Facebook’s platform. Temporary research accounts allow journalists and researchers to assess how the platform responds to different profiles and prompts.
Journalists and researchers who use tools in violation of Facebook’s terms of service risk serious consequences. Their accounts may be suspended or disabled. They risk legal liability for breach of contract. The Department of Justice and Facebook have both at times interpreted the Computer Fraud and Abuse Act to prohibit violations of a website’s terms of service. We are unaware of any case in which Facebook has brought legal action against a journalist or researcher for a violation of its terms of service. In multiple instances, however, Facebook has instructed journalists or researchers to discontinue important investigative projects, claiming that the projects violate Facebook’s terms of service. As you undoubtedly appreciate, the mere possibility of legal action has a significant chilling effect. We have spoken to a number of journalists and researchers who have modified their investigations to avoid violating Facebook’s terms of service, even though doing so made their work less valuable to the public. In some cases, the fear of liability led them to abandon projects altogether.
This is a big deal, as succinctly described above. We've talked in the past about how Facebook has used the CFAA to sue useful services and how damaging that is. But the issues here involve actual reporters trying to better understand aspects of Facebook in which there is tremendous and urgent public interest, as the letter lays out. Also, over at Gizmodo, Kashmir Hill has a story about how Facebook threatened the site over its investigation of Facebook's "People You May Know" feature, showing that this is not just a theoretical concern:
In order to help conduct this investigation, we built a tool to keep track of the people Facebook thinks you know. Called the PYMK Inspector, it captures every recommendation made to a user for however long they want to run the tool. It’s how one of us discovered Facebook had linked us with an unknown relative. In January, after hiring a third party to do a security review of the tool, we released it publicly on Github for users who wanted to study their own People You May Know recommendations. Volunteers who downloaded the tool helped us explore whether you’ll show up in someone’s People You May Know after you look at their profile. (Good news for Facebook stalkers: Our experiment found you won’t be recommended as a friend just based on looking at someone’s profile.)
Facebook wasn’t happy about the tool.
The day after we released it, a Facebook spokesperson reached out asking to chat about it, and then told us that the tool violated Facebook’s terms of service, because it asked users to give it their username and password so that it could sign in on their behalf. Facebook’s TOS states that, “You will not solicit login information or access an account belonging to someone else.” They said we would need to shut down the tool (which was impossible because it’s an open source tool) and delete any data we collected (which was also impossible because the information was stored on individual users’ computers; we weren’t collecting it centrally).
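The local-only storage design Gizmodo describes (each user's data lives on their own machine, so there is nothing central to delete) can be sketched roughly as follows. This is a hypothetical illustration, not the actual PYMK Inspector code; the function names and log format are assumptions:

```python
import json
import time
from pathlib import Path

# Hypothetical default log location on the user's own machine.
# Nothing is ever sent to a central server -- each snapshot is
# appended to a local JSON-lines file.
DEFAULT_LOG = Path.home() / "pymk_log.jsonl"

def record_snapshot(recommendations, path=DEFAULT_LOG):
    """Append one timestamped snapshot of recommended names to a local file."""
    entry = {"ts": time.time(), "recommendations": list(recommendations)}
    with path.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

def load_snapshots(path=DEFAULT_LOG):
    """Read back every snapshot recorded on this machine."""
    with path.open(encoding="utf-8") as f:
        return [json.loads(line) for line in f]
```

Because the log only ever exists on the volunteer's computer, a takedown demand aimed at the publisher cannot reach the collected data, which is exactly the situation Gizmodo describes above.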
The proposal in the letter is that Facebook amend its terms of service to create a "safe harbor" for journalism. While Facebook recently agreed to open up lots of data to third-party academics, it's important to note that journalists and academics are not the same thing.
The safe harbor we envision would permit journalists and researchers to conduct public-interest investigations while protecting the privacy of Facebook’s users and the integrity of Facebook’s platform. Specifically, it would provide that an individual does not violate Facebook’s terms of service by collecting publicly available data by automated means, or by creating and using temporary research accounts, as part of a news-gathering or research project, so long as the project meets certain conditions.
First, the purpose of the project must be to inform the general public about matters of public concern. Projects designed to inform the public about issues like echo chambers, misinformation, and discrimination would satisfy this condition. Projects designed to facilitate commercial data aggregation and targeted advertising would not.
Second, the project must protect Facebook’s users. Those who wish to take advantage of the safe harbor must take reasonable measures to protect user privacy. They must store data obtained from the platform securely. They must not use it for any purpose other than to inform the general public about matters of public concern. They must not sell it, license it, or transfer it to, for example, a data aggregator. And they must not disclose any information that would readily identify a user without the user’s consent, unless the public interest in disclosure would clearly outweigh the user’s interest in privacy.
There are a few more conditions in the proposal, including not interfering with the proper working of Facebook. The letter includes a draft amendment as well.
While some may hesitate at anything that seems to carve out different rules for a special class of people, I appreciate that the approach here is focused on a safe harbor for journalism rather than for journalists. That is, as currently structured, anyone engaged in acts of journalism could qualify for the safe harbor, and there is no silly requirement about being attached to a well-known media organization or anything like that. The entire setup seems quite reasonable, so now we'll see how Facebook responds.
Filed Under: cfaa, data collection, journalism, reporting, safe harbor, tools
Companies: facebook, knight first amendment institute