from the everybody-loses dept
Earlier this week, Facebook announced that it had uncovered a new wave of disinformation attacks ahead of the 2018 elections. To hear Facebook tell it, the new attacks pretty closely mirror Russia's Internet Research Agency attacks during the 2016 election. As in, the culprits are trying to sow distrust and amplify partisan divisions on both sides of the aisle by creating fake organizations, fake people, fake news, and rockin' memes. How much that actually accomplishes is a matter of some debate, but it's also pretty clear we don't yet understand how deep this rabbit hole really goes.
According to Facebook, this latest attack on the nation's gullible shows signs of evolution from the more ham-fisted attacks seen during the 2016 election. And while there's no hard link to Russia yet, Facebook claims there are some connections between Russian Internet Research Agency "troll farm" accounts and this new wave of disinformation:
"For example they used VPNs and internet phone services, and paid third parties to run ads on their behalf. As we’ve told law enforcement and Congress, we still don’t have firm evidence to say with certainty who’s behind this effort. Some of the activity is consistent with what we saw from the IRA before and after the 2016 elections. And we’ve found evidence of some connections between these accounts and IRA accounts we disabled last year, which is covered below."
The problem, as Twitter and Facebook (and Mike, and Tim) have made abundantly clear, is that these companies are absolutely terrible at policing their own platforms, and simply crying out that they should "do something" without understanding what they're doing isn't likely to work. When these platforms do attempt to address hate speech or propaganda, they repeatedly do a terrible and inconsistent job of it. Said terribleness almost always results in over-reach and collateral damage.
For example, Facebook's response to the discovery of the latest disinformation attack involved taking down at least 32 pages and numerous profiles on Facebook pushing bullshit to partisans on both sides of the aisle. Said "inauthentic operators," as Facebook called them, included groups like "Aztlan Warriors," "Black Elevation," and "Mindful Being." Most of these groups and pages were indeed bogus efforts designed to stoke existing partisan tensions by amplifying many of the most annoying aspects of extreme partisans on both sides.
But in the process Facebook also shut down a group named "Resisters," which had been organizing a “No Unite the Right 2 - DC" counter-protest against a planned white supremacist rally scheduled in Washington DC on August 12. Needless to say, the legitimate activists weren't particularly pleased:
"However, activists who had worked with Resisters said the counterprotest they planned against a far-right rally was legitimate — and that Facebook was harming their ability to combat the rise of white supremacy. The event, called “No Unite the Right 2 DC” and promoted by Resisters along with other left-leaning groups, was collateral damage in Facebook’s battle against disinformation, they said.
Facebook has “delegitimized our whole event — and all the work that folks across the D.C. area have put a lot of time and effort into,” said Caleb-Michael Files, an organizer of the March to Confront White Supremacy, a group that was organized after the Charlottesville protests, and a co-host of the counterprotest event page."
Over at Facebook, one of the justifications for the removal of the page was that an account linked to the Russian IRA disinformation effort had been an administrator for the page for all of seven minutes:
"The IRA engaged with many legitimate Pages, so these leads sometimes turn up nothing. However, one of these leads did turn up something. One of the IRA accounts we disabled in 2017 shared a Facebook Event hosted by the “Resisters” Page. This Page also previously had an IRA account as one of its admins for only seven minutes. These discoveries helped us uncover the other inauthentic accounts we disabled today."
Taking down an entire, legitimate page because one IRA-linked account had admin rights for all of seven minutes seems shaky at best, and Facebook isn't clear on what additional evidence it relied on. The other problem is that Facebook notified all of the group's legitimate members about its move, undermining the effort as a whole. Facebook's blog post is also misleading, in that it suggests that these legitimate activists were somehow conned into participating in a counter-protest they would have been engaged with anyway:
"The Event – “No Unite the Right 2 – DC” – was scheduled to protest an August “Unite the Right” event in Washington. Inauthentic admins of the “Resisters” Page connected with admins from five legitimate Pages to co-host the event. These legitimate Pages unwittingly helped build interest in “No Unite Right 2 – DC” and posted information about transportation, materials, and locations so people could get to the protests."
The event is still scheduled, but the new Facebook group created in the wake of Facebook's actions has far fewer members, and it's unclear how many people who would have otherwise attended were scared off by what feels like over-reach.
It's a bit of a master class in the terrible position Facebook suddenly finds itself in, and in the need for transparency as we try to craft a solution. The Washington Post quotes somebody inside Facebook who notes the company debated whether or not the bans would harm legitimate activism, but ultimately decided that the harm to these groups from being potentially co-opted by hostile foreign intelligence efforts was worse than the potential censorship.
That was cold comfort to activists, who suddenly found their efforts to undermine white supremacy derailed due to no fault of their own:
“It’s an extremely dangerous situation for free speech when politicians are screaming at web platforms to ‘do something’ about a problem that is difficult to address,” she said. “Censoring an anti-Nazi protest was a particularly egregious example of collateral damage."
The problem here is that there's no real consensus on a path forward. We honestly don't know how to combat this yet.
Modern disinformation efforts clearly amplify tensions and sow distrust, and, as intended (check out Adam Curtis' Hypernormalisation documentary when you have a few hours to kill), help undermine traditional media and traditional institutions, opening the door wider to future disinformation efforts. The impact isn't easily measurable, letting many dismiss the problem as over-hyped and unimportant. But if you've watched as conspiracy goes mainstream and often wanders comically close to White House policy, it should be pretty clear there's a very fucking real problem here.
But Facebook and Twitter have both shown they're aggressively incompetent at regulating their platforms or being transparent about such behavior. Congress can barely put its pants on in the morning. And despite some healthy debate for the better part of the year by journalists and academics alike, you'd be hard-pressed to find anybody who currently has a solution to this particular problem. In part, that's because none of the solutions are easy, whether it's shoring up critical thinking training in a country that consistently likes to underfund education, or reconfiguring systems to make spreading bullshit less profitable.
This new wave of disinformation and conspiracy is much like a bacterial infection, and it's going to take time for the culture and body politic to generate an immune response. In the interim, the learning process is going to be ugly as we feel out the best path forward. Whatever that path winds up looking like, hopefully it comes with notably less censorship and a whole lot more transparency.
Filed Under: activists, content moderation, election interference, free speech, trolls
Companies: facebook