New Research Shows Social Media Doesn't Turn People Into Assholes (They Already Were), And Everyone's Wrong About Echo Chambers
from the seems-notable dept
We recently wrote about Joe Bernstein's excellent Harper's cover story, which argues that we're all looking at disinformation/misinformation the wrong way, and that the evidence that disinformation on social media really influences people is sorely lacking. Instead, as Bernstein notes, this is an idea that many others are heavily invested in spreading, including Facebook (if disinformation on social media really sways people, then Facebook ads must sway them too), the traditional media (social media is a competitor), and certain institutions with a history of having authority over "truth" (can't let the riffraff make up their own minds on things).
We've also seen other evidence pop up questioning the supposed malicious impact of social media. Yochai Benkler's work has shown that Fox News has way more of an impact on spreading false information than social media does.
And even with all this evidence regarding disinformation, there are also people who focus on attitudes, insisting that social media is responsible for turning otherwise good people awful. Yet, as was covered in a fascinating On the Media interview with Professor Michael Bang Petersen, there really isn't much evidence to support that either! As Petersen explained in a useful Twitter thread, his research has found no real evidence that social media turns people hostile. Instead, it shows that people who are assholes offline are also assholes online.
But in the interview, Petersen makes a really fascinating point regarding echo chambers. I've been skeptical about the idea of online echo chambers in the past, but Petersen says that people really have it all backwards -- that we're actually much more likely to live in echo chambers offline than online, and much more likely to come across different viewpoints online.
One way to think about social media in this particular regard is to turn all of our notions about social media upside down. And here I'm thinking about the notion of 'echo chambers.' So we've been talking a lot about echo chambers and how social media creates echo chambers. But, in reality, the biggest echo chamber that we all live in is the one that we live in in our everyday lives.
I'm a university professor. I'm not really exposed to any person who has a radically different world view or radically different life from me in my everyday life. But when I'm online, I can see all sorts of opinions that I may disagree with. And that might trigger me if I'm a hostile person and encourage me to reach out to tell these people that I think they are wrong.
But that's because social media essentially breaks down the echo chambers. I can see the views of other people -- what they are saying behind my back. That's where a lot of the felt hostility of social media comes from. Not because they make us behave differently, but because they are exposing us to a lot of things that we're not exposed to in our everyday lives.
And this upside down view of echo chambers also explains why people feel like the internet is a more hostile place, full of assholes and trolls. It's not that the internet creates the hostility; it's that we're now being exposed to these points of view and can respond to them. As he notes, this kind of hostility actually happens all the time, but it's usually witnessed by only a couple of people at a time. Online, it's witnessed by a much larger audience, and so we wrongly conclude that social media is making people worse.
In our offline lives, there is a lot of hostility as well, but that happens behind closed doors, in private. It happens in bars where we cannot hear what is going on. But we're exposed to all that when we enter the online realm.
Another interesting tidbit in the interview (and in his Twitter thread) is the idea that people who tend to share misinformation often know that it's misinformation, but are so focused on pissing off the people they don't like (see: "owning the libs") that they don't care. It's more important to anger the "other side" than to share legit info. This was based on a detailed study they did of people on Twitter.
The people who are sharing misinformation are not ignorant. They are used to navigating social media and the internet. They know more about politics than the average person. But where they're really different from the average is they have much more negative feelings towards members of the other party. And that's really what's predicting, not only their sharing of fake news, but also their sharing of real news. They want to derogate people that they don't like, and they are actively searching for information that they can use for that purpose.
So it's not that social media turns people into assholes, nor does it put them into echo chambers where their minds are turned to mush by disinformation. The evidence suggests that some people -- those already predisposed to this kind of "us/them" bickering -- just jump right into the fray online as a kind of status play, to anger the people they don't like. Because those people can actually hear them now.
Later, he notes that it's not that they're purposefully sharing misinformation -- it's just that whether something is true or false is "not part of the calculus." All that matters is basically "will this trigger people I dislike."
And then the interview brings us back around to a similar issue that we noted with Yochai Benkler's research about the problems of Fox News. Petersen notes that his research has shown that Republicans tend to share somewhat more misinformation than Democrats, but it's not because of (as some people believe) relative education levels; rather, it's that the media ecosystem set up for Republicans (e.g., Fox News, OAN, Newsmax) is much more designed to feed them the kinds of news that they want to share for this very purpose -- the "own the libs" kind of stories. Though, as he then notes, the fact that these kinds of news sources exist and feed into this is (like the social media panic) more of a symptom than the real problem.
His conclusion?
Misinformation is not, in itself, a big problem. So that's the good news. The bad news is that it's a symptom of a much worse problem. And here we come back to the polarization in society, because that is what's driving the sharing of misinformation. I think we've been focusing a lot on the symptoms -- Fox News, Trump, Facebook -- but I think that there is some evidence that suggests that rising inequality over the last decade has been a fundamental driver of political instability in the US and beyond. It's a problem in many Western democracies. That's where I would start to look for solutions.
All of this is quite fascinating -- and backed up by his research. And it does get back to the point that some of us have been making for years: that social media isn't so much "the problem" as it is a mirror reflecting the kinds of societal problems that civilization has been dealing with for centuries. It's just that now it plays out in a way where more people can see it all happen.
Filed Under: echo chambers, michael bang petersen, social media, society, trolls
Companies: facebook