89. Facebook scores your politics with a number. Brendan Nyhan figured out what they do with it. (Part 1 of 2)

Brendan Nyhan
Reimagining the Internet

Does Facebook make people’s politics more extreme? Do algorithms force us into bubbles? Does social media threaten American democracy? Political scientist Brendan Nyhan used the permission he was granted to research political data on Facebook as an opportunity to tackle these questions head-on. In part one of our interview with Brendan, he tells us about his contribution to the recent spate of Meta-sanctioned Facebook studies. (Listen to part 2)

Brendan Nyhan is a professor in the Department of Government at Dartmouth College and co-director of the quantitative political science group Bright Line Watch.

In this episode, we discuss his paper “Like-minded sources on Facebook are prevalent but not polarizing” in Nature. We’ve previously discussed the Facebook studies with Laura Edelson and Talia Stroud.

Transcript

Ethan Zuckerman:
Hey, everybody. Welcome back to Reimagining the Internet. I am your host, Ethan Zuckerman. I’m with my friend, Brendan Nyhan. 

He is the James O. Freedman Presidential Professor in the Department of Government at Dartmouth College. He’s a political scientist who’s done just tons of influential work on mis- and disinformation, how we form political beliefs, and the relationship between those beliefs and social media. He’s co-director of a group called Bright Line Watch, a watchdog group that’s monitoring the status of American democracy. He’s one of the people that I ask my students to read to think about quantitative methodology in the social sciences, because he’s got a great gift for taking on some very subtle questions and finding answers. 

And recently he’s one of the authors in this series of the 2020 election studies conducted with cooperation from Meta that are giving us a lot of new insights into what social media might and might not do to our society and democracy. Brendan, thank you so much for being with us.

Brendan Nyhan:
My pleasure, thank you for having me.

Ethan Zuckerman:
So I would love to start with this series of papers, not only because they’re interesting, but also because I’m literally teaching your paper to students in one of my 200-level classes in about an hour and a half. So I wanna make sure that I have it right. 

These papers are co-authored with many, many different co-authors, but you are the lead author on a paper in Nature titled “Like-Minded Sources on Facebook are Prevalent, but Not Polarizing.” As I read it, it’s an analysis of the echo chamber effect, and you looked at the extent to which Americans are encountering content from the like-minded, and then, with their permission, you turned down the dial on like-minded voices. 

What did you find about what we’re hearing from the like-minded and what happens when we fiddle with that dial? 

Brendan Nyhan:
Yeah, so this is, as you said, a co-authored work with many colleagues. It’s a large independent academic team that worked together on conducting this study. And I’m one of the four lead authors with Jaime Settle, Emily Thorson, and Magdalena Wojcieszak. And what we did had two elements. The first is we looked at people’s information diet on Facebook. And for that, we were able to measure the information exposure of the full US adult population. So we had the unprecedented opportunity to be able to say something about the prevalence of so-called echo chambers on Facebook. 

And then second, we conducted this experiment in which we, as you said, turned down the dial: with the permission of participants who opted into the study, we reduced their exposure to content from what we call like-minded sources. So those are the two components of the study. I’m happy to go into more detail. Should we start with the echo chamber measurement part? 

Ethan Zuckerman:
Yeah. So let’s start with the echo chamber because this is, I mean, this is crazy, right? I mean, Facebook is giving you access to 200 million American adults. I mean, it’s some enormous number. And Facebook has essentially an ideology score for all of us, either based on what we’ve reported or based on our behavior. They’re scoring us somewhere between zero and one. You ended up sort of saying, you know, zero to 0.4 is liberal, 0.6 to one is conservative, and 0.4 to 0.6, you use this fairly awkward phrase for it; I’m gonna refer to those as moderates, but you end up referring to them as non-aligned, non-cross-cutting in the study. To what extent does the data on what people encounter on Facebook validate at least the existence of echo chambers, the idea that we’re mostly encountering information from the like-minded?

Brendan Nyhan:
Yeah. So let me just add one point of clarification to help listeners get a sense of how we measured this. We took that score from the so-called political classifier, which ranges from zero to one, like you said, where zero is gonna be someone we’re absolutely confident is liberal and one is someone we’re absolutely confident is a conservative. So we have that; it’s based on a variety of signals that Facebook observes. And we show in the appendix to our study that those scores predict very well a series of political and attitudinal variables that we would expect them to correlate with. 

So they predict your presidential vote, how you identify, which party you identify with in our surveys, and so forth, right? So they seem to do a very good job of guessing people’s political leanings insofar as we can validate them in the context of the experiment that we’re gonna talk about in a moment. So we use that; now we have this estimate of how liberal or conservative you are. And we also have, via that same measure, estimates of how liberal or conservative the people you’re connected to are, as well as the pages and groups whose content you see. 

And one of the reasons we don’t use “moderate” in the study is that, in particular for pages and groups, they don’t necessarily have a moderate political ideology as such; they may just have a relatively even combination of people who lean to the left and people who lean to the right, right? 

So, I don’t know, Taco Bell’s page on Facebook, right? I don’t expect it to have any particular partisan valence, but I wouldn’t think of them as moderate in any political sense, right? They’re just falling somewhere in the middle, right? Whereas if we were thinking about the content from, like, a hunting page versus, like, an organic gardening page, right? We might have stronger expectations about the political leanings of those audiences.
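
A minimal sketch, in Python, of the kind of categorization Nyhan is describing here: it uses the zero-to-one classifier score and the 0.4/0.6 cutoffs Ethan mentioned, with hypothetical function names and boundary handling, not the study’s actual implementation.

```python
# Hypothetical illustration only, not the study's code. Facebook's political
# classifier scores each user, page, or group from 0 (confidently liberal)
# to 1 (confidently conservative); 0.4-0.6 is the non-aligned middle.

def leaning(score: float) -> str:
    """Map a 0-1 classifier score to a coarse leaning category."""
    if score <= 0.4:
        return "liberal"
    if score >= 0.6:
        return "conservative"
    return "non-aligned"  # mixed or apolitical audience, e.g. a fast-food chain's page


def source_relation(viewer_score: float, source_score: float) -> str:
    """Classify a source relative to a viewer as like-minded, cross-cutting, or neither."""
    viewer, source = leaning(viewer_score), leaning(source_score)
    if "non-aligned" in (viewer, source):
        return "neither"
    return "like-minded" if viewer == source else "cross-cutting"


# Example: a viewer scored at 0.2 (liberal) seeing content from a page whose
# audience scores 0.85 (conservative) counts as cross-cutting exposure.
print(source_relation(0.2, 0.85))  # cross-cutting
```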

Ethan Zuckerman:
Right, so those middle values may be apolitical rather than moderate in one fashion or another. And that’s actually a very good caution. One of the things you find is that less than 7% of the content encountered on Facebook is either political or civic content. 

Brendan Nyhan:
Yes, yes. So when you’re thinking about the question, are Americans trapped in these echo chambers of like-minded political content? To a first approximation, the answer is no. They barely pay attention to politics at all, right? So the people you see and hear talking about politics, and you yourself, if you spend a lot of time talking and hearing about politics, are all aberrational in the context of the US population, which simply pays attention to other things and is busy with their lives, right? 

And most people don’t have especially extensive political information consumption habits. They’re encountering political news incidentally. That’s actually one of the things that social media does. But as a proportion of their feed, the content that’s explicitly political is relatively low. And the content that’s explicitly news also is low. Now, we wanted to have this broader measure, though: we wanted to make sure that we could take into account not just the explicitly political content you saw and the news content you saw, but whether you were also hearing from sources that tended to be aligned with your side of the political aisle, even if they weren’t explicitly political. So like I said, a hunting page is an example, right? You might imagine that there’s some conservative cultural content associated with that, given the political valence around guns in this country, right, even if it was not talking about lobbying for gun rights legislation in Congress, for instance, right? And so we’re taking all the kinds of content people are exposed to into account. 

What do we find? We find about half of the content people see is coming from like-minded sources on average. Now, whether you think that’s high or low, I guess depends on your expectations going into this. It does suggest that people tend to see content from sources that are on their side of the political aisle on average, right, to the extent that we can tell. On the other hand, as we said, most of that content isn’t political or news explicitly. 

And we only find a very extreme distribution, where we use a cutoff of 75%, that is, 75% or more coming from like-minded sources, for a much smaller minority of the American public. And that’s consistent with what we’ve seen in a lot of other studies using all other kinds of digital behavior data. It’s really a small subset of the American public that has highly skewed information diets. It’s not a common experience.
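
To make that exposure measure concrete, here is a small sketch under assumed data structures (a list of per-item source labels for one user); the 75% cutoff is the one Nyhan mentions for highly skewed diets, and none of this is the study’s actual code.

```python
# Hypothetical sketch of the like-minded-share measure and the 75% "extreme
# diet" cutoff discussed above; data structures and names are assumptions.

from typing import Sequence

def like_minded_share(source_labels: Sequence[str]) -> float:
    """Fraction of a user's content exposures that came from like-minded sources."""
    if not source_labels:
        return 0.0
    return sum(label == "like-minded" for label in source_labels) / len(source_labels)

def has_extreme_diet(source_labels: Sequence[str], cutoff: float = 0.75) -> bool:
    """True if 75% or more of exposures come from like-minded sources."""
    return like_minded_share(source_labels) >= cutoff

# Example: a feed that is half like-minded (the average described in the
# interview) does not cross the extreme-diet threshold.
feed = ["like-minded"] * 5 + ["cross-cutting"] * 2 + ["neither"] * 3
print(like_minded_share(feed))   # 0.5
print(has_extreme_diet(feed))    # False
```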

Ethan Zuckerman:
So just to make sure that I’m sort of understanding this and that our listeners are understanding this: if I am someone who leans to the left, and as a New England university professor you could probably safely assume that that’s the way I’m gonna lean, I am likely to encounter a substantial amount of content on Facebook from people who share my ideological leanings, which is to say they also lean to the left. That’s gonna be about 50% of my feed. It’s not necessarily going to be Rachel Maddow. It might be someone talking about yoga or organic gardening, which might have a political valence. It might even be someone talking about going to the Taylor Swift concert, which may or may not have a political valence. All of that is going to be coded as being ideologically consonant because it’s coming from someone who shares my ideology. 

And you’re finding that there’s not very much cross-ideological content. I’m not getting a ton from folks who might be identifying as conservatives. I believe it’s less than 20%. And then 30% is content that’s sort of in the middle. It isn’t that it’s necessarily political moderates; it’s that it’s coming from groups that are mixed enough that there isn’t an obvious political valence associated with it. It’s basically apolitical or it’s somewhere in the middle, and that’s sort of representing the rest of the feed.

Brendan Nyhan:
That’s right, that’s right.

Ethan Zuckerman:
Is this basically just a homophily effect? Is this not basically just that the liberals flock together and the conservatives flock together? Is Facebook doing this to us or do we do this to ourselves?

Brendan Nyhan:
Well, that’s a great question. The study is not set up to distinguish those. Prior research, by Facebook researchers actually, but very credible social scientists, separates out the extent to which people are seeing content that’s like-minded because of who they follow versus what they engage with versus the algorithmic ranking, right? So you should think about the different steps in that process, right? 

One thing about Facebook, and actually this is an aspect of social media that’s changing now, of course, as you know, but just to remind listeners: we used to have news feeds that were strictly composed of content from accounts you had opted into following. And in that sense, we were creating the homophilous relationships you’re describing. We were choosing to follow account A and not follow account B. And in that pattern of choices we were generating a set of accounts we followed, and thus a set of content we were exposed to that we would expect to coincide with our political leanings or cultural preferences. Algorithmic recommendations of content from sources you don’t follow have become a larger part of what people are seeing even on following-based social media. And we’re now seeing platforms like TikTok that barely have a following structure at all. It’s almost entirely recommendations from the full network. So this will change. 

So we’re doing part of this in what we choose to follow. That’s step one. And then step two is we’re making choices about what content to view and engage with. But of course, the algorithm is interacting with each of those steps. These are hard to cleanly separate into human behavior versus algorithms. Why? Because the algorithm is recommending accounts it thinks we will like, and in that way, it may worsen homophily in who we choose to follow. It of course is also ranking higher the content you’re more likely to engage with, or that has some other kind of prioritization in the way the algorithm makes its recommendations. And in that way, it may also be potentially increasing homophily in terms of what we’re seeing. 

Ethan Zuckerman:
And indeed, there’s another paper in this series and we talked with Talia Stroud about it, about what happens if you turn off the algorithmic feed and replace it with a reverse chronological order feed. And the TLDR on that is it doesn’t solve all the problems of American political isolation, you know, even if we thought it did. 

But in this paper, you’re actually doing something quite different and very interesting. You’re finding that we’re getting lots of information from the like-minded. We’re getting less cross-ideological information. In fact, we’re getting very little political or civic information in general. That’s not for the most part what Facebook is about. 

You then have this amazing opportunity for about 20,000 users on Facebook to turn the knob down and essentially say, “Okay, Ethan, we know that you’re comfortable with the organic farmers and the yoga instructors. We’re going to put a lot fewer of them in your feed.” What happens when you do that, Brendan? 

Brendan Nyhan:
You see a lot less content from like-minded sources. You engage with it less frequently, in large part because you’re seeing it less. And we’ll talk about a nuance related to that in a moment. But it has no effect on your attitudes across a variety of outcomes we measure. There’s virtually no measurable effect on survey self-reports of various political attitudes or other kinds of attitudes that were thought to be affected by people’s information diets, and specifically by this echo chambers idea, right, that people were becoming more polarized because they were getting these skewed information diets. And we show that reducing how much content they get from like-minded sources by about a third for a period of about three months has no measurable effect across all of these different outcomes. 

Ethan Zuckerman:
And I want to dig into that at length because it’s a fascinating finding and it’s one that people in your field and my field are wrestling with right now. But I do want to get to that nuance, which I love, which is that if you do turn down the ideologically similar points of view, you have this really interesting effect, which is when you do see one of those ideologically aligned voices, you are more likely to interact with it. 

So if you get rid of my yoga teachers and my organic farmers—if one of them slips in—I’m so desperate for my yoga teacher interaction at that point that I engage with her, I vote her up at that point. Did you see that on both left and right? 

Brendan Nyhan:
Yes. As far as I know, there’s no evidence of heterogeneity by political leanings in that result. And it’s a really interesting one. It shows how you can change what people see, but to some extent, the behavioral response may at least partially offset the changes you make. Now, I want to be clear, it doesn’t fully offset it. We’ve reduced like-minded sources so much, by decreasing them by about a third, that the net effect on total engagement is still negative relative to a control group that had the normal Facebook experience. But we’re seeing part of that effect being offset by this behavioral response. And it tells us something about what we choose to engage with. 

When there’s this relative scarcity of content that’s appealing to us or seems congenial to us, people seem to gravitate to it and interact with it in the ways that are measurable on a social media platform at a higher rate. 
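
To make the offset arithmetic concrete: a toy calculation with made-up numbers (the roughly one-third reduction is from the interview; everything else is invented for illustration). Even if each remaining like-minded item is engaged with at a higher rate, total engagement with like-minded content can still end up below the control group’s.

```python
# Hypothetical numbers only, not figures from the study. Exposure to
# like-minded content drops by about a third; per-item engagement rises,
# but total engagement with like-minded content still falls.

baseline_items_seen = 300     # like-minded items seen by a control-group user (illustrative)
baseline_engage_rate = 0.05   # engagements per item seen (illustrative)

treated_items_seen = baseline_items_seen * (2 / 3)   # roughly a one-third reduction
treated_engage_rate = 0.06    # scarcer like-minded content gets engaged with more often

baseline_engagement = baseline_items_seen * baseline_engage_rate   # 15.0
treated_engagement = treated_items_seen * treated_engage_rate      # 12.0

# The higher per-item rate only partially offsets the reduced exposure.
print(baseline_engagement, treated_engagement)  # 15.0 12.0
```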

Ethan Zuckerman:
Now, the study in which the algorithm was replaced with reverse chronological order, they had a real effect of users falling off of the platform. They actually saw users moving on to using YouTube, using Reddit, using other sources more often. Did you see that or were the effects only around engagement? 

Brendan Nyhan:
They were only around engagement. We did not find, as I recall, significant changes in time spent on the platform. It was approximately similar. It might have been slightly lower, but it wasn’t a statistically significant difference. And that’s interesting. Our experiment at least suggests there’s no big payoff here in terms of changing attitudes, if that’s what you’re seeking to do. But it does suggest that, at least on the margin where we were operating, it wasn’t driving people off the platform entirely. They were still getting a ranked feed that was more engaging to them. 

And I’ll say, I would say my experience with Mastodon, I hope this is a safe space to say this, very much reinforces to me the value of algorithms. I have found, my friend, Mastodon virtually unusable. The content is not interesting to me. But the way I’ve used social media the entire time I’ve interacted with text-based feeds is to follow lots of accounts and allow the algorithm to surface the content that’s most interesting to me from the many accounts I follow. Mastodon has completely broken that. I am wading through tons of content I find uninteresting, and therefore I never use it. So I’m living proof of this kind of effect. 

I think it’s something that’s well understood inside platforms, but hadn’t been demonstrated externally quite so dramatically. And it’s really important because people think reverse chronological feeds, by turning off the algorithm, are a kind of magic elixir, as you said, to kind of fix the problems of social media. And I don’t think that’s in evidence at all. 

Ethan Zuckerman:
Well, so first of all, I have some good news for you, which is that we’re building something that lets you choose your own algorithms to sort Mastodon, to sort Bluesky. We had built it to sort Twitter and then a certain capricious billionaire changed his mind. We hope fairly soon we might be able to use it to sort Threads. That’s another story, but we’ll get you signed up for the beta on that. 

Brendan Nyhan:
Oh my gosh. Sign me up. Yeah. 

Ethan Zuckerman:
But beyond that, so people are fascinated by the finding in your study and the finding in the other study about reverse chronological order, which is that you do not see major changes in political attitudes or in affective polarization, so how an individual feels about their own party and the other one. 

Many people have sort of bought into this echo chamber hypothesis, that essentially our political views are being reinforced constantly by social media. The left is going further to the left, the right is going further to the right. There are certainly social scientists who are reacting to this and going, “Eh, it was only three months long, it’s not long enough.” You point out in the study that, frankly, most political science research works on much, much smaller timeframes than that. 

Is this good news, Brendan? I mean, like, it seems like this should have an effect. In this paper and in this other paper, it’s quite clear there’s not a major change in behavior based on making what feels like a very substantial change to social media. Does this mean that social media is hard to fix or does it mean that social media doesn’t have as much effect on political leanings as we’d like to think? 

Brendan Nyhan:
I think the latter. You know, I would say in recent years, we’ve had converging evidence across a variety of studies that persuasion is difficult. The most rigorous persuasion studies, for instance, in political campaigns, find extremely small effects of political ads that are purpose-built to change your mind, that in many ways are much higher impact and more evocative than the kind of content we’re talking about here on Facebook. They have video, they have dramatic sound, they were carefully constructed, every second, to persuade you, and yet their effects are extremely small.