04 Kevin Roose, The New York Times

Reimagining the Internet

We welcome Kevin Roose to the podcast — tech reporter extraordinaire for The New York Times and thorn in the side of Facebook — to talk to us about how platforms’ laser focus on growth resulted in misinformation ecosystems and algorithms that they don’t really understand. Kevin and Ethan talk about what’s really the healthiest social media platform of them all, and what Wall Street-style regulation might look like for major platforms.

Kevin Roose reported and produced the excellent Rabbit Hole podcast earlier this year, covering the rise of the alt-right on YouTube. He also maintains the Twitter account Facebook’s Top 10, which lists the publishers of the 10 most-shared articles on Facebook on a given day, a list that frequently sees legacy news outlets side by side with alt-right upstarts.

Transcript

Ethan Zuckerman:

Hi, everyone. Welcome to Reimagining the Internet. This is an ongoing podcast series from the Initiative for Digital Public Infrastructure at the University of Massachusetts at Amherst. My name is Ethan Zuckerman, I’m your host. And today we’re with tech columnist for the New York Times, author, investigative journalist, a thorn in the side of Facebook and many technology platform companies, I think one of the people doing the best work right now in technology writing, Kevin Roose. How are you Kevin?

Kevin Roose:

I’m doing well. Thank you for having me on.

Ethan Zuckerman:

Thanks so much for being with us. This is an ongoing conversation with smart people who follow the tech industry, seeing if we can get beyond just calling out what’s wrong with tech and actually get towards solutions for social media that could be good for us, rather than just not destroy our democracies. And so, we’ve been starting these conversations with really simple questions: What is wrong with social media at present, and what do you think the companies and the rest of us should be doing to fix those problems?

Kevin Roose:

That’s a big question, a big couple of questions. Right now, I think the problems with social media vary widely by platform and even by function within platforms. So, I think one of the things that unites a lot of these problems is that they were built by companies that were looking to attract growth, that were looking to attract users and engagement. They were built around the idea that the less friction is involved in sharing something or posting something or clicking on something, the better. And so, they have really made design decisions and monetization decisions not thinking through downside risks, not thinking through abuse potential, just purely trying to make the numbers go in the right direction.

They’ve been hugely successful at that; these companies are very big and very profitable. And I’ve been writing a lot recently about automation, and I have a book coming out next year about AI and automation and the way it is changing what it means to be human. We don’t tend to think of the automation and AI conversation in the same breath as the social media conversation, but I think they’re related. I think that these companies suffer from an overconfidence in automation.

Ethan Zuckerman:

So you talk about this a lot with the Rabbit Hole work. You did a really excellent eight-part podcast series that was looking at how YouTube recommends videos and the possibility that YouTube is pushing people towards increasingly radical content. How does that question about automation and overconfidence in automation come into play around something like YouTube?

Kevin Roose:

Well, I think we see on YouTube one of the most advanced forms of automation ever built. I mean, Google has the best AI engineers in the world, they publish the most papers, they win the most awards. And for much of the last decade, the biggest, most profitable AI project at Google has been YouTube recommendations. 70% of watch time on the platform comes from recommendations, and that’s a company that is making billions of dollars a year. So I think when we think of the cutting edge of AI, we think of an OpenAI lab or some Carnegie Mellon research project.

Ethan Zuckerman:

Auto-generated text.

Kevin Roose:

Yeah, GPT-3 or something like that. But YouTube recommendations are the cutting edge of AI research in this country and probably in the world, and I don’t think we ever think about that. We just think, “Oh, that’s what every other platform does, it recommends stuff.” So I think that the confidence that these firms have in the ability of machine learning to do things like surface relevant and engaging content to people is based on a pretty shallow definition of engaging and relevant. It’s based on what they can measure, which is watch time, which is clicks, which is shares. And I think that their inability to look deeper than that, to hook these algorithms to some metric that is more substantive, such as whether something is true or not, for example, is one of their great failures and the source of a lot of their problems.

Ethan Zuckerman:

And you’ve been making this point lately about Facebook, looking through the CrowdTangle data, and let’s start by stipulating that Facebook says CrowdTangle is not necessarily an accurate picture of what’s most shared on their site; at the same time, they also make clear that they won’t give you any other information.

Kevin Roose:

Right. I’m wrong according to their secret data that they won’t share.

Ethan Zuckerman:

That’s right, yes. In one of my other lives, I’m writing a paper about how we study and audit the platforms, and the story of you crossing swords with Facebook over that is actually the opening story of the paper. But this problem with the logic of automation and the logic of algorithms comes out there as well. What’s the point that you’re trying to make in showing this string of highly emotional, often, let’s go with, factually challenged right-leaning content gaining top engagement marks on Facebook?

Kevin Roose:

Well, I think my point in showing that to people on Twitter was pretty simple. It was: this is happening and you probably don’t know about it, because for the people who spend time on Twitter, journalists, researchers, high-information news consumers, the conversation there is pretty divorced from what’s happening on Facebook, which is 10 times as large. And my goal was not to prove a point about politics or algorithms or anything like that. When I started, I was just seeing this data and saying, “I cannot believe how different this is than what I’m seeing on my Twitter feed, and I bet Twitter might find it interesting to know what’s happening on this platform that is 10 times bigger.” So all I was trying to do by posting these lists of the most-engaged Facebook posts was simply: check this out. That’s interesting, maybe we should talk about that.

Ethan Zuckerman:

I want to push you on this question of what we would do to make things better. There are at least three things going on with the algorithms that you’re talking about.

The first is that they may be hiding from us content that we should know about, even if we don’t want to see it. You may not need PewDiePie in your life, but as a citizen, it’s probably a good idea to know who he is and what’s going on with him.

Second, there’s the danger that the algorithms lead us down a rabbit hole, that we start following someone who’s talking about self-improvement and very quickly we find ourselves talking about White nationalism or really horrific misogyny.

And then the third is that it just turns out that humans are pretty awful a good chunk of the time, and that when you put a Facebook or a YouTube in front of people, a lot of what they create turns out to be pretty miserable, stuff that’s very hard to look at and hard to know how to wrestle with. Is the solution that we build these things without the algorithms, or what’s the way that we go after that cluster of problems?

Kevin Roose:

This is strange for me, to be casting myself in the role of defender of humans here, but I actually don’t think humans at their core are bad. I don’t think most people wake up in the morning and say, “How could I enrage myself and others? How could I divide myself from others?” I don’t think that’s how we’re wired, and yet I think that’s the behavior that these platforms are built to incentivize. Not because Mark Zuckerberg’s evil or Jack Dorsey’s evil or Susan Wojcicki’s evil and they want to tear society apart. But I just don’t think they understand the decisions that they’ve made around how to measure and maximize certain forms of engagement and what that produces in people. So I actually don’t know that I think that we’re all just wired this way and that these platforms just reflect human nature. But I do think …

Ethan Zuckerman:

I spend too much time on 4chan.

Kevin Roose:

Yeah. I mean, I also spend a lot of time on 4chan, which makes it weird, but I do think that communities have, to use a very unscientific term, vibes. When you go into a space in the physical world, the design of that space teaches you how to act in that space. If you go into a very austere and imposing government building, the design of that space, the architecture, the columns, the layout, it tells you something about how you are supposed to behave in that space. And I think the same is true of online platforms: when people go on 4chan, or they go into a Facebook group for the Boogaloo movement, or they go onto a YouTube channel for a White nationalist, the design of that community, the architecture of the platform, has a lot to do with how they will ultimately behave.

Ethan Zuckerman:

We talk a lot about norms and affordances when we look at different platforms. So some of the work that I’m doing right now is trying to map what I think of as the Facebook logic. And so, for me, the Facebook logic of course is, as you’re identifying, high engagement. It’s based in part on the economic model, right? We’re based on surveillance capitalism, so we need you both to stay and share your eyeballs, but we also need information about you: either you need to put up information or you need to give us behavioral data so that we can follow you and sell the ads. But it has other aspects to it, and part of it is centralization and facelessness. Everybody sees more or less the same Facebook, you can’t change the affordances. The moderation is invisible, central, behind the scenes, wholly untransparent.

So once you get your head around that logic, you can think about different logics, right? Like, Reddit operates on a very different logic. You move into a space on Reddit and it’s still ad-supported, it’s still tracking you, but the governance there has been handed to a team of moderators and they say, “This is r/aww, and if it’s not a cute animal, you can’t put it there. And by the way, we actually have defined cute very specifically: you can’t put up your puppy who’s recovering from surgery because that’s making people feel bad, that’s not actually cute. And if you don’t like it, you can become a moderator and become part of it.” So for me, I think that notion of taking the logic seriously makes a ton of sense. Facebook, in some ways, to me fails so much because it is one logic designed for everything two billion people might conceivably do on a social media platform. And it’s also one in which they have no control over how it’s governed, or how the speech ends up working on it.

Kevin Roose:

Yeah, and I think the other piece of this that you’ve identified is that human-led, decentralized governance is pretty effective. Wikipedia is our most functional social network. We don’t think of it as a social network, but it is, and its product is knowledge and accuracy, and that’s what’s valued in that community. And if you go into a Wikipedia page and start messing around, you’re going to get chastised, you’re going to get banned. Moderators have a vested interest in keeping it civil and accurate and not allowing a ton of vandalism, and so there are norms in that community. So I think that part of what Facebook is running up against is just scale.

Ethan Zuckerman:

Yeah.

Kevin Roose:

But I also don’t think any of this is an accident. I think they designed it this way. They are so invested in this idea that algorithms can do as good a job or better at things like assessing hate speech, at things like choosing relevant information for people, at defining authority for various news sources. They just have really under-invested in humans as a source of guidance and wisdom on the job of providing information to two or three billion people.

Ethan Zuckerman:

I don’t want to lose the algorithms piece of it, but I actually do want to dig in on your provocation that Wikipedia is one of our most successful social networks. I think that’s absolutely true. I would say you’ve got two things at work there. One is a norm, which is NPOV, neutral point of view, and this idea that we’re just going to batter stuff back and forth until we get something that we can all more or less agree with. It may take the rough edges off of it, it may give us something that’s smooth and comfortable and not necessarily as beautifully written, but it’s something that we can get our heads around. There are also some incredible technical affordances, and probably the easiest one to think about is the ability to roll back: the idea that with a click, you can basically say, “Nope, vandalism, sorry, not worth it.”

And that’s an incredibly different logic than either the chans, where you say whatever you want and the stream of time brushes it away, or the Facebook logic, where it’s your words, you stand behind them, and they’re sacred and sacrosanct to a certain extent. Wikipedia has none of that sense of sacrosanct: if you’re wrong, or if you’re right and you can’t defend it, it’s gone, and that’s the space that you’re agreeing to play within. But let me push you a little further. Can you imagine a vision of social media that isn’t just less toxic than the patterns that you’ve identified, but is good for us? Is something that actually helps us as citizens? Or is the answer just that we’ve got to step away from this stuff and step offline or into other forms of storytelling?

Kevin Roose:

Yeah. I mean, I don’t think they need to be destroyed to be better. And I think that even within some of these very flawed companies, we see examples of things that are better or worse. So take, for example, Instagram. For Instagram, you could rattle off a litany of harms: self-image, things like that. But I think most people would agree that on the whole, the information on Instagram has higher integrity than the information on Facebook. Why is that? Partly, I think, it’s because Instagram has chosen not to add features that might degrade the quality of the information on there and make it feel less intimate, less personal. So there’s no native reshare on Instagram; you can’t click a button and share something. There are also no links on Instagram, which removes a whole category of misbehavior that has plagued Facebook, which is this clickbait economy that has grown up both in politics and outside of it. And Facebook owns Instagram.

It’s not a secret that you can do these things, that you can have a successful platform without those features, but they’re so invested in the architecture of the big blue app that it’s not clear to me that they’re ever going to do that. But they have an example right there in their own company of a social network that is succeeding by not implementing certain features. Sure, it could maybe have added 5% daily active users by putting in a native reshare button, it could maybe have gotten more publishers invested in the platform by adding links; these decisions have been weighed inside the company and they’ve chosen not to do them. Goldman Sachs always has this thing about how they’re “long-term greedy,” and I think about that a lot in the context of tech firms, which I think are very short-term focused. I think they’re very focused on next quarter’s KPIs and next quarter’s monthly actives. And I just think they could have been a little bit more long-term greedy when it came to things like limiting features that might degrade the quality of their services.

Ethan Zuckerman:

You can shear a sheep a lot of times and you can skin it only once, and maybe the answer is that Facebook, in becoming this wildly popular network, has been skinning its users. Do you see any hope for Facebook or YouTube changing? You’ve been engaged in some pretty fierce and high-profile fights. Do you think the reporting that you’re doing is making a difference? Do you see them making any changes?

Kevin Roose:

I certainly hope so. I mean, I hope that what I’ve been harping on and yelling about for the last three years has mattered; I think it has. And I’ve heard from people inside these companies that their internal criticism resonates more loudly when it is in harmony with external criticism. And they actually appreciate, obviously they don’t appreciate all of the criticism, but in certain areas, like QAnon for example, it actually really helped the people inside Facebook to have people on the outside drawing attention to it and putting pressure on executives, because often, it’s no secret, these companies act in response to media pressure. So I think they have changed.

I mean, YouTube changed its recommendations algorithm, they said it’s been hugely effective in reducing the growth of some channels that were on the extreme end. I believe that because I see these creators who used to get millions of views on everything they posted now complaining that they can’t get 100,000 views on their videos. And so, I think that they have done some admirable things there that have probably cut into growth, but may ultimately have saved them a lot of headaches. Yeah, I don’t think these are irredeemable platforms, I just think that they have to choose something other than the path of least resistance. They have to choose something other than growth to be their North Star.

Ethan Zuckerman:

Do we see a regulatory role in this at all? Do you think the way to do this is to expose what’s going badly on these platforms and let people agitate inside for change, or do you think Congress or the FTC or the FCC is going to end up putting on substantial regulation at some point that’s going to force a change in this landscape?

Kevin Roose:

Yeah, I think regulation has a huge role. I mean, I used to cover Wall Street, and regulations really have changed Wall Street. There’s still some unsavory stuff going on, but it is much harder for big banks to take advantage of consumers in the ways that they did before the financial crisis. And I think that’s the result of a lot of hard choices and a lot of good regulatory action. And I actually do think there’s a good parallel in the world of tech, because in the banking world, one of the things they did that I thought showed a lot of foresight was breaking out the biggest banks, they call them the SIFIs, the systemically important financial institutions: the really big banks whose failures can cascade throughout the system.

And I think that we could do the same thing in tech, where once you reach, let’s say, 100,000,000 users, you become a systemically important technology firm, and your algorithms have to be open source now. We have to be able to see how you are recommending content and what metrics you are maximizing for. We actually have to be able to see your KPIs: what are you trying to do? And I think that regulation could make a big difference in reining in some of the excesses and in just providing more transparency. Right now we have no idea how these platforms work, despite years of investigative reporting and attempts by regulators to subpoena documents and interview people. We have no idea how Facebook or YouTube or Twitter chooses what to show us.

Ethan Zuckerman:

One thing that I’ve been starting to suggest in my work is that we actually have firms that can do algorithmic audits, much in the same way that an accounting firm can come in and look at privileged information. It has a fiduciary responsibility to the company, so it’s not going to share it around, but it also has a professional responsibility to be bound by certain standards of accounting and to raise its hand when those things are being violated. It’s a tough solution because you have to establish the standards and then build this whole space of algorithmic auditors.

But to me, in many ways, it may be more realistic than trying to open source these algorithms, which gets a little hard because it makes them highly gameable. But one way or another, I think that notion of systemically important technology institutions, the SIFI framing I think you were using, and then auditability, would be a helpful place to go. I want to see if we can end on a vaguely happy note. Kevin, what’s an aspect of the internet that still gives you joy? What’s something that you encounter that makes you feel like you did those days when you were running your own GeoCities page?

Kevin Roose:

It’s actually funny, it was a Facebook engineer’s side project: a Facebook engineer came out with this Winamp skins library. Did you see this going around?

Ethan Zuckerman:

Yeah.

Kevin Roose:

I just scrolled through that thing for hours. I mean, it was like opening a time capsule of all these skins for the Winamp MP3 player, which I, and everyone else, used in the early days. There are some things on the internet that still give me joy. I’ve been trying to do more live streaming with some of my colleagues, and in particular my colleague Charlie Warzel and I have been trying to make a run at becoming Twitch streamers, which is fun. It’s just a new mode for both of us, and it’s fun and weird in the ways that make me excited. Let’s see, I’m really into watching archival YouTube videos, doing a little bit of a time warp. I’ll watch an NBA game from 1996 and just put that on and …

Ethan Zuckerman:

Anything to get away from 2020, right?

Kevin Roose:

Exactly, exactly. So I think escapism is still one of the internet’s strong suits and I still use it for that.

Ethan Zuckerman:

So, Kevin, it was really a pleasure to have you on here. I, and so many other people, rely on the reporting that you’re doing, and it’s really helpful to ask you to take a little bit of that 20,000 or 40,000-foot view on what ails us in this space and what might be done. Again, really appreciate your time. Thank you so much for being with us on Reimagining the Internet.

Kevin Roose:

Thanks so much, Ethan. This was great.

Ethan Zuckerman:

Thanks man.

Kevin Roose:

Thank you.
