03 Safiya Noble, UCLA Center for Critical Internet Inquiry


Safiya Noble, author of Algorithms of Oppression and co-founder of the Center for Critical Internet Inquiry at UCLA, outlines her abolitionist framework for Big Tech. Recorded days after Jacob Blake was shot by Kenosha, WI police in August 2020, Noble joins us to talk about what it might look like to hold social media platforms accountable for the dangerous speech they help disseminate.

Transcript

Ethan Zuckerman:

Hey everybody, this is Ethan Zuckerman from the Initiative for Digital Public Infrastructure. I’m back with this ongoing conversation series on Reimagining the Internet. I’m thrilled to have with us today Safiya Noble, who is an associate professor at UCLA in the Department of Information Studies. She’s also the co-director of the UCLA Center for Critical Internet Inquiry. I know her, and you may know her, through her really excellent book, Algorithms of Oppression: How Search Engines Reinforce Racism. She’s really one of the most thoughtful people out there about how the systems that we use in social media can have indirect or direct harms on us in terms of racism and discrimination.

Ethan Zuckerman:

She’s someone that I’m really looking to for inspiration in terms of imagining different futures. So, Safiya, thanks for being with us, and I’m going to ask you the question I ask everyone, which is: what’s wrong with social media right now, and what should we be doing to fix it?

Safiya Noble:

Well, thank you, Ethan, for that lovely introduction. I think that one of the biggest problems with social media is that we have the speed and scale that allows some of the most harmful kinds of propaganda, disinformation, and discriminatory actions, let’s say, in advertising, to proliferate. We also have a direct channel to consumers for the distribution of all kinds of, let’s say, products and goods that might be harmful, that might be unregulated, that have no oversight, right? There are so many ways in which the speed and scale of the current large social media monopolies work that I think we have to assess the incredible damage and reconsider whether this is a model, or a part of the media system, that we really want.

Ethan Zuckerman:

I want to talk a little bit about the damage and how we assess and understand it. Early on, in your work that led up to Algorithms of Oppression, you did really memorable research about how totally innocuous searches on Google, searching for “black girls,” for instance, had a very high chance of leading toward pornography, while searching for “white girls” did not. Google has subsequently changed things. You’ll get black girls clothed when you search for “black girls” now, but for quite some time, when you were using Google to search for a phrase like “black girls,” you were getting this highly sexualized content. What’s the harm? Explain to us how that constitutes a harm, how that affects us as users of these tools and members of society.

Safiya Noble:

Well, we know from other forms of media pre-internet, for example, television and film, those industries, that when you circulate exploitative and stereotypical kinds of information or entertainment about communities, especially communities that are oppressed in our societies around the world, it actually contributes to the dehumanization of those people. So, one of the reasons why I … in that early study, I was looking at what happens when you look for girls of color. I looked at Asian girls, Latina girls, a lot of different kinds of girls and women, and found that over and over, women and girls were sexually objectified in search.

Now, the challenge here is that the way people relate to search is that it is kind of like a public library. Some people think of it in that way. They think of it as a fact checker. They think of it as kind of the first go-to to find out something about anything.

Ethan Zuckerman:

There’s an implication of objectivity.

Safiya Noble:

There is an implication. There’s not only an implication of objectivity, but when Google’s own kind of moniker is organizing all the world’s knowledge, that actual organizing makes us think that there is some type of credibility or some veracity in the process of sorting through and indexing all of this content on the web. So, I think it’s really important, as the public is engaging with these technologies, that they understand that these are actually advertising platforms. Of course, in the case of the highly sexualized content, it really has to do with the fact that the porn industry and other industries like it have more money than children and girls and even women.

So, I think we have to confront this idea that those who can buy their way to the top can control certain narratives in our society. That really is what I was trying to point to more than anything in the book: we have a lot at stake when we leave our information and knowledge landscape to these kinds of advertising companies that really are not invested in truth.

Ethan Zuckerman:

Right. So, in the case of search engines, the concern in many ways is that we have something which is a proxy for truth, maybe a stand-in for truth. It’s an extremely imperfect stand-in, but when you really unpack it, what it is is a marketplace. It is more advantageous for porn companies to be advertising on “black girls” or “Asian girls” than it is for an NGO or a women’s rights organization. Then, there’s the strong possibility that we’re going to encounter that ad content. How does this play out in social media spaces? Why is that relationship between advertising, the market associated with it, and the content that spreads on social media of such primary concern for you?

Safiya Noble:

Well, one of the things, Ethan: you and I both have been on the internet for a very long time. I’ll tell you, we remember the pre-advertising days, and of course, the many models that came about to try to monetize people’s activity and access to entertainment, and ideas, and all kinds of things on the web. So, first of all, we had this kind of publicly built infrastructure that was funded by taxpayers in the early internet days, and then the move to commercialize the space. What it means is that the largest, most moneyed actors in the world, and by that I mean organizations, businesses, governments, really are able to control the kinds of information that we come into contact with.

Now, that might be great on some levels. I think it’s really helpful. I’d love to see, for example, the universities and the public libraries of the United States have a really big, visible space for librarians and other kinds of information professionals to help curate and help us navigate through the web. In fact, you remember the early web, people thought of it as a virtual library. So, what we have instead is this profit imperative, where return on shareholder investment is really the most important imperative in the social media and big tech landscape.

What that means is that people get harmed: when you are talking about, let’s say, a little bit of propaganda, racist disinformation, hate speech, all kinds of things like that, only a little bit can grow to impact millions of people very quickly. That seems to be the collateral damage that big tech is willing to accept. They’re willing to pay very minor fines when they’re, for example, breaking the law or found in violation of civil rights law. The penalties are pretty low; they’re kind of pocket change in relation to the kind of return. What that means, then, is that the most vulnerable people in our society are the people who pay the highest cost and have the least amount of protection.

I think that’s something that’s just unsustainable. Of course, we just saw, in the last few days, Jacob Blake being shot in the back, and there’s Ahmaud Arbery, George Floyd, Breonna Taylor; there are so many people we could name. I think the way in which African-Americans in the United States, for example, are dehumanized through so much vicious, incorrect, and stereotypical information that moves through these social media platforms really has a cost. It’s not just the people that are harmed, but it’s all of us who are witnesses to the harm and who have the secondary trauma associated with the harm.

I think we have to think about how we are going to re-imagine the kind of societies we want and what role these technologies play in bringing that forward.

Ethan Zuckerman:

I think we should dig in a little bit to the situation with Jacob Blake, because it has some very direct situations where we might be able to trace harm back to social media. So, three days ago, Jacob Blake was shot in the back in Kenosha, Wisconsin by Kenosha police. He had been trying to break up a fight. He’d been entering his own vehicle when he was shot seven times in the back. As we’re recording this, his family is reporting that he’s alive, but paralyzed below the waist. There have now been nights of protests in Kenosha. Last night, things got very strange: not only did you have the police attacking protesters with tear gas, but there was a group of what appear to be Boogaloo boys, armed people claiming to protect businesses in the Kenosha area.

What appears to have happened is that one of those people, who may be a minor, shot a number of protestors and killed at least one of them. So, we’re now at this situation where we’re having peaceful protests as the result of incidents that are being documented, and the documentation is being spread on social media. We now have people coming to these protests, armed, to protect or to counter-protest, and this is a movement that seems to come entirely out of social media. “Boogaloo” is a reference to this idea that we’ll have “Civil War 2: Electric Boogaloo,” and you have white ethnic nationalists as well as a lot of just anti-government forces who seem to be egging this on.

Is this the time that we just shut down Facebook, and Reddit, and Twitter? What do we do at this point, Safiya? These tools that you correctly identify seem to be causing a great deal of harm, and this really now seems to be turning into matters of life and death.

Safiya Noble:

It is, and I’ll tell you that if there were any other media sector that was allowing for the organization of armed militias to come out and kill Americans, let’s just keep it in the US context for a moment, those radio stations would be raided by the FBI. There would potentially be a crackdown. Certainly, we know that for civil rights organizations in the history of the United States, especially the more recent history of the last 50 years, whether it’s the American Indian Movement, the civil rights movement, the Puerto Rican independence movement, or the antiwar left protesting the Vietnam War, when there were attempts to organize people, for example, to stand up for civil and human rights, the US government, in fact, raided those organizations and facilitated the shutdown of movements.

So, it’s interesting to me to see right-wing, far-right extremist, fascist, white nationalist, neo-Nazi organizations, and these are not all interchangeable terms, allowed not only to march up and down our streets with semi-automatic weapons and to threaten and even kill peaceful protestors exercising their constitutional rights, but also to have companies that facilitate their ability to communicate, gather, and organize. Of course, I’m talking about Facebook, but there are others, too. So, I find this incredibly disingenuous. I think that one of the reasons why these large social media companies have escaped accountability is because they have deluded public officials into thinking that they are simply neutral tools, that they are not responsible in any way for the content that moves through their systems.

Of course, that’s like saying that the cable television industry isn’t responsible for the content that it broadcasts because it’s just the cables underground. That’s just a ludicrous proposition. I think we’re going to have to really revisit the way in which these companies are framed. It’s not a fait accompli that they are here to stay, either. Let’s remember other technologies. If we pull back and take a longer view, we might remember that there are different kinds of industries and products and services that have come to market, based on different kinds of economic arrangements, that are no longer with us, because we decided that they must be abolished, or that they were too harmful, or that the cost was too high, or they were regulated in a way that really discouraged their kind of profiteering through exploitation. I think that we might need to take some of those lenses to some of the existing technology, too.

Ethan Zuckerman:

Before we get into abolition, and I do want to think about that with you, because I think it’s one of the most provocative and important ideas that you’re putting forward, I do want to push back slightly and say that there are different models of infrastructure. Cable TV carries Fox News even when it’s crazy. If it ends up carrying OANN rather than carrying Al Jazeera America, those are very conscious content choices. I think a lot of these tools would like to argue that they’re closer to a common carrier, closer to a telephone call or a private letter, where they’re not going to intervene and deal with the content. You obviously think that they lean closer to the cable system making programming choices than to the common carrier.

How do we sort of understand that in the case of something like Twitter? To what extent is it making those choices that are closer to cable TV and less like telephone or private letters?

Safiya Noble:

Well, it’s difficult to make a telephone call to 10 million people in 48 hours. So, I find that to be a limited way of thinking about it. I understand it kind of from the early ISP dial-up model of connectivity, but I think that our infrastructures have also evolved significantly. Now, we’re talking about broadcast capabilities that are happening in something like Twitter, or in Facebook, or other types of platforms like YouTube. So, that seems to be well past, I think, the common carrier model; we’ve far eclipsed that, let’s say, by 25 years at least. Now, we’re in a situation where you can broadcast content one to many, one to millions of people.

Networks and other kinds of broadcasting industries have to think about their responsibility. Now, they have different kinds of oversight, let’s say, by the Federal Communications Commission. I think in the case of social media and big tech broadly, we might think about looking at the Federal Trade Commission. This is one of the things that I argue in the book: there’s been so much focus, I think, on net neutrality and the FCC, again, in that kind of early model and early paradigm. But now we’re talking about direct consumer harm, and I think that we need to move there. We could even go further than the FTC and say, “Should the Department of Justice be looking at the kinds of activities that happen in these spaces and places?”

I think that the tech industry has had such a powerful lobby and a powerful hold on our imaginary about what these companies are, and that’s dangerous. It doesn’t mean that we necessarily want to do away with every dimension, but I think that our failure to properly name what these technologies are lends to the confusion. If we could frame them better, we might be able to better figure out what they are and what they should be responsible for.

Ethan Zuckerman:

A debate like Section 230 immunity feels very, very different when we’re talking about someone posting a webpage for three or four people versus someone with YouTube channels that are reaching millions of people. Somewhere in there, there has to be a distinction between what feels like that one-on-one private communication and what feels more like broadcast. You’ve talked a little bit about the notion of something parallel to a consumer financial protection bureau, some of the work that Elizabeth Warren and others have done. Can I get you to flesh that out a little bit?

Safiya Noble:

Sure. I think this is a moment, with the policy community opening up and thinking differently and more rigorously, for scholars who really understand these systems to come forward with ideas. So, of course, I have many different ideas. I’m not sure that any of them will be taken up, but I certainly have talked with you about things like a consumer technology protection board, right? Modeled on the response to what the banking and finance industry did to really harm millions of Americans through their mortgage schemes and frauds, their kind of betting against Americans and betting on their failures.

One of the greatest things, I think, we want to remember about the mortgage crisis, which triggered Elizabeth Warren starting the Consumer Financial Protection Bureau, and Rohit Chopra, who is our commissioner on the Federal Trade Commission, also worked closely on that, is that we saw the largest wipeout of black wealth in the history of the United States. Imagine all the gains since emancipation, all of the gains in terms of wealth-building, effectively wiped out by Wall Street with that gamification, which quite frankly was facilitated by new predictive algorithmic modeling that was not previously possible. So, when we think about the implications of that, I think we have to get very serious about what kinds of harms exist and what kinds of remedies are appropriate.

One of the failures, I think, of the Consumer Financial Protection Bureau is that it really didn’t provide enough remedies to individuals. So, I knew many people, for example, who lost their homes. This is anecdotal, but I’ll just say, okay, they reported Bank of America or they reported Wells Fargo. But in the end, it was the industry that got the bailout, not the homeowners themselves; they were never in a position to repurchase. Those same speculators went in and then snapped up all those properties for pennies on the dollar. If we think that’s not connected to things like the affordable housing crisis right now, then we need to dial into cities like Los Angeles, where I live, or New York, Chicago, or other major cities, where people just will never be able to own homes again.

So, I think we’ve got to think about models that provide remedies to the public for the harms that they experience, and maybe damages might actually have to be part of the model, too, so that if you harm, you are responsible in terms of damages. The American models of thinking through harm and protection in the legal system have too often fallen not on the side of victims, but rather on the side of perpetrators and large organizations and powerful people.

Ethan Zuckerman:

You and I are both very much engaged in this question of imagining different possible futures. One of the things that you’ve said that has really stuck with me is this idea that we can change economic models. You’ve made the argument that the US changed one of civilization’s greatest injustices, an economic model based on the enslavement of people based on their ethnicity. The United States that was, in many ways, literally inconceivable without the institution of slavery became a United States that is trying to figure itself out after that institution. Changes on that scale have happened and need to happen, and that makes it possible to imagine what a social media or an internet past surveillance and advertising looks like.

Talk to us about what that idea of abolition means in the social media space and what sort of space you might imagine if we abolish some of these models that we’re currently thinking about.

Safiya Noble:

Yeah. Ethan, I appreciate you bringing that in, because I have been talking for some time now about, I guess, situating my own say in the matter, if I have a say. I really do imagine myself as more of a tech abolitionist, probably, than anything. When I say that, what I’m invoking, for example, is the institution of slavery and enslavement. Of course, one of the reasons why we pull back when we look at history is because we understand that there were arguments to be made that the US economy could not function without a slave economy. It was only a handful of people who were abolitionists, who argued about the ethical and moral imperative that was at stake for the country when it built its society on the notion that there had to be a permanent, disposable underclass of people who had no rights and no say over their lives.

The effects of that are still with us. So, it’s not like that was hundreds of years ago. We are still living it. People of my generation, I’m Generation X, many of our grandparents were sharecroppers. They were just one generation out of enslavement in terms of its real practice and entrenchment. So, think about what it means to take on something like big cotton, or, if I fast forward, another industry we thought was infallible, which is big tobacco.

There was a time when people could not imagine not having big tobacco, smoking was such a prominent marker in our society. When I tell my students, for example, that the doctor who delivered me probably had a cigarette hanging out of his mouth while he was … Well, my mom probably chain-smoked right after she gave birth. Young people can’t imagine that now, but that tells you so much. Big tobacco is also really interesting because in many ways it’s parallel: it poured millions and millions of dollars into research that was favorable to its own interests. It had powerful marketing and advertising that convinced people that they needed it and they wanted it.

It had huge secondary effects on people who didn’t even choose or want to participate. It created a public health crisis that disproportionately impacted poor people and poor people of color. It was predatory. So, I think that’s also a model, another industry, we can look at and say, “Well, what happened there?” Well, what we know happened was class action lawsuits. We know that the tobacco industry had to pay restitution back to the public. It had to put up billions of dollars, firewalled off from its ability to influence, that went into research that helped the public understand the harms. Now, it doesn’t mean that people can’t smoke, but it does mean that people understand what they’re doing now when they’re smoking.

I think we need a similar kind of movement around big tech, where people at least have the opportunity to understand what they’re doing. It’s not just a fait accompli that it’s ubiquitous and it’s everywhere, even if it causes all of these various kinds of attendant harms.

Ethan Zuckerman:

So, if I’m hearing right, the model is, in some ways, sort of a post-tobacco model, a tobacco settlement model: there is a documentation of the harms, there is a holding of big tech [inaudible 00:27:18], the funds that are used for education. What do we build in its place? If we know that what we’ve built now is dangerous and exploitative, but we also know that social media is unlikely to dissolve, what does the post-big tobacco, post-big data social media look like?

Safiya Noble:

Well, I don’t think it’s a foregone conclusion that social media is here to stay. All right? The reason why I say that is because we’ve thought of other kinds of media as here to stay that really changed a lot when other opportunities or other possibilities came along. So, one of the challenges here is that if social media continues on its same track, it’s really going to devolve into such a cesspool. People are not going to want to be there. They’re not going to want to participate. We already see the cautionary tales from the makers themselves. This is why I love being a researcher, because I feel like we spend our lives trying to see and understand, and then communicate out to the public, what’s happening.

One of the things that I find interesting is that the makers, for example, of these technologies have their own nannies sign agreements that they cannot allow the makers’ own children to participate with these technologies, that they can’t post their photos to Instagram, that they can’t be on devices when they’re with the children. So, the makers already know about the addictive qualities of these technologies, about what it means to be classified, and documented, and cataloged into these systems from birth, and about how harmful the long-term consequences might be.

So, I think we’re going to see soon, for example, the first generation of children, now adults, who try to run for office, who try to become teachers, who try to become social workers, who try to do all kinds of things, and who will be damaged, in fact, or barred or precluded from doing some of those things because of this long history of their lives lived on social media. So, I want to offer that we might, through time, see that this did not serve us, that it actually undermined participation in public life and in education and employment opportunities. We’re already seeing these kinds of things emerge, and we’re documenting them.

So, I say, it isn’t a fait accompli that these things will be here forever, and maybe, maybe we don’t need industrial-scale technology. Maybe we will go like the … I always look metaphorically to other kinds of movements; maybe, like the response to the industrial-level food and farming industries, we will have a slow food, or slow tech, movement. Maybe we will find that an organic, slow, localized kind of connectivity, one that doesn’t demand 24/7 connectivity and drive an addictive quality of life, might be more enjoyable. We have no idea what’s possible and plausible, but I think it’s incumbent upon us to try to imagine and try to dream about the kinds of lives and societies and communities we want to live in. That’s what I enjoy thinking about and trying to offer into the conversation.
