41 Wikipedia, The Last Bastion of Truth Online with Heather Ford


Wikipedia is the Internet’s arbiter of truth, but why don’t we ever talk about the thousands of volunteers building it together? Heather Ford, an ethnographer of Wikipedia, joins us to talk about the power struggles and community governance that make the site one of the most trusted information sources on the web.

Heather Ford is an associate professor at the School of Communication at the University of Technology Sydney and author of “Writing the Revolution: Wikipedia and the Survival of Facts in the Digital Age,” out next year from MIT Press. This episode references Ethan’s blog post from 2010 about Makmende and the Wikipedia article for the 2011 Egyptian revolution.

Ethan Zuckerman:

Hey, everybody. Welcome back to Reimagining the Internet. I’m Ethan Zuckerman. I’m your host. I am thrilled to be with you today. Today, we have a good friend and someone that I admire immensely, Professor Heather Ford. She is Associate Professor at the School of Communication at the University of Technology Sydney, where she’s the Head of Discipline of Digital and Social Media. I know her from previous incarnations, where she was involved in founding Creative Commons South Africa.

She’s also been involved with Info Camp, the Oxford Digital Ethnography Group, and Ethnographymatters.net. We’re going to be talking today about her forthcoming book, Writing the Revolution: Wikipedia and the Survival of Facts in the Digital Age. It’s coming out shortly from MIT Press.

I happen to have read the book because I’ve written the foreword for it, and I can tell you it is my favorite book that I’ve read in 2021. I really think it’s going to be a deeply important and fascinating book. Heather, it is so good to have you with us.

Heather Ford:

It is so wonderful to be here, Ethan. Thank you.

Ethan Zuckerman:

Heather, we met many years ago, talking about Wikipedia, which is a project that you and I both love deeply and have also been fairly critical of over the years. With the critique, really, I think coming from a place of love. You’ve been far more thoughtful about the project than I have. How did you get interested in Wikipedia and Wikipedians? When did this enter your world? How did you start working on these questions of open knowledge creation?

Heather Ford:

I was living in my home in South Africa and working on Creative Commons. That was in the early 2000s, and I really recognized Wikipedia as one of the best examples of free knowledge in action. At the time, it actually was not using a Creative Commons license but the GNU Free Documentation License, and it would soon move to a Creative Commons license. But it really demonstrated how open and free knowledge could work, especially for people in developing countries where I was based. And I was really excited by the fact that, in Wikipedia, you could, for example, write articles about issues and concepts from everyday life, according to your own perspective and in your own language.

So I helped organize one of the first Wikipedia editor funds in South Africa with others. I just was really excited about Wikipedia and was kind of an activist for Wikipedia. And then I became a bit disconcerted with the way that the free and open-source software movement was moving. I was really upset about the focus on particular ways of thinking, and on the US in particular.

So I went back to school, basically. I went back to graduate school at UC Berkeley. And that was when I actually read your blog post about Makmende. Now, Makmende, as you know, is a Kenyan superhero character. He had a resurgence in about, when was it, about 2009, 2010?

Ethan Zuckerman:

That sounds about right. He appeared in a video by a wonderful Kenyan indie band called Just a Band. And I would have to figure out when their albums came out, but it was probably 2009.

Heather Ford:

Yeah. And so you wrote about the story of how Kenyan Wikipedians had tried to create an article in English Wikipedia about Makmende and it had been repeatedly deleted. One of the reasons given was that it apparently wasn’t notable. And there are a lot of complicating factors in this story, but what really interested me and got me excited about it was that it made me want to learn more about the ways in which Wikipedia was closing itself to knowledge from elsewhere, in ways that really were surprising to a lot of people.

And that actually was a really big surprise. Your blog post, in those days, isn’t it funny to think about, your blog post really garnered a lot of attention in the community. It was a surprise. Nowadays, it wouldn’t be a surprise at all, right? We read these stories all the time about editors that are trying to prevent knowledge from coming into the encyclopedia, for legitimate or illegitimate reasons. But in those days it was really, really a big deal.

And so that started me on a path to try and figure out: what was it about this community, this platform, that seemed to be so open and free to the world, but was actually building these barriers to recognizing knowledge from elsewhere? What did that actually mean? What did that look like?

And so I started working on Wikipedia bias, and in the early days it was a lot of working out how Wikipedia is biased. Because no one really knew. It was a massive surprise. Can you believe, it was a massive surprise when Mark Graham, my then supervisor at the Oxford Internet Institute, created these maps which showed how Wikipedia represented places in the developed world so much more extensively than those in the developing world.

And so a lot of the time in the beginning was about showing what that bias looked like. But then I started to try to figure out, what is it about Wikipedia that enables this bias? Is it just the fact that there aren’t that many editors from these places? And my conclusion, as you know, is that, in the end, no, it’s not just the fact that there are fewer editors of color, of a particular gender, and from a particular place.

Ethan Zuckerman:

I love this new book. I just adore this new book. I adore it because you are an ethnographer and at the heart of the book is a fantastic, well told story of how the article about the Egyptian revolution becomes the article about the Egyptian revolution, but you do something that is so profoundly gutsy at the beginning of the book.

You basically redefine the field of epistemology. You basically said, “Look, let me explain to you how truth happens here in the world in 2021.” So I’m going to put you on the spot. How does truth happen in 2021? And why is Wikipedia so important in that process of determining what is truth?

Heather Ford:

Thank you, Ethan. Yeah, I mean, I would love to say it’s all my own work, but it’s really building on science and technology studies, on theorists and scholars who mostly studied how science was built, how knowledge and science was built. The way that I think about it, the way I’ve come to think about it, is that facts don’t only contain meaning, they also carry a particular materiality. That means that they are constructed in particular ways, and it’s their material form that actually lends them authority.

And so my big argument in the book is that Wikipedia has become a big source of truth in the world, not only because it’s seen as a place where multiple views have reached consensus on a particular fact, but also because of the ways in which its facts and knowledge are structured. They’re structured in modular ways that enable them to travel, to travel beyond just Wikipedia.

And if you’re studying how facts work and how truth arises, facts have to travel beyond their point of origin for them to become authoritative, right? We know that. Take the scientists who first realized that smoking causes cancer. That fact wouldn’t have had any purchase if it had stayed within the lab, or even within the documents that scientists produce, journal articles for Nature, for example.

It wouldn’t have had any purchase. Those facts needed to travel. And so what I argue in the book is that Wikipedia becomes this site of truth and of truth making because of the modular way in which its facts are structured. And, increasingly, it’s becoming a data project. So, increasingly, its facts are structured according to the rules and logics of data, and that that really enables its facts to travel much further and wider.

And that means that when you ask Siri or Google Home, “What is the capital of Israel?” It will extract facts from Wikipedia, in most cases. And so what the result of this is, is that Wikipedia becomes this massive site of struggle, because you know, people know, they’ve come to know, that it is on Wikipedia that you need to… And, in fact, Wikipedia is one of the only places on the internet today that you can edit and force change in the ways that facts are produced.

And so it becomes this really major site of struggle in which a number of different people from different groups and factions are trying to battle over the representation, because they know that it’s not just about what happens on Wikipedia. It’s actually going to influence how knowledge is represented in many other places.

Ethan Zuckerman:

Right. So if you get it into Wikipedia, you also get it into Wikidata. If you get it into Wikidata, you get it into Google, you get it into Siri. And it’s remarkable just to think about this, right? You and I are both old enough to remember that there was a time where universities told you not to cite Wikipedia. I would argue you still probably shouldn’t cite Wikipedia, although you should read Wikipedia and figure out who to cite.

But there was definitely this moment where people asked this question, “How can an encyclopedia edited by anyone be decisive, be reliable, be a believable source of knowledge?” Fast forward 10 years into the future, YouTube looks for a way of debunking conspiracy theory claims in its videos and finds itself essentially saying, “Well, what is something that we can all agree is true?”

“Well, we all agree Wikipedia is true. We’ll just use Wikipedia to do it.” And of course, Wikipedia responds and sort of says, “Well, that’s very nice. I mean, you could pay us. You could license the content.” And, YouTube, bless them, rather than giving the money to Wikipedia, decides to use Encyclopedia Britannica instead.

But how did we get here? Wikipedia’s somewhat unorthodox methodology, right? Neutral point of view, no original research, and the process of battering things back and forth to the point where you have something roughly resembling consensus. How did we get to the point where this becomes, quite literally, perhaps the most reliable anchor for knowledge in the modern world?

Heather Ford:

Yeah. I mean, it is pretty crazy, actually, to think back on the days when people laughed at Wikipedia. And other people have written so cogently about it, like Nathaniel Tkacz, for example, talking about how the truth of Wikipedia is built. And one of the foundational principles of Wikipedia’s authority, its power, is that it appears, from its core principles, that Wikipedia doesn’t take a stake in truth telling.

Wikipedia represents the neutral point of view, according to its principles. And the neutral point of view, NPOV, basically means that instead of editors coming and bringing their opinions or their original research to Wikipedia about what a phenomenon is, they absolutely have to, rather, write articles and present facts according to what reliable sources say about a phenomenon.

And the way that reliable sources are defined is pretty circular, to be honest, on Wikipedia, which means that there is definitely a point of view coming across in the Wikipedia community about what constitutes a reliable source. But the appearance is that Wikipedia doesn’t take a stake in truth, that it stands back and it just lets reliable sources tell you what is the truth about a phenomenon.

Ethan Zuckerman:

How is this a political process, as well as a knowledge-making process?

Heather Ford:

It’s entirely political, because there isn’t a two-step process to determine what is reliable in a particular context. Going back to the book, the moment… Well, there were actually multiple moments in which different news sources were calling what had happened in Egypt a revolution. And constantly, through the almost two-week process of the protests, editors were having arguments because some of them were determined to call it a revolution. The article was originally called 2011 Egyptian Protests, and they wanted to rename it to Revolution.

And they were saying, “Well, this news source says that it’s a revolution.” And then other editors would come out and say, “No, but The New York Times, for example, isn’t saying that it’s a revolution. And also, look at Google Trends. You can see here, quantitatively, that revolution isn’t being used.”

And so, just looking at how the article’s name was changed is really instructive about how authority is produced on Wikipedia, because it is through this constant wrangling over the meaning of what constitutes reliability, which is obviously very, very contextual, depending on where you sit in the world and what your point of view is.

Ethan Zuckerman:

Well, and in some senses, I think there’s this illusion of Wikipedia that these arguments are being made by purely neutral actors. I think one of the things your book does so well is explaining just how invested these actors are and just how important the politics are, behind some of this. Introduce us to Egyptian Liberal. Who is this person, and what is their work on what ultimately ends up being the article on the Egyptian revolution?

Heather Ford:

Yeah. So the Egyptian Liberal was the editor who first created the article that was originally called 2011 Egyptian Protests. He was a 20-something Egyptian by birth. He was living in the Middle East, but he went back to Egypt a number of times, and he was in Egypt for the revolution. I talked to him a few times over Skype and had some wonderful conversations.

But he’s a fascinating character. Also, I make these claims about him, but I do so knowing that I will never really know exactly who he is. All I can go by is what he has told me and what others who worked with him have told me about him. But the important facts are that he actually created the article. He first drafted the article before the first protest had even taken place.

And I found that out in my first interview, when I asked him to tell me the story of the article and he said, “Oh yeah.” I mean, he’s an activist. He was a democratic activist, and there was a lot of hope pinned on these protests. So the first version was drafted before the protests took place, and it was published just a few hours into the first protest.

And you can kind of tell, because the only citation in that first version is to an AFP article that was talking about the protest the next day. So there were no current sources. It was too early, right? It was too early for that to happen. And the Egyptian Liberal really drove this article. He also participated a bit in the Egyptian Arabic and Arabic versions of these articles, but he was dominant on the English Wikipedia version.

Ethan Zuckerman:

But you make the argument, in some ways, that the article is actually part of the fulfillment of the truth on the ground. This article is, as all spaces on Wikipedia are, a contested space. But it’s a contested space where one of the powerful contestants is someone who understands this as a revolution in the making and really brings forward that frame. Should your account of how one individual manages to be so influential be setting off warning lights and klaxons for Wikipedia administrators? Or have you instead just given us a portrait of how knowledge happens these days?

Heather Ford:

Yeah. That is a great question. I mean, this is the question I kept asking myself. What does it really mean that this person was so influential? The conclusion that I come to in the book is actually that individual editors are powerful. I mean, the Egyptian Liberal, as you say, demonstrated his understanding of how Wikipedia works. He also did things like lobbying, which is illegal. In Wikipedia, you’re not supposed to do it, but everyone does it.

And lobbying, just in the sense that he notified other influential editors that the article had gone live and that they needed to support it. So he kind of rallied the troops around this new article. But what I argue in the book is that these historic events on Wikipedia are not created just by single individual editors.

There are a lot of safeguards in place for Wikipedia to push back against single editors working alone. But what I argue is that Wikipedia is actually quite vulnerable to the work of crowds. And these are crowds of people that descended onto the article and really wrested control pretty violently away from the kind of democratic process of consensus building.

One example is that, when Mubarak’s resignation was announced, within minutes, editors, and these are new editors, people who hadn’t edited Wikipedia before, descended as a crowd on the article and tried repeatedly to change its name. Now, what ideally should happen in Wikipedia is that people have a calm, rational discussion about what should happen to the title of the article, because it’s contested, right? There’s no simple way of answering the question of what the title should be.

So they should weigh up reliable sources and then decide what had happened, what the article’s title should be. And Wikipedia policy says that Wikipedia isn’t a crystal ball. It’s okay for us to be late. It’s okay for us to come after the sources. But what actually happened, again because of the power and the authority of Wikipedia, is that all these people wanted to be the first to determine history. They wanted to write history, because that’s what you do when you are the editor who makes the edit that changes the name of the article from protest to revolution.

You have written history, in no uncertain terms. Meanwhile, all these people who had been involved in the article, some of whom might have been on the other side of the world and asleep at the time, came in thinking that their voice was being heard in this consensus process. Actually, it wasn’t being heard. The article name change had already happened, and all the editors that had previously been editing the article, the experienced Wikipedia editors, basically gave up because the crowd was so powerful. There were so many editors that they really just, I mean, they gave up.

I’ve spoken to some editors who say that. Jake [Ocaasi], who is a great Wikipedian and was really instrumental in this, said, “Well, it was the right decision. There were lots of reliable sources saying that it was a revolution, and it was okay for us to do that.” But, for me, it’s more important to look at what process is being followed here.

It’s fine, maybe, for the Egyptian revolution to be called a revolution, but what happens in another scenario, one where there aren’t those reliable sources, or where there’s just a small faction trying to control the narrative in ways that we’ve seen, actually, in other articles on Wikipedia?

Ethan Zuckerman:

I want to pivot a little bit. I mean, we’ve just beaten up a little bit on Wikipedia’s processes for finding truth, and your book makes very, very clear that they’re more complex and political than we might think. I think you and I would still agree that it is an absolutely remarkable project that has done wonderful things for the world.

On this show, we spend a lot of time thinking about how we fix social media, in particular. And we’ve hit this point where social media seems very troubled in any number of different ways. And most of the proposals that are out there are not volunteer and communitarian in the way that something like Wikimedia is, despite the fact that it’s been ludicrously successful as a model.

You’re much more likely to find someone offering the suggestion that, if we put everything on the blockchain and issue tokens, that will be the future of community. Not, let’s recruit millions of volunteers from around the world and they will jointly create this remarkable asset. So, that’s the background to say, “Yeah, why don’t we talk about Wikipedia more? Why isn’t the Wikipedian model the model for more problem solving?”

Heather Ford:

Ethan, it’s such a great question. I think about this question a lot, and I’m really surprised that more people are not paying attention to what Wikipedia has shown and how Wikipedia has solved some of the big questions around scalability and the maintenance of relatively good quality content that’s distributed across the world.

So what I think is that most people just think that Wikipedia’s an anomaly. As Richard Cooke says, “Wikipedia is the last best place on the internet.” People just think that you can’t apply any of the same principles. I think differently, as you can imagine. And one of the things that I talk about at the end of the book… If I’m beating up on anyone in the book, it’s really Google and other platforms that are extracting facts and statements and claims from Wikipedia that aren’t then adequately attributed, and that also don’t provide the right context that Wikipedia…

Wikipedia provides a lot of nuance around the facts that it presents. One of the nuances that Wikipedia provides is the breaking news template. Now, if you go to an article about a phenomenon that is just happening right now, there will be a warning template at the top of the article and it will say, “This article is subject to breaking news. Just be careful of how you read the article, because it’s changing.”

Ethan Zuckerman:

It’s a very helpful caution.

Heather Ford:

Of course. It’s unstable, right? It’s unstable. There are a lot of different people trying to edit it. And so what I find so troubling when these more powerful platforms extract claims from Wikipedia without attribution and without that context is that they don’t actually provide you any of that nuance or any of that context. They’re just spouting these facts as if they are the truth from up high, as Donna Haraway would say, as if they’re all-seeing and all-knowing, because there’s no acknowledgement of the human and material conditions under which these facts were created.

Now, that is my problem. And that is the thing that I am now really interested in figuring out how to solve. So when we extract facts, as the web moves towards this semantic future where claims are being extracted from multiple different places, how do we provide that nuance, that context, that recognition that some claims are more stable than others, not more true than others, but more stable? How do we present that in a way that’s useful to the reader? That is my next challenge.

Ethan Zuckerman:

I would add an economics question to that, which is: I don’t know if many people would say this, but I can certainly imagine a pundit saying with a straight face, “Google’s most important asset is their Knowledge Graph.” Someone’s written that somewhere as a thought piece. And Google’s Knowledge Graph, to a very large degree, is Wikidata. It is Wikipedia, turned into a form that Google can use to drive its own products. And it drives those products to provide a very, very useful service and to sell an enormous amount of advertising on top of it.

And yet that money, for the most part, is not making it back to the Wikimedia Foundation. That credit does not accrue to Wikimedia. Wikimedia ends up being that edge case, that one strange thing that no one can describe, rather than the model that we should be trying to emulate.

What I love about your work on this is that, in many ways, you attack the magic of Wikimedia. Rather than there being some mysterious process where you add a little NPOV and no original research, stir, and somehow the truth comes out of it, what you end up saying is, “No, these are actually ferocious debates. They’re hugely consequential, they’re hugely important, and they’re complex and nuanced.” Does understanding how Wikipedia actually gets made make it less magical, or does it make it more magical?

Heather Ford:

I really hope that it makes it less magical. I really hope that it demonstrates how peopled the process is, how economics, friendship, all these different types of sociality and politics go into making knowledge. And that’s not just in Wikipedia, that is everywhere. So I really hope that it shows that there’s no magic to this, and that we really have to pay attention to the materiality of it: what really goes into making knowledge, and how best to do it in a way that is inclusive and results in good quality knowledge that people can not only share and consume but also have agency over.

As soon as you cut those facts off from the source from which they were built, you cut off people from being able to actually influence them, because on Google, you cannot influence it. If you see something that is wrong, there is nothing you can do about it, other than to create a movement around it.

And so my big problem with this is not only that it’s about money. I definitely think it is about money, because these are public resources being extracted by a private firm for commercial revenue. But it’s also about politics and power and agency, and people don’t have the same agency over the knowledge that they created. And I think that is what’s unfair.

So, for me, the solution then is a public campaign to demonstrate the importance of verifiability, the importance of people being able to have an influence on the facts that are produced and reflected for the whole world. That, to me, is the important problem and challenge.

Ethan Zuckerman:

Professor Heather Ford. She is the author of the forthcoming Writing the Revolution: Wikipedia and the Survival of Facts in the Digital Age, coming to you next year from MIT Press. It looks like it’s a book about Wikimedia. It looks like it’s an ethnography, a deep ethnography of a single Wikipedia article. It is, in fact, a book about the nature of truth and, I would argue, about the nature of reality. Heather Ford, such a joy to read this book, such a joy to talk with you about it, such a joy to have you here.

Heather Ford:

Thank you so much. It was an absolute pleasure.