Erin Kissane

111. How Facebook Stoked Civil War in Myanmar, with Erin Kissane

Reimagining the Internet

In part one of our interview with researcher, designer and, in her own words, “ancient Internet person” Erin Kissane, we look at what happens when a platform swallows a whole society’s media landscape but abdicates all responsibility to the people using it. In this case, we’re talking about Myanmar but it’s understandable if your mind went to a certain X factor.

Erin Kissane runs wreckage/salvage, helmed the COVID Tracking Project for The Atlantic, wrote Meta in Myanmar, and recently published a report on governance in the Fediverse with Darius Kazemi, which we’ll talk about in the next episode.

In addition to Meta in Myanmar, linked above, Ethan and Erin discuss her essay “Bad shape.”

Transcript

Ethan Zuckerman:

Hey, everybody, welcome back to Reimagining the Internet. I’m your host Ethan Zuckerman. 

I’m here today with an old friend of mine, Erin Kissane. I’ve known Erin since the late 1990s, early 2000s, when she was working in the web design mines in New York City. It was clear even then that alongside the world of web design, she was a profoundly talented writer, editor and researcher. 

In the years since then, Erin has been involved with an incredible number of public good projects. 

She was one of the principals of the COVID Tracking Project at The Atlantic, where a group of individuals tried to provide the numbers that we could not get from the US government or from public health institutions, to find out what was really going on with COVID. 

In 2023, she published an absolutely amazing 40,000 page book, essentially in about two weeks, called Meta in Myanmar. It’s a deep dive into how Meta had a great deal of power over a terrible genocide in Myanmar and failed to take responsibility for it. 

She’s now doing some of the most interesting work out there about the Fediverse. She’s just one of the most thoughtful folks I know of about platforms and platform power. Erin, where are you joining us from today? 

Erin Kissane:

Hi, Ethan. I am coming in from the Oregon coast. I just want to say it’s actually 40,000 words and not 40,000 pages, just in case anyone else just had a convulsion at that idea. But thank you.

Ethan Zuckerman:

Sorry for that, Erin. For my part, 40,000 words is still pretty darned impressive for a couple of weeks. I used to live blog TED and that was about 49,000 words in four days. So I know a little bit about what that’s like. 

Erin, you run a group called studio wreckage/salvage. Looking at the web page to refresh myself on what you’re doing, there’s just a beautiful quote from Ursula Franklin. “Not all problems can be solved, but all problems can be illuminated.” Who’s Franklin and why is that quote so important to you? 

Erin Kissane:

Yeah. So Ursula Franklin is one of my all-time favorite technology thinkers. She was a physicist, a materials scientist, a German-Canadian peace activist and Quaker of Jewish descent who, with her family, survived the Holocaust. 

She has such a refreshing, warm, humane and extremely pointed approach to the reality of technology and the way technology moves in the public and private sectors. She did a wonderful series of talks, the Massey Lectures for the CBC. Those are all freely available to listen to. There’s a great book that came out of them, The Real World of Technology, which has been expanded and is wonderful. 

But I always tell people to try to listen to the actual lectures, because hearing her real human voice, she’s very funny, is a fantastic, very humane experience. And then read the book, because it’s also wonderful. 

But she’s able to cut through, I think, a lot of the kind of needless assumptions about how things have to be, and work in a very pragmatic register about what we should be doing to put guardrails on technology. She was especially focused on the way public works projects were and were not properly evaluated before they began affecting communities and lives where she was in Canada, but also with the peace movement and nuclear disarmament and all of that. So a wide-ranging, very sharp thinker. 

That particular quote, she was actually in an interview with Meredith Whittaker, whom some of us may know from Signal. And I just think it’s so heartening. As someone whose mode of love in the world is research, I find it very encouraging. 

We can’t fix all the things, but I genuinely believe that trying to understand the things is still really important, even and maybe especially in a moment where keeping touch with ground truth and with reality is especially challenging online and off. 

Ethan Zuckerman:

Illumination to me feels like a wonderful escape from a path that we talk a lot about in the world of public interest technology. I spend a lot of time at UMass helping think about how we train young people to use technology critically, to look at the patterns of power that inform the technology that we’re using. I’m often trying to help young people solve the problem of “I want to work with technology, but I’d prefer not to sell my soul in the process.” 

And we often get trapped in this sort of “I’m going to fix things” versus “I’m going to critique things” split. It’s not always possible to fix things. You’re not always the right person to fix things. It’s often possible to critique things. Critique is often very helpful, but critique can also get in the way of actually fixing things, and it can feel like a duality that you can’t get through. 

Illumination actually feels like a very helpful synthesis. There’s critique in it, but it’s critique to remove obfuscation and to get to the point where we actually could look at things, and whoever is ultimately responsible for fixing them might have a clearer path to do so.

Erin Kissane:

Yeah, if you can see the ground in front of you, you can maybe take more confident steps, right? 

Something that just came up at a conference where I was talking with people was the idea that it’s so difficult to figure out the balance between gaming out everything that can go wrong, doing all of this research, talking to people, really building a solid, sturdy understanding of the problem space and threat modeling, and actually also wanting to do things as an agent in the world. 

I think even for me, there have been moments where I’ve learned too much and just felt like maybe we should stop, maybe we should just not do the things anymore. That’s not, of course, a viable mode, and that feeling passes. 

But I do think that Ursula Franklin’s orientation and that idea of illumination, that’s the mode I want to work toward. The work that I do these days is about trying to produce and distribute knowledge that will be helpful to people actively trying things, doing experiments. 

And her work was very oriented toward doing experiments. She did a lot of very practical guidance for the necessary work of trying things in a careful way, recognizing that you’ll get things wrong and doing it in a way that you can, if necessary, reverse some of your steps and try a different trajectory. So, yeah, that feels very close to the bone for me right now. 

Ethan Zuckerman:

It seems to really inform a lot of the work that you’re doing. I’m thinking about a blog post that you wrote about three months ago. It’s basically a blog post about platforms, and I think mostly social media platforms, but maybe platforms more generally, being in bad shape. You do a wonderful job of explaining what a bad shape would mean in Go, that it would be a set of stones that are not fulfilling their purpose of controlling the board and seem likely to be cut off and die. 

But in a less poetic but very straightforward way of explaining the situation, you say, “The evidence of the past decade and a half argues strongly that platform corporations are structurally incapable of good governance, primarily because most of their central aims (continuous growth, market dominance, profit via extraction) conflict with many basic human and societal needs.” 

Erin, what’s wrong with platforms and how they’re handling governance? 

Erin Kissane:

Oh, how long do we have? 

Ethan Zuckerman:

At least 40 more minutes, yes. 

Erin Kissane:

I mean, I think probably your listenership is relatively up on events, so I won’t attempt a full historical recounting of everything that’s gone wrong. 

But I want to take a little step sideways and just say, in retrospect, maybe we can see that the idea of giving speech governance into the hands of private corporations is just kind of a goofy thing to do. 

Like, I understand why it happened. We don’t want governments to be controlling speech, and platforms were run by tech companies. Therefore, it just winds up being the case that we give speech control, global speech control, across all of these different norms and international and domestic contexts, into the hands of companies that were in no way prepared to do that work. 

I mean, who would have chosen that? No one would have said from first principles that we think technology entrepreneurs are exactly the people who are most suited to fulfill the role of governors of this precious common good that is the local and global conversations. It’s just kind of a bonkers idea. But that’s where we are. 

Ethan Zuckerman:

And Erin, it’s arguably even worse than that, because I was running trust and safety (we didn’t call it that at the time; I was calling it abuse and customer service) for one of the earliest internet platforms from 1994 to 1999. 

What we wanted more than anything else was nothing to do with this problem. My boss looked at me and essentially said, you are a cost center. The point of cost centers is to cost as little as possible. Stop costing me so much money. And that was essentially our approach to governance. How can we take up as little space as possible? 

Erin Kissane:

That’s exactly right. It’s not like the tech companies signed up for this work. I can think of very few who actively wanted to do it. They didn’t come in to do that work. No one decided that was a great way to do things. It’s just what happened. 

And it’s what happened for a variety of complicated reasons. But it’s where we are and where we have been. And now it feels like we’re discovering that we took a really strange and dangerous path. 

And now we’re seeing an explicit rejection of that responsibility on the part of many tech companies, which isn’t better, because then no one’s doing it. For many years, as I know you are particularly well aware, a lot of the large platforms tried to push the hardest work onto NGOs and civil society organizations internationally, which were largely underfunded and, especially, not well heard: responsible for things, but without the authority to actually do what they understood was necessary to keep their communities and people safe. 

So tech companies tried many ways to push that responsibility away from themselves. And now they’re perhaps succeeding, but the result on those centralized platforms is that certain kinds of things are still happening. Certain political speech is being censored. All kinds of restrictions are still in place. 

But some of the ones that I thought we’d sort of landed on as necessary, like not being the infrastructure from which genocides are launched, those have kind of gone by the wayside, which is extraordinarily precarious for the millions of people who inhabit those platforms.  So it’s just a bad situation at every level. 

Ethan Zuckerman:

Let’s just double click on the genocide link for a moment, because you’re describing a general situation in which platforms are putting too much responsibility on NGOs that are incapable of carrying it. For those of us in the know, you’re talking very specifically about Meta in Myanmar. 

Can you talk a little bit about how ill-prepared Meta was to become the dominant internet player in Myanmar, and just how poorly positioned they were to take on that responsibility? 

Erin Kissane:

Sure. And I should say, I’m speaking from the perspective of someone who was really immersed in that specific situation, but it is one of many very bad situations. It happens to be especially well documented. So that’s why that’s the one I spent the time on, which was actually about four months of research. 

So first I have to say, it wasn’t the case that Facebook became the internet in Myanmar by accident. Facebook worked very hard and, through its partnerships with telcos, became the internet in Myanmar very much on purpose, through the zero-rating work they did to make Facebook free where other internet services were not. They shouldered their way into being the vastly dominant force in internet tech in a country that, and this is important, had been under authoritarian, extremely strict rule right up until the very beginning of the mobile internet. 

So what happened there was so unusual in part because the country had been so cut off from the rest of the internet. Facebook rocked up, made the mobile internet through Facebook accessible widely to many people in Myanmar. 

And so you had a relatively naive population there. There were farmers who had mobile phones plugged into car batteries who had no wired electricity, but who had the internet through Facebook. And it was incredibly useful for them because they were able to access weather data that they’d never had before, which was extremely meaningful for their agricultural operations, right? 

So you have a huge uptake using cheap mobile phones and free internet from Facebook. And then it happens that Facebook, because no one else was set up to do it, were the speech governors. And they were governing speech in a context where they had almost no Burmese-speaking moderators. I won’t say from memory how many, but we’re talking about a very, very tiny number: one or two in Dublin, if I remember correctly. And this was for a large populace coming on fresh, with none of the knowledge that the rest of us had sort of accrued, none of the immune-system responses to things like scams and misinformation. 

And Facebook was just not doing the job of content moderation there, even in the way that they do it in the US and Canada and many places in Western Europe. So that was part of it. 

There were some really peculiar but bizarrely meaningful situations involving fonts: there were two different systems for displaying Burmese-language content, and Facebook couldn’t handle it. There were a number of bizarre technical circumstances that led to the inability to read and moderate content according to community guidelines. 

And that’s the part that I kind of knew going into this research, you know, that it was a content moderation negligence situation. But what I did not understand was that what they did went so far beyond that. 

There were NGOs in Myanmar, very technologically savvy people in civil society and digital rights organizations, who were watching hate speech build, watching what appeared to be coordinated genocidal campaigns building, and it wasn’t clear where it was coming from. There were clearly religious extremists doing some of this work. There was what looked like maybe astroturfing happening. These organizations were watching this very closely. Even the Burmese government was in contact with Facebook. 

And then as surges of violence began, even folks in Europe and the US like Susan Benesch began speaking directly to people at Facebook to say, “It looks like something very bad is happening here.” 

And what Facebook, and more broadly Meta, did was push that knowledge away from anyone who could act on it. They were warned over and over again for years. And they were warned not that something might happen; they were warned that this was actually happening: we can tie these riots, this ethnic cleansing, to things that are happening on Facebook, and we need help. And they did not help. 

At the same time, they were doing things like rolling out monetization that wiped out the digital media landscape in Myanmar and essentially replaced it wholesale with click farms and other sorts of adversarial organizations. 

It turned out that the Myanmar military was running a very sophisticated, extremely well-trained and well-staffed, round-the-clock propaganda operation that laid the groundwork for the genocide, mostly on Facebook, but also on YouTube and on blogs. They were everywhere, running massive inauthentic account networks that really made it seem to ordinary people in Myanmar as though the vastly prevailing sentiment was that this demonized ethnic minority, the Rohingya, were committing all of these horrific crimes, were committing atrocities, were a real danger to their women and their children. And that, according to the UN and their team who were on the ground, did lay the groundwork for atrocities. 

Ethan Zuckerman:

I was in Yangon in 2013 and 2014. I was there in 2013 with Open Society Foundation visiting some of those independent media organizations that were trying to figure out how to cope with the shift to Facebook. And they were essentially saying, “Our websites are useless, everything has to go direct to Facebook, there’s no other way for people to look at it.”

In 2014, I was invited to give a speech at a media freedom conference in Yangon, which on the one hand sounds a little bit absurd given that Myanmar had emerged from military dictatorship not long before. But in the neighborhood at that particular moment, Myanmar actually looked pretty good compared to some of its neighbors. 

Unfortunately, a big part of that was that the military government was figuring out how to use this new media environment, how to use some of the new freedoms within it to put forward these false narratives. What really struck me: like you, I’ve heard statistics of anywhere between one and four Burmese speakers moderating for the entire nation at the moment the genocide came to light. 

I also know a lot of those organizations that were being asked to report. Not only were they very fragile and under-resourced organizations, but they basically told Meta, “We can’t get you on the phone, you’re not listening to us. You think we’re helping you do this, but literally your systems to listen to us aren’t operating.”

How do we think about both this abdication of responsibility and, as you just pointed out a moment ago, rolling out technology in a way that sets up a set of incentives that make disinformation not only possible but profitable? How do we address the responsibilities for a platform like this? 

Erin Kissane:

I think a lot about the international courts and their lack of traction on handling war crimes, atrocities and crimes against humanity in the sort of traditional modes that we think of. 

I understand to some extent, having been an ancient Internet person, how we got here, but the fact that we still lack not just governing bodies, but ways of thinking about this work and governing the platforms in a way that is connected to the international body of human rights law is still kind of astonishing to me. We don’t have any framework for it. So I don’t have an answer for you on how we address it. We can talk about it. 

The UN team did an extraordinarily in-depth report on all the many factors, and I want to say, of course, Facebook and the Internet were just one factor. There were many, many factors: the military that later did the coup and is now still running the official government in Myanmar, a long, weird history of intense ethnic conflict, not just between the Bamar majority and the Rohingya, but also across many ethnic minorities in this very complex situation. 

But it does seem clear, at least to the people who were doing this research on the ground to try to sort out what had happened, that coming in in this way to this population in such a callous and careless manner did provide fuel and infrastructure for doing the necessary cultural groundwork you have to do before you can accomplish ethnic cleansing and this kind of state violence. You can’t do that work unless you’ve already done the cultural work necessary to dehumanize and denationalize your victims. 

So we are so unprepared as a set of societies for that kind of circumstance. It just feels like we’re, I don’t know, decades behind an ability to understand what’s happening to us in ways that feel…

I understand the people drawing parallels to what happened in the industrial revolution and the way in which that destabilized societies. But this is global and fast in ways that don’t really match other episodes in human civilization. So I don’t know if that was really an answer to your question. 

Ethan Zuckerman:

I do think it’s an answer in the sense that this is a fuck-up of the level where you can’t just sort of say, oops. You might actually be saying, is this a moment where senior Meta executives should be facing the ICC or other human rights courts in one fashion or another? 

Speaking as a comm scholar on all of this, it took us a while to sort of figure out, oh, hey, broadcast is really powerful. It turns out if you put a mic in front of Adolf Hitler all the time and he says the same thing over and over and over again and no one else can hear other voices, it’s going to build a lot of will within the population. We thought we knew how to deal with that sort of broadcast authoritarian media. And it turns out that the authoritarians were very, very clever and managed to figure out the dynamics of some of these new media environments faster than the platforms did, or at least faster than the platforms cared to admit. 

Let’s fast forward a little bit. So Facebook/Meta take a PR hit, which is pretty modest compared to the actual scale of what ends up happening. They end up reacting to mis- and disinfo as we get into the COVID era quite differently. 

They start quite actively taking down posts in the interests of public health. This turns out to alienate large swaths of the population. It becomes a hot button political issue. As we head through the 2020 election, Facebook is increasingly active in policing rumors of election fraud. And as we have the January 6 riot, you see Trump and others deplatformed from the platform in a way that looks at least like responsible governance from one point of view, perhaps an overstep against political speech from another point of view, clearly a place in which there’s going to be some counter-reaction. 

As you put it, we’re now in an entirely different moment with the platforms. So bring us from 2014 to today on how these big platforms are currently thinking about governance. And then let’s use that to pivot to the Fediverse. 

Erin Kissane:

That’s a busy decade to do in one question. I think there are probably scholars of platform trust and safety who can go into depth on the specific moments in which platforms have gone through pendulum swings like the ones that you’ve described. Obviously, at the moment, we’ve swung the other way. 

A lot of the centralized platforms have drastically reduced their trust and safety workforces even before the very public sort of abdications of responsibility for monitoring hate speech and that sort of thing. We’ve seen that pendulum move multiple times. 

Obviously, there is a connection to what political force is in power in the US. So I see frequently the argument that the technology companies are just responding to whomever is in power, whoever can threaten them, they’re going to either moderate or not according to essentially whoever controls the White House and Congress. 

I’m not completely sure that that’s the full picture because to tie this back to an earlier piece of our conversation, the tech companies don’t want to be in that business. I think they are much happier when they can step aside from that work. 

It’s very difficult for me to feel like I can make any confident statements about what directly governed the platforms’ choices during, for instance, the first couple of years of the pandemic in the US and, as you said, through the election and January 6th. You can say there’s a straight line from that to a bunch of things. 

I don’t know. I think we’re too close to it to really know. There’s a strong narrative that they overstepped, people got really angry and that’s why we’re where we are. You can say that about maybe two dozen separate forces in society right now. So I don’t know that the explanatory power is actually there for that. 

But I think we can see what the platforms are actually interested in. If you look at all the various whistleblower disclosures and insider accounts that have come out over the last two decades, there is a consistent interior narrative. And I’m going to continue to talk about Meta because they are the worst actor; in that way, they give YouTube and others cover by just being so blatantly terrible. 

But their internal narrative of really not wanting to do this work at all has never changed. 

The Wall Street Journal did exceptionally good reporting on this: Facebook would make decisions to clamp down on certain kinds of speech, either in the political zone or the public health zone. 

And then as soon as possible, there would be a directive usually from literally Mark Zuckerberg saying, “Okay, we’re done. So roll it back.” I think there’s a very clear narrative that they didn’t want to do it. They don’t want to do it. They’re not good at it. It’s the wrong model. 

So we can want them to do things and we can agree with their choices in certain periods, but it doesn’t make the structure any sounder. Ultimately, they are the wrong shape to make good governance decisions for us as societies. 

So the global population, the global majority, was still experiencing really serious, severe under-moderation on human rights concerns, on not just hate speech but speech inciting violence. 

We saw that in Ethiopia. We’ve seen political destabilization campaigns globally that continued to happen even as Facebook, having sort of had its hands smacked over Myanmar, was trying to, I don’t know, LARP good governance in the US for a while. But as soon as they could stop, they stopped. So I don’t know. I think it’s probably wise for us to just accept at this point that these are not the companies. 

These are not the kinds of institutions that are actually structurally capable of doing whatever good governance of the internet really is, which I don’t think we have figured out yet. 

