The TikTok ban decision handed down by the Supreme Court in January is a clear violation of First Amendment precedent, but President Trump's refusal to enforce it is a constitutional crisis in the making. We brought Stanford Law professor, rising First Amendment star scholar, and Moderated Content host Evelyn Douek on the show to ask her one simple question about this conundrum and so much more: what the fuck?
Evelyn previously appeared on the first episode of our series on trust, “Do We Trust the Internet?”
Transcript
Ethan Zuckerman:
Hey, everybody. Welcome back to Reimagining the Internet. I'm your host, Ethan Zuckerman. I am so thrilled about this.
I have with me in conversation today really one of my very favorite people to talk about the internet with. Evelyn Douek is an assistant professor of law at Stanford Law School. She's the cohost of the Moderated Content podcast, which is one of those podcasts that I absolutely do not miss. I think it's utterly essential. I prescribe it for my students as well.
Evelyn is a really accomplished and influential legal scholar who, even though she's early in her career, has had pieces published in the Harvard Law Review and the Columbia Law Review, and writes for publications like Lawfare, The Atlantic, and Wired. She's been with us before, on episode 71 of our show, the first episode in our Trust series, "Do We Trust the Internet?"
But we're bringing her here because she's really my favorite expert on the First Amendment, despite being Australian, and Australia doesn't actually have a First Amendment. So I'm hoping that Evelyn is going to help us through some really thorny questions that are showing up in US constitutional law at the moment. Evelyn, thank you so much for being here.
Evelyn Douek:
Far too kind as always, Ethan. Thank you so much. It’s great to be back.
Ethan Zuckerman:
Well, it is seriously a joy to have you here. And I have to say, ever since January 17th, I have been saying, I really wonder what Evelyn thinks about this. And do you want to start by telling us what happened on January 17th?
Evelyn Douek:
Yeah, well, on January 17th, which feels like a million years ago. So let’s cast our mind back.
Ethan Zuckerman:
Like 20 years.
Evelyn Douek:
Yeah, it's not like anything else has happened in First Amendment land since then. But on January 17th, the Supreme Court unanimously upheld, against First Amendment challenge, the law that is colloquially known as the TikTok ban, formally known as the Protecting Americans from Foreign Adversary Controlled Applications Act, which required that TikTok either divest itself from Chinese ownership or be banned in the United States starting January 19th, which obviously is also in the past.
And our listeners may also know that TikTok is not currently banned in the United States, or rather, is currently operating in the United States. So there's obviously more chapters to tell. But that's where we start.
Ethan Zuckerman:
And in this decision, which by the way is unassigned, and that just means that we don’t know specifically which of the Justices wrote it, although people love to guess about these things.
The Justices noted that, quote, “For more than 170 million Americans, TikTok offers a distinct and expansive outlet for expression, means of engagement and source of community.” However, quote, “Congress has determined that divestiture is necessary to address its well-supported national security concerns regarding TikTok’s data collection practices and relationship with a foreign adversary.” So I apologize for starting out with a very technical legal question. But what the fuck?
Evelyn Douek:
Yeah, that was basically what I said.
Well, actually, no, honestly, if you'd listened to the oral arguments, it probably wasn't massively surprising that this is where they came out. And also, if you've, you know, spent some time with the history of First Amendment law, it also sadly isn't that surprising that when the government comes in and makes big hay about national security risks, despite that being exactly when you would want courts to be very alive and suspicious to the government's claims, history shows that that often is when courts will acquiesce. So yeah, what the fuck indeed?
Ethan Zuckerman:
And so in the simplest terms here, there is a First Amendment right to access information, right? Or at least we have some past jurisprudence that suggests that there's a right of some sort. Everybody seems to be calling back to this case from 1965, Lamont v. Postmaster General. Can you tell us a little bit about that case? Is that a useful case for understanding why people thought this might turn out differently?
Evelyn Douek:
Yeah, so I mean, there’s a whole smorgasbord of First Amendment rights that are implicated here, right? Like the right to access information is absolutely one of them. And I want to talk about Lamont because it’s an incredibly important decision.
But we should just be clear that TikTok has a First Amendment right to speak in the United States. And they were a plaintiff in challenging the law and saying, “Hey, our First Amendment rights are implicated when you try and ban us and shut us up.” And then of course, there were also the users. So they were a set of plaintiffs as well. And they challenged the law exactly as you say, saying, you know, we have a right both to speak on this platform, and also to listen, to access information, and to access the network and the algorithm that TikTok uses on this uniquely valuable platform.
Ethan Zuckerman:
Right. So we've got at least three different groups whose First Amendment rights are being implicated. TikTok has a right to speak, and depending on how we think about the Net Choice cases, which, you know, get sent back down to lower courts, but essentially say, actually, it may be First Amendment-protected speech to figure out how you're going to filter content on a platform, even if that means disproportionately filtering out content that seems to favor Republicans over Democrats, because near as we can tell there's a lot more misinformation on one side of it than the other. There might be a platform right there.
There's clearly a protected right of people to speak. And of course, normally what we end up saying is, you don't have a right to say something on Facebook, because it's a private space, and it's Facebook, not the government, restraining your speech. The First Amendment says Congress shall make no law.
But Congress here made a law, and made a law essentially saying, you cannot speak on this platform because it's owned by a foreign adversary, unless they sell it to someone who is not a foreign adversary. And then of course, there's Lamont and sort of the right to listen associated with this.
So on one side of this, we just have piles and piles and piles of First Amendment jurisprudence. Help me understand this distinction between strict and intermediate scrutiny.
It seems to be how the court ended up sort of saying, all that’s on one side of it, the government’s argument is on the other side of it. And maybe under strict scrutiny, we would take the First Amendment seriously, but under intermediate scrutiny, we don’t. What does that mean?
Evelyn Douek:
Right. So strict scrutiny is the highest, most demanding form of constitutional scrutiny that can be applied to a law. So it just means that the court will demand from the government a really compelling interest that requires this law to be passed, and proof that this is the most narrowly tailored way of achieving that interest.
So there was nothing else they could do that would burden speech less in order to make sure in this case that the national security risk was mitigated.
And, you know, there's a famous saying that strict scrutiny is strict in theory, but fatal in fact. It is really, really hard to satisfy strict scrutiny, as it should be. The idea is that when governments are passing these kinds of laws, we generally assume that they shouldn't be. And if the court had found that this was a content-based law, that is, that the government was passing this law in order to discriminate against certain kinds of speech or because of concerns about the kind of content that was on TikTok, then it would be subject to strict scrutiny.
Ethan Zuckerman:
Got it. So if, as some people were suggesting, TikTok actually was trying to sway people towards a pro-Palestinian point of view and an anti-Israeli point of view, and the argument was about the ability to access that information, and maybe the idea that the platform was pro-Palestinian—maybe that actually would have triggered strict scrutiny.
But in this case, we fall under intermediate scrutiny, which is less strict. What does it mean once the court decides we’re not under strict scrutiny? We’re under intermediate scrutiny?
Evelyn Douek:
Yeah, can I just go back to the point about pro-Palestinian speech? Because I think this is really important.
Because, you know, there were many lawmakers who had voted for the bill who were out there saying things like this, and there was reporting that, in the lead-up to getting this bill over the line, there were concerns that pro-Palestinian speech was especially prevalent on the platform—and this was, you know, post October 7th, this was in the middle of all of the campus protests, when this was like one of the biggest issues of domestic political debate in this country—there were concerns that TikTok was skewing that debate. And lawmakers were saying things suggesting that that was why they were interested in this law.
That is even worse than content-based discrimination. That is viewpoint discrimination. And that is, like, really anathema to the First Amendment, and basically per se unconstitutional. So if that had been the case, if the court had accepted that as the motivation, that would have been really exceptionally hard to justify.
Ethan Zuckerman:
It’s also, by the way, probably wrong. So one of the big things that we’ve been doing in our lab lately… As you know, we do lots of unpermissioned research, we take random samples of data sets. We have a big paper that we hope is coming out in the next month or so based on a random sampling of TikTok.
And maybe the most interesting thing that we find about a random sampling of TikTok is that the US is about the third biggest country for creators on TikTok. But it’s well behind both Indonesia and Pakistan.
Evelyn Douek:
Wow. Wow.
Ethan Zuckerman:
And in fact, it turns out that if you look at TikTok from a creator point of view, look at what videos are posted and where in the world they come from, TikTok is a Global South network. It's very heavily concentrated in South Asia. Bangladesh is an enormous presence. Even though it's been blocked in India since 2020, India is still an enormous presence just because of how much was put up there.
But it is enormously popular within many Muslim-majority countries. Even if you were just looking in terms of who's likely to have a pro-Palestinian or pro-Israeli stance, just in sheer numbers, there's a very good chance, even without thinking about, you know, our Chinese spy handlers putting a finger on the scale, that you might have ended up with something that didn't represent the balance that American lawmakers wanted to see.
But your point being, had Congress gone after this and said, we are going to try to ban TikTok either because it's too pro-Palestine, that would be viewpoint discrimination, or because it has content that we think is damaging to the status of Israel in the world, that would be content discrimination. Instead, they come out and essentially say this is a national security concern. And how is that, to use a phrase, a trump card that allows them to get around what otherwise would have been constitutional protections?
Evelyn Douek:
Yeah, so on the content-based issue: it would have been content-based discrimination if they had said, and they did in fact say this, and this is what's really remarkable about the TikTok case, they said, look, we don't know what it is that the algorithm might promote. But we're just really concerned about algorithmic manipulation, and about that convincing people to believe things that we might not necessarily want them to believe. Or they weren't exactly clear on what the threat was, but said that the algorithm could be manipulated in pro-Chinese interests.
Ethan Zuckerman:
So obviously, having a social network used by millions of people, where the algorithm might promote one point of view over the other, in a way that's opaque, and might allow an individual or an entity to have disproportionate influence over a conversation, yeah, this is unacceptable. And simply nuts to me.
So can we ask, is this just nakedly racist? I mean, we just described X under Musk, right? I mean, we just described, you know, what's happened to a very, very powerful platform in becoming, you know, quite clearly weaponized to put forward right-wing ideology.
A number of people have done experiments in Europe, looking to see how X has been supporting the AfD, the German far-right party. They found very strong evidence that you can start with essentially an empty X account and end up with nothing but pro-AfD propaganda very, very quickly. Is this just that we treat China differently than everybody else?
Evelyn Douek:
So that is what is really jarring about this, right? Because, as you said, in the Net Choice v. Moody decisions, the court had just affirmed that that kind of algorithmic manipulation of a platform is First Amendment-protected speech.
So how do we get there with TikTok? And the answer is this national security interest, where they said, well, this law was not just about algorithmic manipulation. This law was also about concerns about data security and concerns about the data going back to China, which implicates national security concerns: we don't know what China is going to do with that data, we don't know who they're going to blackmail. And the law would have been passed just because of this data security interest.
And so we're not going to worry about that other possible motivation that Congress might have had. It's enough that this data security interest would have supported the law, and that's a content-neutral justification.
Ethan Zuckerman:
All of which would be much more convincing if there were not a multi-billion dollar industry in the United States selling user data, including to the Chinese, right?
I mean, you can buy much of this data from data brokers who work internationally. This is a giant, largely unregulated industry. The European Union has actually attempted to go ahead and regulate this. This is where GDPR starts coming into play. The US arguably is massively far behind on all of this. This feels like naked sinophobia.
Evelyn Douek:
Sorry, were you wanting to rebut that statement, or? Because you may have got the wrong guest.
Ethan Zuckerman:
But Evelyn, what do you do? Right? I mean, like as lawyers, and as a law professor specifically, right? You teach about precedent and you teach, you know, sort of this notion that case law builds up into law that works in a particular way.
You know, we mentioned in passing this Lamont v. Postmaster General case from 1965. As I understand the case, and please, you're the law professor, not me. We have a wonderful activist and provocateur who publishes material from all over the place. He gets sent—Dr. Corliss Lamont gets a copy of the Peking Review that's been sent to him. He hasn't actually ordered it. But someone sends him the Peking Review.
And at this particular moment, you know, during the Cold War, the Postmaster General, pursuant to Section 305, holds up these publications and basically says, unless you send me a postcard saying you want to receive this, we're going to send it back, to try to prevent communist propaganda from making it into the United States.
Dr. Lamont brings this case and the Warren Court decides eight to nothing that having to ask to receive your mail is unconstitutional because it’s an affirmative obligation, which is an unconstitutional limitation of your rights under the First Amendment.
We are being blocked well, well, well beyond what would have been happening in Lamont v. Postmaster General. It’s very hard for me to believe that we are more sinophobic right now than we were in 1965, you know, not all that far removed from McCarthyism.
Evelyn Douek:
Yeah, so I mean Lamont is a wonderful case for exactly the reasons that you've said. It's a wonderful summary; you get full marks in my First Amendment class.
Ethan Zuckerman:
This is great. I've been meaning to go to law school for the last 30 years or so. So there you go.
Evelyn Douek:
You've got some credits under your belt. So yeah, the court says, you know, not only that this is unconstitutional, but that this regime is at war with the uninhibited, robust, and wide-open debate and discussion that are contemplated by the First Amendment, because requiring Mr. Lamont to say to the government, yes, I really did mean to receive Chinese propaganda, would obviously have a chilling effect on people receiving this information.
Ethan Zuckerman:
It’s almost as bad as making someone show a government ID to agree that they want to see pornography.
Evelyn Douek:
Right, exactly. Yeah, well, that's another one; there's so much going on these days. So yes, it really was this loud and robust defense of First Amendment values and freedoms, even when the speech comes from abroad, right? This was to say that it's not just American speech that we want to protect, but the right of American listeners to receive information even from abroad.
And you’re right, we are in this moment where you would have thought that that was really, really a very useful precedent.
And so, you know, we go back to, well, what the fuck? And I guess the answer is something like, I mean, here, I think the government's assertions of national security fears were much greater in some ways.
I think there is something sort of scary and unknown about what's going to happen to the data, right? And, like, just not a lot of facility with all of the things that you talked about, Ethan, about the data broker market and the fact that this was a really ineffective way of dealing with data security concerns.
So when the government comes in and says, you know, we have bipartisan agreement, and we don’t get bipartisan agreement on anything, but we have bipartisan agreement that this is scary. I mean, yeah, the justices are going to be scared.
And then I do think there is just something about social media that maybe felt scary. Because you're right. Otherwise, squaring this with the precedent is difficult.
Ethan Zuckerman:
So there is sort of a wonderful piece of the story that I want to make sure that we don’t skate past because it is just fantastic. And it’s basically the kids are all right part of the story, right?
So for a brief time, TikTok becomes inaccessible in US app stores. For a very brief period of time, TikTok is blocking connections from US users. And we have this sort of hilarious moment: first of all, we have TikTok users going on the app and saying goodbye to their Chinese spy handlers, with gratitude for the wonderful relationship they've had together. These are like memory reels of, you know, how we were interacting with one another.
But then the pièce de résistance was people logging on to Xiaohongshu, another Chinese-owned social video app, sometimes called RedNote, which very clearly has, you know, every sort of concern associated with it but is not actively being blocked by any of this.
It seems like even if Congress is really afraid of Chinese influence over social media, and the Supreme Court is really afraid of Chinese influence over social media, it doesn’t seem like the users of social media are particularly scared of the influence of the Chinese over social media.
Evelyn Douek:
Right. And the government's answer to that in court was, well, they're not now, but, you know, what we're concerned about is some kid that's not being adequately careful now growing up and being some important public official, or, you know, having some position of influence. And who knows what China will do with the data that they have on these people later, or, you know, in the aggregate, what kind of information they're getting. So I think it is a fundamentally paternalistic response.
Ethan Zuckerman:
One of the things that I teach my students in my class Fixing Social Media is that as soon as you hear a politician say that they're going to do something to protect children, you know, it's a really good idea to apply, shall we say, strict scrutiny to what they're proposing at that point, because it's almost always bullshit to one extent or another. It's often well-intentioned bullshit, but much of the stupidest legislative thought we've seen has to do with chasing the various different sort of vague threats associated with this.
But of course, you know, we're having this conversation. And I don't know about you. I checked TikTok, you know, before I went to bed last night. Trump made clear that he wanted to have a chance to weigh in on TikTok; TikTok CEO Shou Chew attended Trump's inauguration with a veritable party of tech baddies.
Should we be more worried about the Supreme Court apparently overturning years of First Amendment precedent in order to make a national security decision? Or should we be more worried about the constitutional crisis that comes from the fact that Congress passed a law, the Supreme Court upheld a law, and now the executive appears to be ignoring it?
Evelyn Douek:
Yeah. So great question. I mean, what a clown show, right? Like, this goes to your point, actually, that when someone says they're doing something to protect the children, we should call bullshit.
And that seems to be the case, because, you know, Congress passed this law, and then the Solicitor General came into court, and they all said, this is so necessary for national security, there is nothing else that we can do to prevent this grave national security risk to this country; court, you must uphold this law, we must ban this platform if it doesn't get sold. And then Trump says, you know, I don't know, maybe I can do a deal, and everyone just goes silent.
Like, what happened to the grave national security threat that they were so concerned about? Has it suddenly evaporated? There's been nary a peep out of Congress about how upset they are that this law that they thought was so important is not being enforced.
Ethan Zuckerman:
I mean, Evelyn, now that you mention it, there really has been nary a peep out of Congress. Has anyone checked in on them? Are they okay? I mean, you know, they've been surprisingly quiet lately.
Evelyn Douek:
You're right. Even, like, I would forgive them not speaking up about TikTok if it was that they were, you know, so busy addressing all of the other grave threats to constitutional democracy and Congress's Article One prerogatives that are going on in this country right now. But it's not that they're busy, you know, speaking up for Article One in other contexts and just haven't gotten to TikTok yet. It's just stunning acquiescence all the way down.
And you're right, it is stunning to see Congress pass this law and the Supreme Court uphold it. And the law is not ambiguous. The law is very clear about its terms. It's very clear about when it starts. It doesn't say the executive shall have discretion to enforce this law, or, you know, after this date the Attorney General may bring charges.
No, it says after this date, here are the consequences for continuing to host this application. And the President, I mean, to be fair, outgoing President Biden, because the law came into force on January 19th, also didn't enforce it against the platform or the application and said he was going to punt it to the next administration. And then Trump has invented this arbitrary deadline of 75 days, while he tries to broker a deal for the sale of the platform in the meantime.
Ethan Zuckerman:
And let's talk about those deals, because those deals are quite extraordinary. According to President Trump, there are at least four players out there that have expressed an interest in buying TikTok.
We know that one of them is Frank McCourt. Frank McCourt is a real estate billionaire, perhaps best known for destroying the LA Dodgers through a series of bad business decisions. He's fallen in love with the blockchain. He has a solution to all problems of social media called Project Liberty. He's written a book about this. So far, the only company that has been willing to take him up on his solution is a social media company called MeWe, which I think most people either haven't used or, if they have used it, have found that it's basically occupied by right-wing conspiracy theorists.
Evidently, his theory is that he wants to buy TikTok just for the videos. He doesn't want the algorithm, and he's willing to pay $20 billion for this, which seems like a whole lot less than TikTok would expect to get.
Do we think anyone is actually going to buy this thing?
Evelyn Douek:
I mean, your guess is as good as mine. I have no idea. I mean, one of the deeply problematic things about this moment is that this is all stunningly opaque. We have these little details dribbling out in the media. I don't know how much reporting you've seen, but there's not a lot, and we can't be sure if ByteDance or the Chinese government is going to go along with this and allow a sale to take place. So there's that hanging in the background.
And of course, this is exactly why we should be concerned about this extremely discretionary process that's going on, because here you have a president who is famously thin-skinned, famously sensitive to how he is portrayed in the public and on social media. And he is engaging in these backroom, closed-door conversations with people to control one of the country's biggest social media platforms. What could go wrong?
Ethan Zuckerman:
And we should mention that Trump is also quite thin-skinned. It’s not just Elon who’s tremendously sensitive about this. I mean, Donald Trump also has a bit of a temper.
So what do we think of the rumor that this is an attempt to get Elon and X to go ahead and buy the platform? Do you buy that particular conspiracy theory? Or is there any theory you subscribe to within this?
Evelyn Douek:
I don’t have any particular insight. I have no idea. Do you? Do you think, what’s your bet of what’s going to happen?
Ethan Zuckerman:
So I’m not sure I have a bet on what’s going to happen. But the theory that I really want to put on the table, and I’ll be interested to hear your reaction to this as well.
Again, my lab is doing a lot of work these days studying short video. And one of the things that we found recently in our research is that there's a huge surge in Indian usage of YouTube starting in 2021. And it makes perfect sense. TikTok gets banned in India in 2020. And near as we can tell, there's this mass migration from TikTok onto YouTube.
And so when we look at that TikTok ban in India, there are some good reasons for it. There's actual violence on the border between India and China. There are multiple incursions. It is a significantly more serious security situation than the security situation we have with China. India bans TikTok. I'm not in favor of platform bans in any case.
But what that ban does is kind of fascinating. India thinks it’s going to spark a whole wave of local short video services. There’s like 10 services that spring up to fill the void. And what happens instead is everybody flocks either to Instagram or YouTube.
So when I read this law, I basically read it as a nationalist, anti-competitive subsidy to Google and Meta worth hundreds of billions of dollars. Because if you do end up blocking TikTok, you're going to hand that traffic to those two companies.
And I think this may be where Congress has not actually calculated how this is all going to work yet. It's not going to kill TikTok. TikTok is a Global South network. They're going to be okay. The US is a pain in the ass. It's probably an attractive advertising market for them. But it is not the whole thing.
TikTok is actually a fairly small fraction of ByteDance. And my guess, just based on our analysis, is that while the US is certainly more than, you know, 10% of the advertising market, it's about 10% of the creator market on TikTok. It's just not that big a deal.
This is a subsidy to Google and Meta. And I think once Congress figures out that they’re subsidizing companies that they otherwise are spending a lot of their time beating up on, they may not be quite as thrilled about this in the long run. But I am so baffled by how our country runs at the moment, Evelyn. This is why I’m, you know, finding myself reaching out to people like you for some insight about this.
Evelyn Douek:
Yeah, I mean, I don't know that Congress thought this through. So I completely subscribe to that part of the opinion. I don't know that there was a really… I mean, the scuttlebutt is that they thought that this would just be leveraging a sale and that it would inevitably get sold. And so that would be how it worked out. So we'll still wait and see.
I mean, I think you're right that, you know, politicians love beating up on Meta and YouTube and Google. And that is a favorite pastime. But I do think there is also this American pride in the American tech industry. And of course, that's what people around the world are so outraged about, right, is that we have for, you know, decades now, accepted the American internet. We have been beholden to your American platforms coming in and being the dominant, you know, public square in Europe or Australia or wherever it happens to be, or in the Global South; all around the world, these American platforms were the dominant platforms. And in many cases, they played absentee landlord and were very irresponsible in the way that they ran their services in those countries. Of course, we know many tragic stories about that.
And now, when it's another country that is doing it, America gets worked up about it. Now, I don't want to minimize, and that account slightly minimizes the Chinese threat. And I do think the difficult thing about this, of course, is that there is something, you know, different about data security and China. So it's not quite that simple. But certainly, I do think the protectionist element here is hard to miss.
Ethan Zuckerman:
Given that I have you on the line, and given that you wrote one of my favorite papers, trying to understand what happened with social media and censorship in the 2020 election, your content cartels article, I feel like I’ve got to talk with you a little bit about how we understand what’s happened in social media over the last eight years or so.
So I’m going to give you a very compact history. And I’m going to ask whether you’re reading it sort of the same way that I do.
2016 comes about: two enormous political surprises, Brexit and Trump's election. Many, many people, particularly academics, particularly on the left, stand up and go, “Oh my God, misinformation. It's going to break democracy. We don't have a common set of facts anymore. This is why people make terrible decisions. Maybe we should do something about this.”
Evelyn Douek:
Yeah, the big fake news crisis of 2016, 2017.
Ethan Zuckerman:
2020 rolls around. And we have genuine misinformation that’s out there. We have bad information about COVID that is probably leading to people getting sick and dying. We have claims of a rigged election. And the platforms apparently act in lockstep to shut off the sitting president, which is wild, like, absolutely crazy to sort of think about.
You correctly point out, this might not be a good thing. You actually want these platforms to think for themselves, you want them to make their own decisions, having them act in concert is probably not a great way to be.
2024. All of these platforms apparently are announcing that they're no longer going to try to do fact-checking. They're not interested in figuring out what truth is. The community can figure it out. It worked so well for Wikipedia; I'm sure community notes will be perfectly fine for this. And as you noted earlier in our conversation here, the platforms now feel comfortable in public having algorithms that are clearly politically biased in one fashion or another. The current US co-presidents both own social networks that clearly are putting forward a particular point of view, whether it's Truth Social or whether it's X.
How do we read this overall situation? Have we given up on this notion of platform responsibility? Are we just accepting at this point that social media platforms are tools of propaganda?
Evelyn Douek:
Yeah, I mean, this feels like a much-needed therapy session for me, because I’m trying to puzzle through this moment as well. It feels like, I mean, it really does feel like a profound shift is going on and I’m trying to make sense of it.
I think that one of the defining dynamics of this moment is the backlash to the reaction to the techlash. So we had the techlash, the platforms were bad. So the platforms, you know, put on their suit jackets and tried to be responsible adults and, you know, invested in trust and safety and said, we're going to do better. Please forgive us. And rolled out, you know, many more content moderation rules.
And I think there are legitimate arguments that in certain cases they overstepped. They went too far. They got it wrong. And so then there was a backlash to those programs. And that is the moment that we're living through now.
And so platforms are now rolling back all of those programs, and, you know, the CEOs are literally taking off their suit jackets and becoming bro-y again.
And, you know, if there was evidence that this sort of reeling it in was the result of considered action, or, you know, trying to work out exactly what is the right balance to strike and how do we think about these really difficult issues of free speech versus online harm, and balance these really difficult equities and the concern that, you know, platforms have too much decision-making power and we don't want to overly constrict the online public sphere, then, you know, maybe I'd be all for that, if it was really evidence-based and it looked like it was something other than what it is, which is just trying to curry favor with a vindictive and mercurial president, or co-president as you've been saying.
And so, yeah, it is extremely worrying. It is all extremely opaque. It's not exactly clear the extent to which this is actually being enacted in practice. And the other thing is that, you know, we lived through this era of, like, trying to create legitimacy, and all of this, you know, hand-wringing, and then heavy is the head that wears the crown, Mark Zuckerberg saying, I need to set up an oversight board because I don't want to be making all of the decisions all by myself. It's so hard. I understand. It's important.
Ethan Zuckerman:
I’ve got to open it up to great experts out there and pay them lots of money to do it. And even if it’s slow, it’s the right way to do it because it’s so important.
Evelyn Douek:
And then he comes out with a little Instagram video or whatever it was saying, actually, I’ve changed my mind. I’m going to change the rules unilaterally overnight.
Ethan Zuckerman:
Right. And so, yeah, now it's that trans people can be called cockroaches, and that's just fine. Yes. Exactly. So to be very clear, that is not the position of the show or the position of me, and so on and so forth. Yes. Right.
Evelyn Douek:
That's, you know, the oversight board has spent all of these years making these decisions, like finally parsing the hate speech rules and saying, here's how you need to change them and make them more certain. And then Mark Zuckerberg literally makes those changes by himself overnight. And so, yeah, I think that we are in a profoundly different moment.
And I don't know if it's just going to be cyclical, that we're going to have a backlash to the backlash to the reaction to the techlash, or whether this is now a new normal. But yeah, I have real whiplash.
Ethan Zuckerman:
So cyclical, in many ways, these days feels like an optimistic read, right? I think that's right. It suggests that we might come back from what seem like a bunch of very extreme points. If this is a therapy session, you know, maybe this is the point where we both break down in tears.
But a lot of hope for the continued success of the American experiment seems to rest on the courts at the moment. We have a legislature that appears missing in action. We have an executive that clearly is behaving in a monarchical fashion.
The only branch that seems to be worth having any hope in right now is the judiciary. And I got to say, it’s not a moment for a lot of confidence in the US judiciary. We were having a long national conversation about eroding trust in the Supreme Court, real worries about how precedent was sort of coming into play.
We've just had a conversation here in some detail that suggests that even people who follow the law very closely and understand it quite deeply have some trouble understanding the analysis that leads us to a situation like the one with TikTok. And it also raises this issue that even if the Supreme Court has made this decision, you know, the Chief Justice has made his decision, now let him enforce it. We're not actually seeing these things hold. How confident should we be that the courts are going to help us out of this mess, Evelyn?
Evelyn Douek:
Gosh, I feel like I can only make a fool of myself by answering with any level of certainty in this moment. It’s, you know, one of my favorite, well, not one of my favorite, but my first question for colleagues and friends when I see them these days is, how worried are you? Because I’m trying to work out how to calibrate. And I don’t know. But the answer is often very worried.
So, you know, breaking that down, I mean, specifically about the First Amendment, let's just take that part of this, because that is what we've been talking about, and that is what I'm most qualified to talk about.
I mean, one of the silver linings, and let's try and, you know, bring this back to the TikTok decision that we started with. One of the silver linings is I do think there was some deep unease with what the court was doing. The court spent a lot of time in that decision trying to narrow its precedential value, kept saying, you know, this is really a one-off, this is an exceptional case, and noting that it might not be, or should not be, bound by this. It didn't want to be bound by this in many future cases.
Now, what actually happens in practice is anyone’s guess, like they don’t get to decide how an opinion necessarily gets read by history.
But there was deep unease. And there's even this concurrence from Justice Gorsuch, where he gets so close to saying, no, this is too far. He couldn't quite get there. But he does recognize that this is like the Red Scare. This is like the famous Supreme Court cases that birthed the modern First Amendment. And he sees that that is what is going on here.
Ethan Zuckerman:
And we're reading this in the wake of Net Choice v. Paxton, which suggests that First Amendment arguments actually still have some weight, still have some precedent, that there is the possibility of thinking hard about a very complex issue, like what the speech rights of a platform might end up being. So maybe this is just a bad decision, and a really weird time?
Evelyn Douek:
I think it's possible. I mean, so far, the courts have been standing up to a lot of the violations and unlawful actions of this administration. And I don't think that that is going to be uniform; I don't think everyone's going to get everything that they want from the courts. But I do think that the First Amendment commitment does run deep. And maybe I'm going to sound really naive here in retrospect. It's entirely possible, and it's something I am very worried about.
I mean, the thing about the TikTok decision that always made it a potential weak point or a vulnerable point is the xenophobia and the concerns about foreign speech. And so when it comes to the speech of immigrants or foreigners, that is an area that I am actually very worried about.
But it is also possible that this deep First Amendment commitment in this country does mean that we end up with some decisions that really vindicate it, at a moment when that is only all too necessary.
Ethan Zuckerman:
Well, Evelyn, on that small, tiny note of hope, let's grab what little hope we can at this moment in time.
Evelyn Douek:
Very, very tiny. But yes, let’s take it.
Ethan Zuckerman:
And thank you so much for being here. There really is no one I would rather talk to about these issues. It’s just such a pleasure to have you here. Thank you so much for being with us.
Evelyn Douek:
No, it’s a real pleasure. Thanks for having me.
