Eric Gordon has spent his career as a social scientist trying to understand how city governments can use technology to better engage their citizens. But he's learned that technology doesn't matter if governments aren't willing to listen and citizens don't feel listened to. Add surveillance technology to the mix, and technology doesn't build trust in democracy; it does the opposite. This week on Reimagining, Dr. Gordon tells us about his new efforts to empower citizens with data and get their governments to listen to them.
Eric Gordon is Professor of the Practice of Journalism in the College of Communication at Boston University. He has written books including Meaningful Inefficiencies (Oxford, 2020) and edited volumes such as Civic Media: Technology, Design, Practice (MIT, 2016).
Transcript
ETHAN ZUCKERMAN:
Hey, everybody, welcome back to Reimagining the Internet. I’m your host, Ethan Zuckerman.
I’m here with my friend and collaborator and thinking partner and person who I’ve greatly admired for many, many years, Dr. Eric Gordon. Eric is a professor of the practice of journalism studies at Boston University, director of the Center on Media Innovation for Social Impact. Eric for many years was based at Emerson College, where he co-led the Engagement Lab.
He’s a founding member of the Rethink AI Collective, which is doing civic co-design with communities in Dorchester and Mattapan, two Boston neighborhoods, as well as some experiments with civic games in the US, Bhutan, Egypt, and Romania.
He’s the author of lots of books. One of the latest is Meaningful Inefficiencies: Civic Design in an Age of Digital Expediency from Oxford University Press. He’s the co-editor of a volume called Civic Media: Technology, Design, Practice, which I’m proud to be part of and which has really ended up being one of the central books around this idea of civic media.
We’re going to end up talking about a new book that Eric is working on and this broader question of civic media, community engagement, and more than anything else, this question about whether technology might help us listen to communities. Eric, welcome.
ERIC GORDON:
Thank you. Happy to be here.
ETHAN ZUCKERMAN:
Really glad to have you. We had the chance to catch up recently at an event focused on AI over at UMass Amherst. You have had a long voyage within the area of civic media that sort of led you to be thinking about and encountering the world of AI. Let’s go a little bit back to the beginning. Can you talk about some of your early work with communities in Boston and elsewhere—what led you to start looking at media as a path towards civic engagement?
ERIC GORDON:
Yeah, thanks for the question. Again, it’s really a pleasure to talk to you about this. I started my career looking at questions of media representation in cities. I was particularly interested in how media from the beginning of the 20th century onward was actually shaping the way that people experience cities and the way that ultimately cities became represented and captured in the public imagination.
As I started doing that work, it became pretty clear to me that what I was missing in these questions of representation was the question of generation, of really how people lived within cities and how people interacted with each other and the way that that actually informed representations outside of cinematic representations and television representations.
It was really what was happening on the street, in community, in collaboration between community and institutions. Early on in my academic career, I started turning my attention to these questions of interaction and ultimately the way that people experienced the city. That led me to the question of how that experience of the city was mediated by technology. That really started the journey that I’m still on today.
ETHAN ZUCKERMAN:
Eric, walk me through some of the greatest hits of Engagement Lab, this very long-running project you had over at Emerson College, which did so much work in the city of Boston where Emerson is based.
ERIC GORDON:
Yeah, I feel like the origin story goes back to 2007. I met some people over at the Berkman Klein Center at Harvard (it was called the Berkman Center at that point), including a guy named Gene Koo. I also started working with someone who was brand new to the city of Boston, a guy named Nigel Jacob, who later helped found the Mayor’s Office of New Urban Mechanics in Boston.
When we started working together in 2007, we began talking about this new technology called Second Life. There was all this hype around Second Life, and there were cities that were building islands in Second Life that represented those cities. There was a Berlin island, there was a New York island, and these simulated places became just sort of reproductions of the physical environment. We started talking about what we could do differently in Boston.
We decided that instead of just building a simulation, we wanted to create a mechanism, a platform for interaction, where we could use the Second Life platform as a way of inviting people into imagining what was possible in the city. We connected with the planning department at the city. We worked directly with other city agencies. We said, “Let’s use this technology to do something.”
ETHAN ZUCKERMAN:
This was a heavily hyped technology at the time. This was an early virtual reality, although it wasn’t headset-based. It was a three-dimensional representation of physical space, a little clunky, more popular with the geeks than it ever became in the mainstream. You looked at this and said, “Everyone’s trying to do it. What would it mean to have meaningful civic participation in this space?”
ERIC GORDON:
Exactly. What we did was we created a process around it. We said, “Can this technology actually help people? Can it augment the way that people deliberate? Can it change the way that we interact with the things that don’t yet exist through simulation and visualization?”
We did this for about a year and we ran planning processes. We were doing something really interesting. I know that because the people within the planning office were very suspicious of what we were doing. They were very afraid.

ETHAN ZUCKERMAN:
Always a good sign.

ERIC GORDON:
Exactly. They were very afraid that people had a sense of ownership over the things they created within the medium, and the planners were uncomfortable with that.
Essentially, what happened is that in one project there was the planning of a park. We created a simulation of the park and had people take virtual objects and place them where they wanted them to be, and then essentially generate visualizations that could be handed over to the city. They did that in groups. We brought people together physically, they worked together, and they handed the visualizations over.
The city came back and said, “No, we can’t actually… We can’t call these statements. We can’t call these expressions. We need to call them virtual sketches.” They said they wanted to back away from it and say, “These are just sketches. These are just ideas because we don’t want the responsibility of people wanting to see realized what they put in virtual form.”
That actually opened my mind to the fact that there’s an incredible amount of power that this technology offered people. That project was the beginning of, again, a pretty long journey of asking how else we can use this tech to unlock imagination and perhaps shift the power dynamics involved in the way that people interact with institutions.
From there, I started developing games, both digital and physical. I started exploring uses of social media. I started exploring all sorts of ways to shift that power dynamic I was seeing, where institutions were putting a hard line, a red line, between what community voice was and what they were able to do with that voice.
ETHAN ZUCKERMAN:
There’s a real tendency to think of technology and civic engagement from an efficiency paradigm. We’re going to bring technology into the mix and everything’s going to move more quickly. We know through recent experiences like DOGE that efficiency is often code for something else. Yet efficiency feels like it should be a good thing to aspire to.
You’ve talked about constructive and productive inefficiencies in civics. Sometimes the forms of participation that you’ve been trying to cultivate are the furthest thing from efficient. Talk a little bit about prying apart technology and efficiency, two concepts that tend to be paired together.
ERIC GORDON:
The work that I’ve been doing has always been in some tension with civic tech and the whole civic tech movement, a movement that I feel like I’ve been a part of, but always a bit distant from. So much of what was happening at that time, and really continues to happen, is that the technology was seen as a way of cutting the fat, of reducing the bureaucracy, which is all good, but then in the process just getting to the end point as fast as possible. So many of the apps that were being generated at that time were doing that.
The work that I was doing was really very different. I was more interested in the technology as being able to, again, augment the meaning-making process, change the power dynamics that are involved in the interactions between institutions and people. That doesn’t happen quickly. There were opportunities to use the technology to actually slow things down.
In my book, Meaningful Inefficiencies, I explore this in depth. That was informed by the work that I was doing in game design. I’ve been really inspired by a philosopher named Bernard Suits, who wrote this book called The Grasshopper. In this book, he said that games are necessarily inefficient. What he means by that is that when you play a game, you agree to enter into a system that has a certain set of rules with an end point that you understand.
Instead of trying to get to that end point as quickly as possible, you put what he calls unnecessary obstacles in your way so that you can play. He says that the goal of any game is to play the game.
I’m not suggesting that civic life is a game, but what I am suggesting is that human systems can be robust if you’re able to exist and play within them. That’s where meaning happens, not just in getting through as quickly as possible. That was inspiring for me in that everything I was designing created opportunities for meaning-making for the humans operating within these systems, humans who were otherwise being squeezed out by bureaucracy and increasingly by technology. It was an opportunity to bring those elements back into a system that was growing to systematically exclude them.
ETHAN ZUCKERMAN:
Inefficiency also feels like an opportunity for people to find their voice, to express themselves, to participate as fully as possible.
You’ve spent a ton of time thinking about how to get people to participate in local processes. These might be planning processes, these might be governing processes. What are the biggest barriers? Are the barriers existing institutions that don’t want to give up power? Are the barriers people’s own misunderstandings of the processes? Is it people’s time and willingness to participate in these things? Why don’t we have the flourishing civic society that we hope for? Or do we, and are we just not seeing it?
ERIC GORDON:
Well, we don’t. Democracies are based on trust. One way I would put together all those things you just mentioned is that the lack of trust is actually the main barrier. It manifests in unwillingness to participate, in perceptions that the institution is not willing to give up power, and in some cases in the institution actually not being able to give up power because it doesn’t trust citizens. The trust moves in both ways. It’s not just citizens’ trust in government; it moves both ways. All the research is showing that trust in institutions has been in steady decline. There are not good trend lines here.
These questions of building trust or cultivating a concept of trustworthiness is something that we need to pay attention to. Getting through things quickly is not really the way to do that.
I point out the distinction between confidence, on one hand, and faith on the other. I think we have to think about both of these things as we try to understand the trust landscape. Confidence is what happens when you do the same thing over and over again and it seems to work out well. We have confidence in a lot of the tools that we use. Many of us have confidence in systems like Amazon because we can use them quickly. We can get what we want quickly, and ultimately that manifests in something that feels and looks like trust.
Faith, by contrast, is this idea that one trusts an institution to act in a way that is predictable, something that goes beyond the immediate transaction, a predictable interaction that is manifested through values of some sort.
I trust that an institution is going to do something independent of specific transactions. I trust that they’re going to act in a particular way. Faith and confidence sometimes are in some kind of tension.
My work, I feel, has been focused on this question of faith, and so much of the civic tech work has been focused on confidence. I feel like we now have to bend toward faith. This is where inefficiency comes in. This is where meaning-making comes in. That’s how people are going to regain some level of trust in the institutions that are facilitating our democracy.
ETHAN ZUCKERMAN:
Are there success stories from focusing on the faith side of this rather than the confidence side of this? If it is that dichotomy, which I think is a helpful dichotomy, what’s the theory of change in restoring faith rather than restoring confidence?
ERIC GORDON:
I’m going to answer your question by skipping your question, Ethan. I’m going to say something a little bit different, which is I don’t know. I don’t know what the theory of change is because I’ve also had a reckoning with my own work and a sort of understanding that the work that I have been doing for nearly two decades may have been misguided. Not misguided, but ineffective.
I’ll tell you why, because my theory of change has been to create opportunities for people to make meaning in their interactions with institutions so that they can trust the process, so that they can trust each other, and so that they can be empowered to take action as necessary.
The problem was, well, let me say one more thing: I was so invested in creating valuable speech from the people who were participating. Giving them the opportunity to deliberate, to get the ideas right, to do it in a slow, methodical way, and then deliver it over to the institution.
What I learned over time was that the institution wasn’t doing anything with all of that speech. This is part of the problem. We were solving one problem, which we were creating a sort of civic capacity using the technology, which is still incredibly valuable to do. It’s just not the entire solution.
There’s this other side of things, which is what happens after people speak. What is the capacity and the willingness of the institution to do something with it? This is the transition into my work right now, which is focusing on listening, because I feel like this is what we haven’t been focused on.
ETHAN ZUCKERMAN:
I think these crises of faith are incredibly helpful and valuable, particularly when they come from people who’ve tried very, very hard to pursue a worthwhile and meaningful idea, whether we’re focusing on faith in institutions, confidence in institutions, or trust in institutions, as my work has been.

I think coming to the realization that that hard work isn’t yielding as much as we would like it to, and that we have to reframe the problem, is a really interesting one. I have the advantage over our listeners in that I’ve read at least the proposal for your new book, and I know a little bit about the shift that’s underway. The new book centers on this idea of generative listening.
If I were to try to summarize it, it would be that we keep asking citizens to speak up and raise their voices and participate in processes, and that’s incredibly stupid if we’re not going to find a way to listen to them. Particularly, we are challenged by this idea of listening at scale.
Maybe we can listen to a single constituent who manages to get us on the phone or manages to say something particularly articulate at a meeting that we’re at. But one of the fundamental challenges of democracy is that representatives would need to be able to listen to thousands, hundreds of thousands, millions of people to do their jobs. How does generative listening help us with this fundamental asymmetry in representative democracy?
ERIC GORDON:
Yeah. Like you said, the realization for me was that institutions are great at collecting data, or at least at saying they’re collecting it, and they’re terrible at listening to it. Not necessarily because the people within institutions are bad actors. It’s often just a lack of capacity to do something with it.
I’ve mostly studied cities, that level of government. What I’m seeing and what I hear from people in cities is: we have so much data that we’re just sitting on and we don’t know what to do with it. What happens is that every time there’s a community process, there are surveys, there are town hall meetings, there are even digital tools being deployed that may even be a little bit playful, and they’re bringing in all of this data. Then ultimately what happens is that there is a very brief summary, and a few humans go through, pull out some insights, and then take action.
Mostly what happens is that institutions are engaged in what I call closed system listening, which is that they reach out to publics on very specific things. A survey is a question: they can get some input on a specific question, and then maybe the institution can modify its decisions or its behaviors based on what it hears.
What is not happening is what’s often called open system listening: the ability to take in sentiment and understanding from a range of different people, even groups you hadn’t thought to look out for, who are actually out there, who have ideas, who have formed groups within civil society. It’s the ability to take that in and do something with it, which is not being done effectively at the institutional level.
For me, the nut that we need to crack right now is building the capacity of institutions to effectively make sense of vast amounts of data in ways that they haven’t been able to before. I like to think about it as a dog chasing a squirrel. If the dog catches the squirrel, it has no idea what to do with it. That feels like what happens now. Governments can reach out to listen to people, but when they actually get something meaningful, they have no idea what to do.
I think at this point—and this is where I’m a bit of a techno-optimist—the solution rests with technology, because we can’t do this on a one-to-one, human-to-human scale. We have to be thinking about how institutions can listen at scale.
ETHAN ZUCKERMAN:
So one of the conversations you and I have had over the years is about the technologies of listening. I’ve made the argument that the petition is a way to try to listen at scale. It’s not a very good technology for it. Opinion polling was very much intended as a way of listening at scale. Much denigrated, often turned into a political weapon, but an extremely powerful set of technologies in its own right.
Some of the technologies you’re starting to look at are in this sort of generative large language model space. Why might AI be a critical factor in this problem of listening at scale?
ERIC GORDON:
On a very basic level, what AI allows us to do is summarize and pull insights from large amounts of unstructured data. It’s not always good at that, but we know it can do it. That affordance is pretty new for institutions to consider what to do with. So one possibility of AI is the ability to take in and analyze data. The other thing that AI is potentially able to do is give access to analysis to people outside of institutions.
One of the challenges that we’ve had is that petitions and surveys, as you’ve talked about, have been data collection mechanisms where the data comes into the institution, the institution processes the data, pulls out some insights, and sometimes communicates what it learns. The people who provide the data are excluded from that process.
Then we’ve had really important movements to try to address that. For example, the open data movement for well over a decade has tried to fix this problem by making data available to people. It’s been really successful at making data available. What hasn’t happened so much is that the data has not necessarily been usable. There’s all sorts of open data out there, but unless you’re a data scientist or have some understanding of what to do with the data, mostly it’s just used by the institutions as it was before.
The possibility of AI that I get enthusiastic about is that we now have this opportunity to use all that infrastructure, all that open data that we have, and other kinds of data, and to actually provide analysis capacity to people who have been denied it for so long.
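To make that concrete, here is a minimal sketch of the kind of analysis layer being described: handing a pile of unstructured public comments to a language model and asking it for recurring themes. It assumes the openai Python client with an API key in the environment; the model name and the sample comments are illustrative placeholders, not anything from the projects discussed here.

```python
# Sketch: "open system listening" over unstructured public comments.
# Assumes: `pip install openai` and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY automatically

def summarize_comments(comments: list[str]) -> str:
    """Ask a chat model for recurring themes across free-text comments."""
    joined = "\n".join(f"- {c}" for c in comments)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative; any capable chat model works
        messages=[
            {
                "role": "system",
                "content": (
                    "You analyze public comments for a city agency. "
                    "List recurring themes with rough counts and quote "
                    "one representative comment per theme."
                ),
            },
            {"role": "user", "content": joined},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    sample = [  # hypothetical comments, standing in for survey/meeting data
        "The 28 bus is always late during school hours.",
        "We need more shade trees on the avenue.",
        "Morning bus service on the avenue is unreliable.",
    ]
    print(summarize_comments(sample))
```

The same pattern can point the other way: an institution can expose this analysis step to residents, so the people the data describes can query it themselves.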
ETHAN ZUCKERMAN:
I had a conversation on stage a couple of months ago with your colleague and collaborator, Santiago Garces, who is the CIO of the city of Boston. We were at an event at the Museum of Science in Boston on democracy and disability. We were talking about how technology could open democracy to participation from people who have visual or other impairments.
One of the ideas that really knocked me out that Santi put forward was: yes, we’re getting better at doing data visualization. We’re doing a better job of taking these data sets and creating pictures that people can encounter. But rather than trying to provide good descriptions of those pictures for people who are visually impaired, why not pair a large language model with that data set and let people ask questions and interrogate it? For me as an AI skeptic, it was one of the first moments where I said, oh yeah, that actually sounds much better.
You have been interested for a long time in this idea that tech can lower barriers to civic engagement. Do we think that building systems like this, systems that allow the people the data is about, the people the data is meant to benefit, to engage in conversation with that data, is finally going to lower that barrier? Or is that part of a more complex equation?
ERIC GORDON:
Yeah, I think I’m old and wise enough to know now that it’s not going to solve the problem entirely, but I do feel like it’s a part of the solution and it’s what we need to be doing. Part of this is offensive and part of this is defensive. What’s happening right now with corporate AI is that it is being used to listen to people at scale.
We have not only an opportunity but a responsibility to figure out how the same technology can be used to listen to people at scale with an agreed upon set of democratic values. This is why I can’t sit back and give in to my AI skepticism, which I share with you, but I feel like there is again this responsibility that we have to come up with alternative models that are grounded in values that allow us to use this technology to move this power blockade or to push it away, or to chip away at it.
Let me give you an example. There is a platform called Zencity. It’s a social media listening platform. They use AI, and what they do essentially is listen to social media conversations and produce insights based on them. Hundreds of cities have used Zencity. This all started during COVID, when Zencity was being used to identify how people were talking about vaccinations so that cities could better understand what an appropriate intervention could be.
A good example happened in the city of Pittsburgh. They used Zencity. They had announced that the technology was being used for the purpose of understanding more about vaccination opinions and sentiments. Then the city started using it also to understand what people were saying about the police and about instances of police abuse. While the people within the city perhaps had good intentions in using the technology for that purpose, there was a justifiable outcry, because all of a sudden the listening the city had disclosed as being about COVID turned into surveillance.
It turned into a hardening of the power structure, in that now they were secretly listening to people, and people reacted. The mayor had to apologize. It blew up. Now, cities that use this technology are very, very careful about how they disclose what it’s being used for.
There is a very fine line—this is how I talk about it—there is a very fine line between listening and surveillance. The difference is that listening has a clear articulation and negotiation of values, whereas surveillance does not. Surveillance is an imposition of values or a complete obfuscation of them.
Our institutions cannot be using these technologies surreptitiously. There needs to be transparency, disclosure, conversation about these technologies and our institutions have an extra added responsibility now to do that.
That’s part of the civic environment that we’re living in. Part of the need for mobilization around this work, or at least an enhanced awareness of it, is that we need to make sure that institutions are doing this. And many institutions are doing this.
Many cities in the United States and elsewhere are not only disclosing but actually opening this up for conversation, for augmentation about how these technologies are being used.
We have an opportunity for these kinds of technologies to be cultivated and led by community in collaboration with the city. This is another way that we should be looking at this. It’s not just one-way, where the city is going to listen, the city is going to disclose that it’s listening, and that’s just how it’s going to play out.
What we see happening now is we see these community efforts where communities are using these technologies so that they can be listened to. That’s a different dynamic than the institution figuring out that it needs to listen out for communities. Instead, we have communities saying, “No, you need to recognize us. We’re going to actually use these technologies so that you don’t continue to misrepresent this community.”
ETHAN ZUCKERMAN:
Give me an example of how that’s happening where a community is choosing to use one of these technologies to say, “Hey, I want to make sure you’re listening to us.”
ERIC GORDON:
I’ll give you an example of a project that I’m working on, and I can give you some other examples too, but let me start with this one because I know a lot more about it. This is a project with a community in Dorchester, which is a neighborhood of Boston. The part of Dorchester that we’re working in is majority BIPOC, and much of the violent crime that takes place in the city happens within certain communities in Boston. Dorchester is one of them.
I started working with this community because they had expressed that they kept asking the city for data about themselves. They kept asking the city for analysis, and they would often get a slow response or no response. What they wanted was a data resource that allowed them to have their own data, to control their own data, and then to be able to analyze that data.
We started building what we’re calling a hyperlocal LLM that’s focused specifically on this community. At the moment we’re building it using Gemini. We’re using corporate models as the baseline, but we’re tuning it so that it’s focused specifically on the sentiment of this very specific community.
We’re working with 26 square blocks within the Dorchester neighborhood. We’re using structured data, like the open data that we were just talking about, plus sentiment data. We’re using community meeting data, and we’re using informal conversation data that people have agreed to record.
We’re using all this data to inform this bot, essentially, which allows people within the community to ask, “Hey, what’s going on in my neighborhood right now? How are people feeling about this?” They’re then able to take that information and share it with the city.
From the point of view of the community, the community wants to own this resource. This is not something that they’re just going to hand over to the city, but we need to build governance models where the community can sustain this over time. And the city needs to be a partner in that. And this is part of that listening relationship.
Listening is always reciprocal. It’s not just a one-way thing. We don’t give information to an empty receptacle. We give information to an entity that’s actually willing to take that information in and to be transformed by it. In this case, the community is figuring out a way to speak through data that they have control over. And the city needs to be the willing receptacle of that data and be willing to be transformed by it.
And we’re working on that now. I don’t have a four-point answer to that. But this is something that is going to be necessary as more and more communities take control of their data, using the affordances of artificial intelligence to do exactly this. We need to figure out how to help institutions effectively take that information in and trust it.
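Here is a rough sketch of how a community-held, hyperlocal bot like the one described might be wired together: grounding Gemini, the model mentioned above, in a small community-controlled corpus before it answers. The retrieval step is a naive keyword match to keep the example self-contained, and the corpus lines and model id are assumptions; a real deployment would need embeddings, consent tracking, and the kind of community governance discussed here.

```python
# Sketch: a community-controlled "hyperlocal" assistant grounded in local data.
# Assumes: `pip install google-generativeai` and a Gemini API key.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # held by the community, not the city
model = genai.GenerativeModel("gemini-1.5-flash")  # illustrative model id

# Community-held corpus: meeting notes, survey excerpts, open-data summaries
# that residents have agreed to include. All entries here are hypothetical.
CORPUS = [
    "2025-03 meeting: residents raised speeding near the school.",
    "Survey excerpt: most respondents want the vacant lot turned into a garden.",
    "Open-data summary: 311 requests nearby are mostly streetlight outages.",
]

def ask_neighborhood(question: str) -> str:
    """Answer a resident's question using only community-held records."""
    # Naive retrieval: keep corpus lines sharing any word with the question.
    words = set(question.lower().split())
    context = [line for line in CORPUS
               if words & set(line.lower().split())] or CORPUS
    prompt = (
        "Answer using only the community records below. "
        "If they don't cover the question, say so.\n\n"
        + "\n".join(context)
        + f"\n\nQuestion: {question}"
    )
    return model.generate_content(prompt).text

if __name__ == "__main__":
    print(ask_neighborhood("How are people feeling about the vacant lot?"))
```

One note on the design: keeping the corpus and the key on the community side is what makes this listening rather than surveillance in the sense described above; the city receives only what the community chooses to share.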
ETHAN ZUCKERMAN:
One of the most striking stories in the outline of your book that I’ve read was about a student, I believe in Romania, having a conversation with a chatbot that’s pretending to be a teacher. And she has this really transformative experience of being listened to. We know this to be a phenomenon, going back to research with the ELIZA chatbot: people’s sense of being listened to can really be transformative for them, can get them to open up, and can get them to share things.
You seem to be talking about a scenario in which not only can everyone be listened to, but there’s actually this meaningful chance that their voice is heard and acted on, which sounds tremendously utopian to me, Eric, in the sense that I want it to happen, but it also seems to require a lot of things to work in the right way.
How do we take that lovely idea, that we all get to be heard and that our voice is meaningful, and reconcile it with this question you raised very early on, the decreased trust in institutions of all sorts? Is there a way that bringing technology into the mix helps us with that trust equation?
ERIC GORDON:
So let me start with the story of the young woman in Romania who was using this tool we had built to bring high school students into a conversation about education reform in the country. What was really amazing about this moment is that this young woman said, “I feel listened to.” And then I just sort of clarified with her that this was a chatbot, that this was AI that was listening to her, and she said, “Yes, it doesn’t matter.” What was interesting about that is that it was the experience of being listened to.
She understood. There was no deception in that experience for her. She understood very well that it was a chatbot, but she had never had the experience of getting to talk to a person in a position of authority so honestly. And she felt that that honest interaction with that person of authority was very effective for her to feel as though she had a voice.
Now, back to what I was saying before, you could do that all day long. And she could say all sorts of things, and then if no one’s listening on the other end, it does not matter. However, if people are listening on the other end and on the front end, people don’t feel listened to, it also doesn’t matter. So we have to address both of these problems at the same time.
We have to create a sense of being listened to, which is—think about it this way. When you talk to another person and you are being honest with another person, you’re far less likely to be honest with that person if you feel that that person is not paying attention to you or is hardened in their beliefs and they’re not going to change them. You’re not going to be honest. You’re not going to disclose anything and you’re going to shut down.
So part of this dynamic, then, is to say: okay, we have well-intentioned institutions that actually need to cultivate a sense of being listened to while they’re also addressing this other part of the problem. I understand that this is a utopian vision.
However, the way that I’m thinking about it, I’m breaking it down into its composite parts. And I’m saying we have to address all of these things. This is a remarkably complex situation. I’m not suggesting there is one simple solution and that all of a sudden we can snap our fingers and institutions will listen better.
I’m saying that this is a very complex dynamic that we can begin to sort of chip away at in multiple ways. And part of that is creating an experience of being listened to. Part of that is helping institutions take in and meaningfully analyze data. Part of that is helping communities use the affordances of new technologies to effectively represent themselves. All of those pieces are going to be necessary for us to move forward and to create a functioning democracy.
ETHAN ZUCKERMAN:
Eric, you’ve given us such a rich set of concepts here. This notion of putting listening at the center of representative democracy. This idea of values-centered listening, rather than surveillance. The importance of both feeling that you’re being listened to and the actual ability to listen at scale, making it possible for institutions, if they are able to transform themselves and listen, to learn from it.
I’m hoping you are feeling listened to. I’m hoping that our listeners are feeling like they’ve got some concepts and vocabulary to help them deal with a really challenging moment in civic life. Eric, I think you’re doing some of the most interesting and important work in this space right now. I can’t wait to see what you’re able to do at BU. I really can’t wait for this new book. Thanks for giving us the chance to listen.
ERIC GORDON:
Thanks so much, Ethan. I really appreciate the time.
