
39 Tracy Chou Wants to Help You Avoid Trolls

Reimagining the Internet

While some big social media companies are working to use AI to combat harassment, Tracy Chou has a simpler solution: put users in control of what and who they see on their feeds. In this week’s episode, Tracy tells us about her app Block Party, a clever and radical set of tools to protect users from trolling and abuse.

In addition to founding Block Party, Tracy is a founding member of Project Include and an activist advocating for more women-friendly tech workplaces, penning influential articles such as “Where are the numbers?”.

Transcript

Ethan Zuckerman:

Hey, everybody. Welcome to Reimagining the Internet. My name’s Ethan Zuckerman. I’m your host, and I am thrilled to be here today with Tracy Chou. Tracy is the Founder and CEO of Block Party. Tracy is also one of the co-Founders of Project Include alongside some other remarkable people in tech, including Ellen Pao and Y-Vonne Hutchinson. She was an early employee at Quora and at Pinterest, and ever since she’s been involved with tech, she’s really been involved with calling attention to questions of inclusion and diversity. And we’re thrilled to have her here to talk about that and to talk specifically about Block Party. Welcome.

Tracy Chou:

Thanks so much.

Ethan Zuckerman:

Tracy, can you talk to me about how you got involved in tech and also how you got involved with activism? I’m assuming the activism was a reaction to the tech culture of Silicon Valley, but I don’t want to make a false assumption there.

Tracy Chou:

Yeah, so I have what some would consider a very charmed story of entering tech. I grew up in Silicon Valley. Both my parents are software engineers, just surrounded by tech my entire life. When I went to school, I studied electrical engineering, computer science, so kind of like the straight shot into technical work in Silicon Valley.

But the flip side of that story is that despite being on this almost perfect path into software, I actually felt a lot of headwinds. In hindsight, I would describe these headwinds as things like sexism and misogyny and just the general bias of Silicon Valley and tech culture.

So it was actually a bit of a struggle for me to enter the tech world, just feeling this resistance from peers, from managers. I did end up working as a software engineer in my first job out of school at Quora, and was kind of talked into that. I think partially out of the desperate need for software engineers that every company had, which was like, fine. Okay, please be an engineer. We need people to build things.

And in that first job was when I actually really fell in love with software engineering because I realized how magical it is to be able to build things, to make things exist where they didn’t exist before. And this was particularly pronounced at a really early stage startup. I joined as the second engineer hired onto the team. And so it really was creating things that didn’t exist.

Somehow I had gotten through two engineering degrees at Stanford without realizing that engineering was about building things, but finally, in this first job I realized what it was about. So that was really cool.

At the same time, starting to work professionally as a software engineer, I could see how much bias there was, and also the lack of representation, and how those things would impact the quality of the products we were building.

And one example I like to share is that the first thing I built at Quora was the block button, because even though at that point we only had a few thousand users, somebody was already harassing me, and I just really wanted to be able to block him and stop any notifications coming from him. And because I felt very strongly about this, I advocated for that to be my project. And the rest of the team was fine with it, so that was what I worked on.

But after I built it, I realized that if I hadn’t been there with that specific experience advocating for that, having that point of view, it was very unlikely that that would’ve been prioritized so early. And-

Ethan Zuckerman:

In fact, we’ve seen companies like Slack, with hundreds of millions of users, that had to be essentially begged by their users to create a block button; those users pointed out that just because you work with someone doesn’t mean that they’re not going to end up harassing you. So Quora in that sense, thanks to your experience, was way ahead of the curve.

Tracy Chou:

Yeah, so I felt very lucky. It goes back to that idea of like, oh, I can create something. I can make something exist that wouldn’t have otherwise. I think that was very powerful, but it also made me realize that for all the people who are not in the room, they can’t manifest those things that they desire into these products. Other discussions we had, around what do women want, were very distressing to me as the one woman in the room, a 22-year-old recent grad called upon to speak on behalf of half of humanity. I did not want to say this is what women want or don’t want, but as the only woman in the room, I had to make my best effort to at least share some perspective.

So there were some of these experiences where, simultaneously, there’s that power of technology and software engineering and also the lack of representation, which has very clear ramifications for the products we’re building. You could see that was a big problem.

On the cultural side as well, it just doesn’t feel as fun to be on a very non-diverse team. And when there are certain cultural norms that are not particularly inclusive, it’s just not a very fun day-to-day experience.

So that was part of what motivated my activism work, which I would largely describe as accidental. In the earliest days it was actually because at Quora, which is a question-and-answer site, a content site, we just needed to generate content. And so all of us who were early employees would be writing answers to questions. And the one thing I felt like I could speak authoritatively on was my own experience. I wasn’t going to be able to compete with other people early on the site who were experts in startups or VC or whatever it was.

I was just a recent grad, but I could speak authentically to what it is like to be a woman in the computer science classroom, what it is like to be a woman in tech. And that was when I first started writing about that experience of being a minority in the field. And it was very encouraging to get the feedback from other folks on the platform, even people who were not in the gender minority. Black people would write to me and say, “Maybe it’s not exactly the same experience. I’m not a woman, and I’m not actually an engineer, but I’m a Black person who works in tech, and I feel a lot of the same things that you do around being the only one, being called upon to represent my entire group, feeling a bit out of place, and being forced to conform.”

So it was very rewarding, I guess, to get that feedback and realize that there were other folks out there who were having a similar experience. Even though it’s not great that we’re all having this experience, there was at least solidarity and validation that it’s not just me, that it’s kind of a systemic issue.

Ethan Zuckerman:

Was Quora supportive of that? Were they happy that you were bringing up these issues and talking about this well-known fact, which no one loves, that the tech industry has real problems with bias and lack of diversity? Were you feeling pressured at Quora about that, or were they receptive to you talking openly about these issues?

Tracy Chou:

They were similar to many other firms at the time, which is to say not particularly progressive on this front. And one thing I found an interesting comparison was that on the engineering team, we would talk about how we wanted to build the best engineering culture, the one every engineer would want to be in. We strived to be the best engineering team in the world, the best team for engineers to work at. And I posed a question once, which was, could we also aim to be the place that women most want to work at? And the response was like, “Well, we’re not worse than other places, so that seems fine.” So it was more of a, we’re happy to be average on this front.

I felt much more support discussing diversity issues when I went to Pinterest. I left Quora about a year and a half after starting there, and I joined Pinterest, and I felt very supported there to speak about these issues. And I think that is credit to the leadership team, which is very empathetic. And I think that also speaks for itself in the product, where actually the founding team of Pinterest was three men, and a lot of the early user base was women. And I think it’s due to their empathy and ability to listen to people who are not themselves and understand what they want that they could lead a team building a product that was serving a different kind of user base.

And it was an interesting contrast between these two startups for me, where Quora was a very Silicon Valley type of startup where the earliest users were all Silicon Valley insiders. And Pinterest was very dedicated to not being on the Silicon Valley radar, putting off PR in the tech press and really focusing on being where our potential users and customers were. The focus was on being very empathetic and understanding users who are different from us, who may not be that tech savvy, or who just have different worlds, different aspirations. So I’d say Pinterest had more of a culture of empathy, which worked very well when I wanted to speak about these kinds of issues.

Ethan Zuckerman:

On one of the pages explaining Block Party, you share a sample of essentially some of the hate that you’ve received on Twitter and on Facebook for the work that you’re doing. Can you talk just a little bit about what does that feel like? Almost everybody listening to this podcast has had some experience of someone being mean to them on the internet, but you’re really experiencing something categorically different. You’re being attacked by people you don’t know, who are pissed off that you are talking about issues that are deeply pervasive in society and in tech specifically. And then those attacks are getting deeply personal, really because you’ve been brave enough to stand up and do this work. What does that feel like?

Tracy Chou:

I think at this point I’ve developed a much thicker skin. So I’m trying to remember what it used to feel like, which was not very nice. I think you can probably imagine, like for most people, when somebody says something mean to you that is personally targeted, whether or not you think it’s true, it could be feedback from a colleague or somebody horrible on the street shouting something at you. It doesn’t feel very nice.

And even if you know that you can ignore it because the person doesn’t mean anything or their opinion doesn’t matter, or they’re not going to do anything to you, it still feels bad for at least a short period of time, sometimes even longer. I’ve frequently had the experience of seeing a negative comment, then having it run through my head for hours or days.

In some cases, the more difficult cases are when there’s very long feedback written to me. And it takes me a while to parse through it and understand that it’s gaslighting or it’s very sophisticated trolling.

I’ll back up a little bit and talk about different types of harassment. There’s the high-prevalence, low-severity type of stuff, kind of like garden-variety sexism, misogyny, drive-by trolls, people who are not particularly targeting anyone, but just have stuff they want to spew at folks on the internet. And then there’s the really targeted stuff that can be a lot more disturbing, because it’ll be very persistent, sometimes spanning many years. I’ve had a number of stalker harassers who’ve been after me for five, six, seven, eight years. They’ll go away and pop back up, so there’s this whole range of stuff.

For the really high-severity, very targeted stuff, I would try to report it. In some cases it would go to the police to try to do things about it, but there’s not very much that can be done on that front.

The stuff that was not as serious but very high prevalence, I realized, was starting to really interrupt my day. I check Twitter all the time because I’m addicted to Twitter, and I should check it less, but at the moment I’m still very addicted. I just check it all the time, and I realized that at many points throughout my day, I would feel the negativity of somebody posting a really nasty comment towards me. And then I would feel it interrupt my emotional stability and just be distracted from having checked Twitter and seen something really negative.

So part of my response would be trying very hard to not check Twitter, deleting the app or turning off my phone, but I would always end up going back and then just be really frustrated, because I still want to get all the good stuff: see all the news, interesting opinions from people I respect, all the different perspectives. There’s all this really positive value that I was getting from the platform.

Ethan Zuckerman:

Can’t I use this tool that I love and really enjoy engaging with without the constant toxicity that makes me feel bad about the world as a whole and my place in it? Yeah. That does not seem like an unreasonable thing to ask of a platform.

Tracy Chou:

Yeah, and I think that is one thing that is useful to point out, which is there’s a lot of positivity on these platforms. There’s a lot of reason to be there. It’s not all bad, and I think sometimes we fall into the traps that the platforms lay for us. We get very extreme in our opinions and talk about how they’re all so terrible, but-

Ethan Zuckerman:

We’re going through a real quit-Facebook cycle, and of course people don’t talk about the fact that even with all the manifold dysfunctions of Facebook, there are some lovely things about it as well.

Tracy Chou:

Right, there’s really good stuff, which is why we’re there. If it were all a hundred percent bad, we would have no trouble leaving those platforms, but it is that balance of positive to negative.

So Block Party came about partially from these experiences dealing with harassment, but also from the experiences I had working as an engineer at different platform companies and understanding how these platforms are built, how decisions get made internally, what is prioritized, and also where a third party could sit and improve the experience for users.

So it was the combination of these experiences, I think, put me at this very cool intersection where I could think about problem solving with product and engineering and with the lived experience of dealing with this problem, understanding the trade offs and the emotional impact of this stuff.

One thing I found very interesting around trust and safety and a lot of the anti-harassment work that has been done inside platforms and with some third parties is there’s a big focus on machine learning, which I think is useful in certain cases. There’s a lot of stuff that can be automated, but there’s also a lot that machine learning can’t do. I think the emphasis on using machine learning to solve all these problems comes a lot of times from folks who don’t experience the problem and are very enamored of the technology and would like the technology to be a perfect solution.

And as somebody who does experience the problem, and in addition has built machine-learning-based moderation tools and worked on machine learning models in various capacities before, I also know what the limits of machine learning are, and I know how much really simple heuristics and user understanding can do.

If we just think about not showing potentially upsetting stuff to people, you don’t have to be perfect on your filtering mechanisms. You can drastically improve people’s mental health by just not showing them stuff and allowing them to review it when they want to.

So that is what Block Party does in its core product. We allow people to set up filtering options, and you can configure them to be higher or lower. You can change them at any time, understanding that people’s desire to engage with different types of content can change depending on emotional capacity. I’ve felt that myself: sometimes I’m more open to seeing what the internet has to fling at me, and other times I don’t want to engage. I just want to know I’m going to be interacting with trusted folks.
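To make that concrete, here is a minimal sketch in Python of the kind of rule-based filtering Tracy describes: an adjustable sensitivity level decides which mentions are shown right away and which are held for later review. This is not Block Party’s actual implementation; the levels, rules, thresholds, and fields are hypothetical stand-ins.

```python
# Minimal sketch of user-adjustable, heuristic mention filtering.
# NOT Block Party's actual code; rules and thresholds are hypothetical.
from dataclasses import dataclass, field
from enum import IntEnum
from typing import List


class FilterLevel(IntEnum):
    OPEN = 0      # show almost everything
    MODERATE = 1  # hold mentions from strangers with low-signal accounts
    STRICT = 2    # only show mentions from accounts you follow


@dataclass
class Mention:
    author: str
    text: str
    author_follower_count: int
    author_is_followed_by_me: bool


@dataclass
class FilterResult:
    shown: List[Mention] = field(default_factory=list)
    held_for_review: List[Mention] = field(default_factory=list)


def filter_mentions(mentions: List[Mention], level: FilterLevel) -> FilterResult:
    """Split incoming mentions into 'show now' and 'review later' buckets.

    Nothing is deleted: held mentions stay available, so the user can look
    at them whenever they have the emotional capacity to do so.
    """
    result = FilterResult()
    for m in mentions:
        if level == FilterLevel.STRICT and not m.author_is_followed_by_me:
            result.held_for_review.append(m)
        elif level == FilterLevel.MODERATE and (
            not m.author_is_followed_by_me and m.author_follower_count < 10
        ):
            result.held_for_review.append(m)
        else:
            result.shown.append(m)
    return result
```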

Ethan Zuckerman:

So you can be more open or less open depending on where you are with it.

Tracy Chou:

Yeah.

Ethan Zuckerman:

Block Party also has some features that at first glance seemed extremely creative to me, and I realized they are extremely creative, but they clearly come out of experience. This ability to block people who had shared or liked or retweeted a particular tweet, the ability to not only block people en masse in lists of a hundred at a time, but also pass those lists around and say, “Hey, I’m using this list to remove these people. You might want to do the same.”

How did those come about? How did you end up centering on this idea of you’ve retweeted this content that makes it likely that I don’t want to be hearing from you?

Tracy Chou:

A lot of this is just coming from user feedback and folks who’ve had to deal with harassment and the lived experience of trying to fend off the trolls and taking preemptive measures to protect themselves.

There have been various blocking tools that have been built in the past, oftentimes side projects or just hacky things that people put together in desperation to deal with these issues. But if you spend enough time on these platforms, you can often discover these patterns of behavior and heuristics that work really well. They don’t have to be perfect. You don’t need a super fancy machine learning classifier to predict the likelihood that somebody is going to be annoying to you. You can just use a heuristic like: they liked a really obnoxious, harassing tweet about you. I’ve had that experience of people tweeting nasty things about me, and for all the people who liked that tweet, I could know with high likelihood that I don’t want to hear from them.
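As an illustration of that heuristic, here is a minimal Python sketch: gather everyone who liked a harassing tweet and treat them as candidates to block, minus an allowlist. This is not Block Party’s actual code; the data and helper function are hypothetical stand-ins for whatever the platform’s API exposes (for example, an endpoint listing a tweet’s liking users).

```python
# Sketch of the "block the likers" heuristic. NOT Block Party's actual code;
# the in-memory data and helper are hypothetical stand-ins for platform APIs.
from typing import Dict, List, Set

# Hypothetical stand-in for the platform: tweet ID -> user IDs that liked it.
LIKES_BY_TWEET: Dict[str, List[str]] = {
    "tweet-123": ["troll_a", "troll_b", "friend_c"],
}


def fetch_liking_user_ids(tweet_id: str) -> List[str]:
    """Hypothetical helper: return the user IDs that liked a given tweet."""
    return LIKES_BY_TWEET.get(tweet_id, [])


def block_likers(harassing_tweet_id: str, allowlist: Set[str]) -> List[str]:
    """Return the accounts to block: everyone who liked a harassing tweet,
    minus allowlisted accounts (e.g. people you already follow and trust).

    The heuristic: liking an obnoxious, targeted tweet about you is a strong
    signal you don't want to hear from that account. No classifier needed.
    """
    likers = fetch_liking_user_ids(harassing_tweet_id)
    return [user_id for user_id in likers if user_id not in allowlist]


if __name__ == "__main__":
    # Example: block everyone who liked the harassing tweet, except friend_c.
    print(block_likers("tweet-123", allowlist={"friend_c"}))  # ['troll_a', 'troll_b']
```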

Ethan Zuckerman:

You can safely assume that those are probably not people that you’re going to have a productive conversation with.

Tracy Chou:

Yes. I’ve had the experience of people tweeting screenshots of stuff I’ve done. Like, I blocked somebody and then she tweeted a screenshot of me having blocked her, and then all her followers came to harass me. That kind of thing was like, okay, well, if you’ve been on the platform enough, you know these patterns of behavior, and you also know the potentially useful preemptive measures you can take or the counterattacks you want to be able to deploy.

Ethan Zuckerman:

But you know who else should probably know these preemptive measures and these patterns of attack? Twitter, right? Why doesn’t Twitter do this? I love this product. I love that you’ve done it, but it breaks my heart that Twitter didn’t look at your product immediately and say, “Oh my God, you’re right. Let me incorporate those into our trust and safety tools.” What’s that relationship like? How does Twitter feel about the product?

Tracy Chou:

Twitter’s been very, very supportive, which has been great. They’ve obviously had a varied history with developers on their platform in the past. I think right now, generally folks are coming to a realization that it is good to have a developer ecosystem and have other people engaged in solving these problems that you potentially can’t solve on your own as a platform. There are certain things that the platforms, I think, should take a lot of responsibility for and other things that are maybe better to open up to other folks able to build more custom solutions.

So I think that is where we are right now in the thinking between platform and third party. There’s certain things that the platform is going to do, and then things that maybe they view as not on their high priority roadmap, but would still benefit a good chunk of users. Maybe that’s better for third parties to build.

There are some challenges that platforms will also experience when they build out too much of this functionality, where they could be accused of censorship, or you’re balancing a lot of different concerns as a platform, right, where oftentimes efforts to deal with harassment, like anti-harassment, anti-spam efforts may also have impacts on the spread of misinformation or disinformation.

So one example of this on Twitter’s side would be being able to limit who replies to your tweets. So in one way you can think of it as the person who tweets being able to control their space, where the replies to their tweet is their space. And the analogy to other platforms might be if you post something, you can lock the comments on your post.

It’s a little bit different on Twitter though, where the replies are the same primacy as the original tweet. Everything is a tweet in the system. People can’t post it from their own account and just tag you, and so it’s a little bit different.

And then the other thing with Twitter is oftentimes people use the replies to debunk misinformation. And I personally, when I’m looking around at tweets going viral, sometimes I’m not sure if I should believe what’s in the original tweet. I just go scroll through all the replies and see what other people said. If there’s something that was wrong in the original tweet, probably somebody has flagged it in the replies. So limiting replies totally and letting the original author completely control that can cut off one of the best mechanisms for counteracting misinformation.

Ethan Zuckerman:

Twitter is slowly taking responsibility for some of the toxicity of its own space. Twitter can’t really do anything about Instagram or about Facebook or things like that. Facebook is not known for being open to third party tools in the same way that Twitter has been. Do you have ambitions to build Block Party for something like Facebook, or is that just unrealistic given the way that they structure their agreement with third parties?

Tracy Chou:

We would love to be cross platform, including on Facebook platforms. Instagram is more similar to Twitter in terms of how posts are structured, with simpler privacy models of public and private, and you don’t have as many complex constructs like groups. So it would be good to be able to build for Facebook’s properties as well.

What remains to be seen is if they’re going to change their stance around third party developers and tooling. You would know much better than I on this front, but it feels like regulators want to do something about social media platforms, and potentially that looks like enforced openness in some ways, whether it’s transparency, interoperability, or something else, like maybe enforced APIs around trust and safety. I would love for that to become the standard. And then that kind of facilitates the flourishing of third party customization of people’s experience.

The way that I imagine things could move in a much better future for social media is people can choose their own experiences. And there is that consumer choice around how you want to interact with a platform. So you’re not locked in just by what the platform has decided is the right algorithm for you to see content. You can actually choose your own experience based on what makes sense for you.

Some people have talked about this idea of middleware, or choose your own algorithms. And in some ways, Block Party is an example of that. Right now our product is functioning on Twitter and for your @mentions, but it is a way of controlling what you want to see. The algorithm determines your notifications, and you can change it: you can set your filtering options and improve your experience, because you’ve decided what you actually want to engage with.

Ethan Zuckerman:

So for me, Block Party is the example of the web that I want to live in. And it’s a web where we can address the problems of platforms both by pressuring them to do a better job on things like harassment, but also we can build tools that work for specific sets of users.

It would be my contention that me as a hyper-privileged white dude might want a different interface to Twitter than a woman of color who’s facing significant online harassment. Right? And so that ability to customize the experience, whether it’s just how the algorithm sorts or whether it’s how blocking occurs, that seems right to me. But it’s very, very different than how many platforms are thinking about it.

Twitter actually deserves an enormous amount of credit for making it possible for a tool like this to exist. Even very simple tools on Facebook, like Unfollow Everything, which is just a simple way of essentially saying, I want to do something that I could manually do and that would be completely legal to do on Twitter. Facebook has said, “We’re not going to let you do that.”

Tracy, at this point, I often ask people to talk about their imagined better internet. You’re sort of two steps ahead of it. You’ve imagined an internet that’s being built by more women and more people of color and is transparent about where it’s falling short on that. You’re imagining an internet where people have the tools to control their experience so that they have increased control over countering harassment. What does that world look like? We achieve these things. We have a more diverse group of people building our digital future, and the users have more control of it. Envision that world for me for a moment.

Tracy Chou:

So I think a world where our digital society is actually set up with proper governance, and people can engage in discourse and sharing ideas and pushing things forward. We can actually live up to that whole dream of the internet as it was originally, where you can connect with the most interesting people around the world, come together and collaborate in ways that never were possible before, and work on some of the most challenging problems that we face as a society.

I do think an interesting piece of the anti-harassment work that I’m doing is seeing how much harassment disproportionately affects the people we most want to hear from: the ones who are advocating for a better future, like women who are speaking up, minorities, marginalized communities, activists for climate action, or, during the pandemic, the scientists and public health experts who are trying to lead us into a better version of the world today.

Ethan Zuckerman:

I’ve taken to understanding harassment as a form of censorship, and in its own way as powerful, if not more powerful than censorship by governments or censorship by platforms. It’s censorship that disproportionately affects, as you said, the people that we most need to hear from, the people who are historically and systemically the most marginalized. On a very practical level, Tracy, how do people get involved with Block Party? How do they find the software? How do they support the work you’re doing?

Tracy Chou:

You can find us at blockpartyapp.com, so sign up for an account there. We also have subscriptions. If you want to be a supporter, we have a supporter plan, which will put you into a special category where we contact you and also share new features you can opt in to test early. You can find us on Twitter @blockpartyapp_. There are a lot of other block parties. Do not confuse us with the NFT company or the real estate company, but yeah, blockpartyapp.com.

Ethan Zuckerman:

You’re probably missing an opportunity with the NFTs there, but that’s another conversation for another time. Tracy Chou, I’m just inspired by what you’re doing. I am such a fan of people who respond to problems by imagining solutions and actually carrying them out in the world. The fact that you’re working on these questions, both at this dead practical level of making Twitter less toxic for individuals, and then working on this broader work of inclusion and diversity in our space. Thank you, seriously, profoundly for the work that you’re doing. And thank you so much for being with us on the show.