82. Twitter Blocked Tracy Chou’s Anti-Harassment App. Now She Wants to Fix Your Browser.

Tracy Chou, founder of Block Party
Reimagining the Internet

When we had Tracy Chou on the show in 2021, she was rolling out software to give users a revolutionary toolset to block harassment on Twitter, and she was doing it with the Twitter corporation’s help. Fast forward to today, when she’s one of Time Magazine’s 2022 Women of the Year and her work has been roundly banned from Elon Musk’s Twitter via a draconian API policy.

What’s next for Block Party? What’s the future of fighting harassment online? Couldn’t there be some kind of law? The brilliant Tracy Chou answers these questions and more.

And make sure to join the waitlist for Block Party’s new app Privacy Party.

Transcript

Ethan Zuckerman:
Hey everybody, welcome to Reimagining the Internet. I’m Ethan Zuckerman. I’m here today with Tracy Chou.

Tracy is the founder and CEO of Block Party, an amazing project that we’re going to talk a lot more about, and co-founder of Project Include alongside Ellen Pao, Y-Vonne Hutchinson, and more. 

Tracy is someone who’s been involved with the tech industry for a long time, an early employee at Quora and Pinterest, and has used her time in tech to advocate for more women and more women of color in tech. 

She’s been recognized for her work, particularly with Block Party, as one of Time Magazine’s 2022 Women of the Year. Hey Tracy, how is Elon Musk’s takeover of Twitter going for you and going for Block Party? 

Tracy Chou:
Well, one of his latest weed jokes was to introduce Twitter API pricing tiers starting at $42,000 a month for a tier that would not cover Block Party’s access. So we have recently had to put our anti-harassment tools for Twitter on hiatus. 

It’s quite unfortunate. We had a long and fruitful relationship with Twitter till this change in ownership. We were working very closely with folks at Twitter across many different teams, developer platform, health, business development, content partnerships. 

It was a really great relationship with Twitter where we got to really prove out how useful a third party ecosystem can be in fostering a healthy ecosystem and giving users control. And we were able to hold on to access for a little bit longer than most because we had an actual contract with Twitter. But that ran out on May 31st, and new API pricing would have come into effect that made no sense for us to pay. 

So, yeah. We are hoping that maybe after things settle a little bit, and given that there are folks on the team who advocate for supporting the ecosystem, maybe things will change. But that’s the current status. 

Ethan Zuckerman:
Putting aside for the moment that it really would have been more appropriate for him to charge $420 a month rather than $42,000 a month. I mean, I get that the man is desperate for money, but not only would some of us probably have signed up for the API for $420 a month, it would have been a much better joke. 

Remind us what Block Party is and does or did until Elon Musk made it impossible for you to keep offering that product. 

Tracy Chou:
Yeah. So what Block Party did was give folks more control over their day-to-day experience on Twitter as well as take control in moments of crisis. There was both an ongoing filtering mechanism. 

So, something similar to spam filtering for email, but for social. We would let you configure different options: say you don’t want to see anything from folks with fewer than a certain number of followers, or who don’t have a profile photo set, or who just created their accounts. Or you could be even more restrictive and say, “I only want to hear from people I follow or people followed by people I follow.” Block Party would then automatically mute those people and file their notifications and mentions in another folder that you could review at a later point if you chose to do so. 

You could also delegate access to somebody else to help review on your behalf, and start building in more of the community aspects. There were also tools for mass blocking. So if you needed to block not only the author of a post but all the people who had liked or retweeted it (for example, a post trying to direct harassment at you), it was very useful to be able to cut off that harassment at the source. That was also possible with Block Party. 
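The filtering Tracy describes is essentially a configurable rule engine over account metadata. A rough sketch in Python, purely as a hypothetical illustration: the field names, thresholds, and structure here are assumptions for clarity, not Block Party’s actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Account:
    followers: int
    has_profile_photo: bool
    account_age_days: int
    followed_by_me: bool
    followed_by_my_follows: bool

@dataclass
class FilterConfig:
    # Illustrative defaults; in the real product these were user-configurable.
    min_followers: int = 10
    require_profile_photo: bool = True
    min_account_age_days: int = 7
    only_follows_and_friends_of_follows: bool = False

def should_mute(account: Account, config: FilterConfig) -> bool:
    """Return True if this account's mentions should be muted and
    routed to a review folder instead of the user's notifications."""
    if config.only_follows_and_friends_of_follows:
        # Most restrictive mode: only people I follow, or people
        # followed by people I follow, get through.
        return not (account.followed_by_me or account.followed_by_my_follows)
    if account.followers < config.min_followers:
        return True
    if config.require_profile_photo and not account.has_profile_photo:
        return True
    if account.account_age_days < config.min_account_age_days:
        return True
    return False
```

The point of the design is that the rules run automatically on the user’s behalf, so the filtered mentions never reach the user unless they (or a delegated reviewer) choose to look at the review folder.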

Ethan Zuckerman:
And this is all coming from a pretty personal point of view. I mean, you designed this tool in no small part based on personal experience. You’ve done some really interesting projects calling attention to disparities in the tech industry, like how few women are employed in engineering jobs in tech. 

Clearly, as we all know, people on the internet love having their shortcomings pointed out to them. They really love it when women stand up and talk about women being underrepresented. You’ve suffered a huge amount of harassment. As someone who’s really experienced this firsthand, what’s it like realizing that Block Party is no longer something that you can do? What’s it like knowing that you don’t have it as a tool for your own protection and you can’t offer it to the many, many other people who are experiencing online harassment? 

Tracy Chou:
The most immediate reaction I have had on a personal level is that I’m less willing to post because I know what the potential emotional impact will be on me if I do get a bunch of trolls coming after me. 

And I think that’s a natural reaction that a lot of people have had in the past as well before Block Party existed. They’ve experienced these sorts of incidents and had to deal with the onslaught of abuse. A very natural reaction is to try to limit the potential damage in the future. 

And that was actually a big part of the problem that we were trying to solve: the silencing of the voices that we most need to hear from. Despite people talking about how free speech is so important and how we should allow everybody to say everything, it turns out that when you allow unchecked abuse, it actually silences voices and you get the opposite of free speech. So I’m experiencing that personally from that perspective: not wanting to post as much anymore, and feeling what it’s like to not be able to support these tools for other folks. 

It just sucks. I want people to have these tools and I think about not just the impact on the folks who still need to engage or want to engage and no longer have these protections and are taking more risks, but also the second-order effects of all the people that want to hear from them, need to hear from them and won’t be able to anymore if they’re silencing themselves. 

The other part I’m bummed about, and there are so many things to be bummed about, is that it really did feel like there was this new path opening forward. Here’s a whole different approach to solving these problems around harassment that is actually really viable, when for so long people have been saying this is just really hard to solve. 

“These are just difficult problems, don’t expect us to be able to do much better because the problems are just so hard.” And this was an excuse that a lot of folks have made from the platform side.

And it’s true that trust and safety is very difficult, but I think Block Party also showed the elegance of simple solutions that center the end user and let people choose their own experience. And I felt like we were just starting to showcase the possibilities there with an actual application that helped people. Well, we still have that proof point, but it’s less powerful when it’s no longer live. 

Ethan Zuckerman:
Yeah, let me be totally clear on this. I try to take the interviewer point of view and, you know, get the guest to tell the story here, but let me not hide my cards. Block Party is one of the projects that I most admire on the Internet. I think it was a brilliant response to an area where Twitter was just badly behind the curve. 

They weren’t doing as much work as they should have been to give people powerful anti-harassment tools. You were able to demonstrate the incredible value of having a third-party application interface with the API, and of potentially using a community to look at tweets and figure out who needed to be blocked. So many of the things you ended up doing were really best practices, like this idea of making it possible for someone else to review harassment you were experiencing so that you didn’t have to go through it and experience it personally. 

Block Party was enormously influential in how we at Initiative for Digital Public Infrastructure talked about the Three-Legged Stool, this principle of design where we believe in lots of small platforms, a client that the user controls, and third-party services. Block Party was exactly what we were thinking as far as those third party services. 

There are many things to be angry about around Twitter, around Reddit, around all of these platforms that are pulling APIs, making them inaccessible. But Block Party is probably the single thing I’m most angry about at the moment. So I offer that in solidarity and as an explanation of why we wanted to talk today. 

Tracy Chou:
Thank you, thank you. 

Ethan Zuckerman:
So let’s talk about two things that are coming out of the current situation. Block Party is not currently being offered as a service. Have you thought at all about releasing the toolset around the Fediverse, around Bluesky, around some of these other platforms? Is that a direction for where this goes? 

Tracy Chou:
We have thought about it, never say never, but federation is a very different model from a centralized network and has different constraints and needs. So it’s not so straightforward to just immediately port everything over. We would want to do it with a lot of careful thought. 

More importantly, right now, although the decentralized platforms have shown traction, they’re still fairly niche. And there are hundreds of millions more people who cannot leave the large centralized platforms. So we want to keep supporting those people right now, where there is very clear demonstrated need. If you just look at scale, the latest numbers I saw for Bluesky were that they’re up to 150,000 users, maybe a little bit more. I mean, that’s still quite tiny. Mastodon is maybe 6 million, but also fluctuating. There are bursts when Twitter does silly things, and then usage trickles off. 

The big platforms are still where a lot of the activity is happening, where all that harassment is happening. The ADL just released stats, and the numbers for online harassment on these big platforms are up across the board. 51% of Americans have experienced it at some point in their lives, and that number is dramatically higher for young folks, women, and people of color. We’re still on big platforms. So I like the flourishing of smaller platforms and different experiments, but we want to solve the problems where they are right now. 

Ethan Zuckerman:
One of the things that I’ve written significantly about over the last year is this difference between big rooms and small rooms. And I’m really excited about small rooms. I’m excited about community-based social networks. I’m excited about subreddits. I’m excited about communities that can set their own rules. 

But there’s no doubt that big rooms like Twitter are incredibly important. If you’re going to try to build a social movement, you need everybody to hear about it. You need the press to hear about it. You need to be addressing those giant audiences. And as you pointed out, speaking in those big rooms can be incredibly dangerous. It can have all sorts of implications. And the sort of self-censorship that you’re engaging in frankly puts too much of the weight on you, or on anyone who has reason to believe they’re going to experience harassment. 

It’s really the platform, by its failure to deal with harassment, that is silencing the speech. So I get that simply telling everyone to move to Bluesky or Mastodon is not the way to go on this. They’re not big rooms yet. We may all experiment there, but it isn’t quite there yet. 

You’ve got a new product that you’re talking about called Privacy Party, which is focused on these very big platforms. It’s focused on Facebook. It’s focused on Twitter. Tell me what Privacy Party tries to do. Who is it for, and how is it different? Because it’s quite different from Block Party. 

Tracy Chou:
Privacy Party is a browser extension that offers automated privacy playbooks for social media platforms. It allows you to be proactive in setting boundaries for yourself and avoiding the dark patterns in social media platforms’ default sharing settings. We actually came across this problem while we were still working on the classic Block Party product: our classic product is more of a reactive tool for when you’re at the point that you have an audience or attention on you, and you may be dealing with unwanted attention and attacks. 

But there was something missing on the more proactive side: securing your social media spaces before that happens. A lot of the security folks we were talking to, who work with people who experience harassment, talked about walking people through locking down their settings ahead of time so that this information wouldn’t be weaponized against them. In particular, there’s a lot of data on Facebook, but also on other platforms people wouldn’t think about, like Venmo, where having your transaction feed be public exposes a surprising amount of information to people who want to abuse or harass you. 

And so this is still within the same ecosystem of protecting folks who want to or have to be online, particularly for work but sometimes also just for social reasons, and helping people stay safe. We want to remove the friction so that people can do the things they already know they should do. A lot of folks know they probably have a little too much information on Facebook or whatever platform it is, but it’s so difficult to go back and clean those things up. 

One of the big beliefs of the company, part of our philosophy, is that safety should have good UX and should be a consumer experience, as opposed to these terrible settings pages with a whole bunch of toggles that don’t explain what they’re doing, where you have to sift through ten different links to even find the setting you might want. That makes it so that nobody is ever actually safe, because they don’t want to dig through all of that mess. So we want to make being safe feel easy and good for folks. 

Ethan Zuckerman:
It sounds like in many ways Block Party was acute care medicine, right? You’re coming into the hospital with a gunshot wound and we’re trying to fix you up and help you deal with that existing onslaught. This is preventative medicine, right? This is take your vitamins, make sure you’re going for a walk, watch your blood pressure, and so on. Preventative medicine is hard, right? People have an acute need, they come and they take care of it. How are you thinking about getting people to see that they need Privacy Party? How are you planning on talking to people about the need to tune up their privacy before they find themselves in an acute situation? 

Tracy Chou:
Yeah, that’s a great question. There are some folks who are already fairly privacy conscious. It’s easier to go to those folks first, identify those people, they’re already aware and show them a solution that they want. There are folks who are particularly aware of the need for this because of a role they may be stepping into or something that they’re about to go do. 

So for example, a journalist who’s joining a high-profile newsroom is about to transition into a role where there’s going to be a lot more visibility on them. Or somebody who’s about to start a run for political office will announce their candidacy and be very visible. There are certain moments where it’s very obvious that there’s cleanup to be done and proactive settings to be fixed up. 

The question of getting a broader audience to understand that privacy is important is, I think, eased by the fact that there’s increasing awareness generally, as people realize that free isn’t really free because you’re kind of the product and your data is being sold, and as the harms of these attacks start to show up in more acute, consumer-facing ways. 

So for example, with generative AI, people are realizing that deepfake porn can be created from their images. That’s very personal and very terrible, and it’s very easy to see the link between your lack of privacy, because your data is out there, and it being used for nefarious purposes against you. There’s also been an increase in fraud attacks, where people will pretend to be a family member who’s in distress and ask for money. 

And the more private information is out there about your family or these personal details, the easier it is for fraudsters to attack. And again, this is worse with generative AI, because they can also simulate the voices of your loved ones if any of that content is out there. So I think it is becoming more a part of public awareness that these issues are real. And the attacks feel so much more personal, especially with generative AI. 

Before, when you were trying to get people to care about privacy, it was like, “These cookies track you all across the web.” People were like, “What’s a cookie? Do I care?” You maybe would feel the effect of it by seeing an ad follow you all across the web, but that was still a pretty tenuous link between “I’m being tracked” and it having some impact on my experience. Whereas deepfake porn is very in your face. 

Ethan Zuckerman:
Yeah. It’s interesting. One of the things that I think characterizes both Privacy Party and Block Party is that arguably you are building tools that the big platforms should have built themselves before rolling out. I think you could make the argument that the very powerful blocking features Block Party offered should have been built into Twitter in the first place, particularly knowing what harassment problems people were experiencing. 

Similarly, if privacy settings were given as much attention as ad sales, you wouldn’t have to go in and do these sorts of privacy tune-ups. I mean, Facebook and everybody else have their privacy tune-ups; they’re just not actually all that helpful, perhaps because they’re hoping you’re not going to make that much of your stuff private. How do you think about that balance of maybe I shouldn’t be the one responsible for doing this? Do you find yourself in situations where you’re like, why the hell is Twitter not doing their job? Why do we have to do this work on their behalf? 

Tracy Chou:
I don’t actually think that for some of these things they’re just being derelict in their duties. I trace it back to the system incentives, and I can understand, even if I dislike the outcomes, how we ended up with them. Looking at corporations like Facebook and Twitter and knowing what their business models are, I understand how all of this happens. 

I have also worked at tech companies where I’ve seen how we set OKRs, our objectives and key results. I know how companies set KPIs, the key performance indicators that we’re trying to drive. And I know how all of that flows through companies and how the individuals working in them are just trying to do their job well, hit their OKRs, get the promotions that they want. And the way to do that is by moving the company’s objectives forward.

And so I understand systemically how all of this happens, and I also think it’s very difficult to go against the whole system of incentives. For things like the settings pages being terrible, I also have empathy for the teams that have worked on those, because I have also been responsible for updating settings pages when I was building a new feature, and you are not incentivized within a company to make that a beautiful page, because it does not drive any metrics that you care about. So I can lament that the system is set up this way, but I also understand it. And I do think there’s an interesting opportunity there to find the incentive alignment of building tools for end users, where it makes sense that we would optimize for something other than what the platform would. 

Ethan Zuckerman:
You and I have been talking a lot offline about an idea called the authorized agent. Block Party arguably was an authorized agent: it acted on behalf of the user to protect them against trolling on Twitter. The product that we’ve been working on for the last couple of years at IDPI, Gobo, is an authorized agent that lets you sort through your social media feed and display it whatever way you want to display it. 

Tell me about authorized agents as an idea, and why it may turn out that the California Consumer Privacy Act is a solution to how Block Party might be able to come back to the market, and how Gobo might be able to guarantee its legal existence. What do we need to know about authorized agents and CCPA? 

Tracy Chou:
I always like to draw analogies to the offline world and to systems that already exist, to help build our way of thinking about these things. In the offline world, you can give power of attorney or medical power of attorney to other folks, so other people can act on your behalf and make decisions on your behalf at your direction. 

So I can tell my lawyer to sign forms in a certain way, given my intentions. A talent agent can also sign deals on behalf of their client. In the technical world, the idea of an authorized agent is that you could have an application or a person who acts on your behalf with these platforms. 

The concept originates in data opt-outs: in California and other states, you have a number of rights to understand and control your data and how it’s used by technology providers, and you can get help from an authorized agent in order to do so. So agents should be able to act on your behalf, at your direction, to help you control your data. The tricky thing is that in practice, they’re somewhat limited in what they can do without stronger technical access. 

So where we are right now is that things like the California Consumer Privacy Act enshrine this right to an authorized agent, so that users should be able to have somebody else act on their behalf to control their data and engage with things like algorithmic decision-making. But it’s not fully defined, and so there’s still space there where we think we have this right, but it hasn’t been spelled out explicitly. 

What Block Party was doing was effectively acting as an authorized agent for you on Twitter, where you could say, “Here are some rules around what I want to see or don’t want to see. You help me enact those rules, so I never even have to come across this content in the first place.” So if the rule is about new accounts, and I just don’t want to deal with hordes of trolls creating new accounts: you deal with that for me, so I never have to see it in the first place. 

And that’s what Block Party would do. It was taking direction from the user and then taking action on their behalf, which is very useful because then you don’t have to deal with all the harm of seeing those things. You can imagine other types of intervention this could make possible on the anti-harassment front, like filtering out unwanted sexual advances in your Instagram DMs, or letting public servants filter out trolls in their mentions so they can actually respond to good-faith messages and engage with their constituents. And, as you’re mentioning with Gobo, there are a lot of other things an authorized agent could do on your behalf at your direction. 

By the way, California is not the only one looking at this sort of thing. There’s a bill that was introduced in New York this session, S6686, which very explicitly looks to get API access for authorized agents. So there are a lot of concepts mixed in here, but it’s really about giving people the tools they need in the digital world, bringing over concepts from other parts of society that we already have. 

Ethan Zuckerman:
So I love this analogy to human agents because it sounds exactly right. In many ways, Block Party is sort of like the bouncer, or maybe like the bodyguard. You’re out there as a celebrity. There are some scary people who want to approach you. You want to be able to interact with your fans, but let’s keep the stalkers at bay. You might want an authorized agent who’s out there providing that level of support for you. 

Gobo is the information butler. It’s the reference librarian. It’s the person who is sorting through your incoming mail and saying, “You want to see these things. You probably don’t want to see these things.” These all feel like things where we can imagine an analogy in which we would delegate that responsibility to someone else. We might want certain things. We might want them to act as a fiduciary. We might want them to have a fiduciary duty towards us. There are a lot of different ways to think about it, but we might want to have that authorized third party. 

So CCPA is legislation. It exists; it was passed in 2018. It sounds like the problem there is that while it made the right decision around the authorized agent, it didn’t demand API access, and there’s probably rulemaking that needs to happen to make this real. Is that something that could happen with the New York State bill? 

Tracy Chou:
The New York State bill is a little more explicit in requiring the API access, so that would be in the legislation. But the concepts are quite similar: to effectively implement CCPA’s right to an authorized agent, rulemaking could require the API access to make it practical, so that users can actually exercise the right that, in theory, they have. The New York bill is just a bit more explicit about the implementation. 

Ethan Zuckerman:
Do we know how the platforms feel about CCPA? Do we know how they feel about this notion of the authorized agent? You have a post where you mentioned Cambridge Analytica, which seems to be the bugaboo that comes up every time someone tries to create a third party application to make social media work better. What have you heard from the platforms about their support or opposition to authorized agents? 

Tracy Chou:
There are a lot of different opinions within platform companies, since they are very large and there are a lot of people with different opinions. One opinion I have heard is that the specter of Cambridge Analytica makes this look like strategic liability, so they’re worried about what the potential bad repercussions might be. 

“Not necessarily that it’s a bad idea on its face, but there are some potential abuses, and we just don’t want to worry about it. But actually, APIs are great for the ads folks. We have a great developer program for people to create ads programmatically. So I guess APIs aren’t all bad, and there’s strategic upside there. Maybe it’s not so terrible. There is potential benefit, but we have to weigh the strategic benefits against the strategic liabilities.” 

Interestingly, from the trust and safety folks, I’ve been hearing a lot more people say regulation is the only way, because we can see that otherwise it’s just hard for the company to prioritize things that are better for end users and their safety. 

And this goes back to what I’m always obsessing about: what are the incentives for a company? It’s making money. We live under capitalism; they are optimizing for returns to shareholders. And for some of these platforms, that is very much tied to ad money, right? So what regulation can do is put a dollar sign on what it means to not do the right thing by users, if there are costs to non-compliance or to violating whatever they’re violating. That starts to make the business case internally for making the experience better for end users and investing in trust and safety. 

Ethan Zuckerman:
I think that notion of incentives is so important. We’re recording this in late June. What’s going on in the world right now is that Twitter has really shifted into quite a different platform, in part because it has changed its revenue models to essentially let anyone who’s willing to pay eight bucks a month have a blue check and get essentially unlimited amplification. Reddit, as we know it, may be about to close its doors, or at least change radically. Many, many Redditors use that site through third-party apps, and most of those third-party apps are going to disappear as of July 1st, in part because of API pricing. 

How are you feeling at this moment in time? It feels to me, and I don’t want to lead the witness here, but I’m having a tough summer. I’m looking at online communities that I love, that I’ve spent a lot of time in, and I’m feeling like they seem to be systemically making really bad decisions in the hopes of making some short-term profits. Do you have any hope in this situation, or is this really a battle against the dying of the light? 

Tracy Chou:
Oof, tough questions. To answer first from the more personal perspective, I just feel sad that these spaces that people relied on, and had as a very important part of their lives, are going away or changing so much that they’re not accessible in the same ways. I use Twitter very differently now. I used to rely on it to stay up to date with other folks in industry and how they were thinking about things. It was sort of like water cooler chat for everybody in industry, as well as outside the industries I work in. 

And that’s just gone. I don’t get news in the same way. I don’t see industry news shared. So for me, I have to recalibrate: where do I get information from? How do I socialize with the people I used to socialize with? And I feel like a lot of people are trying to figure that out as well. Are they moving to new platforms? I’ve been wondering if I should go back to reading a physical newspaper instead of getting my news through Twitter. What are the things I need to do to change my life to meet all the needs that used to be served by these platforms? And then there’s the more thousand-foot view of what’s going on and how I feel about it. 

I think we’re at a really interesting inflection point, and I hope that regulation can steer us a little more toward directions that embody the values we should want for society, knowing how important digital platforms are for society today. For a long time, the builders of these platforms have been able to run wild and do whatever they want, subject mostly to the constraints of capitalism. That was useful for some flourishing of innovation, but I think we’re now at the point where we need to rein that in a little bit.

And right now we can see people just making very bad decisions with no guardrails; hopefully regulation can provide some of those guardrails. Whether it’s regulation, market movement, or public shaming, I sort of feel like it would be a really nice moment if we could move in some of the directions you’re talking about. 

Ethan Zuckerman:
She’s Tracy Chou. She’s one of the very smartest and most thoughtful people about harassment, trolling, and how people protect themselves and their privacy online. She’s the CEO and founder of Block Party. She’s working not only on Privacy Party, but on getting people to pay attention to this notion of authorized agents, which I think is an immensely important idea as we think about protecting the internet that we know and love. Tracy, it is such a pleasure. Thank you so much for being with us. 

Tracy Chou:
Thank you so much for having me.