
101. Reframing Digital Safety with Diana Freed: For Survivors and Youth, The Biggest Threats Come From Everyday Tech

Reimagining the Internet

CW/TW: Intimate partner violence, child exploitation

Diana Freed has spent the past several years radically reframing the threat model in cybersecurity with groundbreaking research into how domestic abusers utilize everyday technology like smartphones and tracking apps. Diana sits down with us to talk about digital safety for survivors of intimate partner violence and youth, as well as the incredible work she’s doing to educate her colleagues and the public about these threats.

Diana Freed is a fellow at the Harvard Berkman Klein Center and will be joining Brown University as an Assistant Professor in the Department of Computer Science and the Data Science Institute.

Diana’s talks mentioned in this episode are:

Transcript

Ethan Zuckerman:

Hey everybody, welcome back to Reimagining the Internet. I am your host, Ethan Zuckerman. I am really thrilled about today’s guest, Diana Freed. Diana is starting as an assistant professor at Brown University in the Department of Computer Science and the Data Science Institute this fall. At the moment, Diana is a joint fellow at the Berkman Klein Center for Internet and Society at Harvard and the Center for Research on Computation and Society at the Harvard John A. Paulson School of Engineering and Applied Sciences. Diana finished a PhD in Information Science at Cornell recently.

And Diana’s been high on my list of people I’ve wanted to get on the show for quite some time, because a conversation I had with her a year and change ago kind of completely changed my understanding of computer security. Her work focuses on how people who’ve experienced intimate partner violence and domestic violence, as well as children, deal with digital safety. It’s some of the most important work being done in the world right now. Diana, I am so glad to have you here.

Diana Freed:

Thank you so much. It’s wonderful to be here and I’m so excited to share my work with you.

Ethan Zuckerman:

So Diana, we met when you came and gave a talk over here at UMass Amherst. It was a talk about intimate partner violence and this idea of a UI-bound adversary. What ends up happening in situations of digital security around intimate partner violence and what is a UI-bound adversary?

Diana Freed:

Sure. So what a lot of my work has focused on is technology-facilitated abuse in intimate partner violence. And what has been thought to happen is that there would be very sophisticated technology tools used to access devices or technologies of survivors. What’s actually the case, through a lot of the research that I’ve published along with colleagues, is that we’ve surfaced the idea that many of these attackers are UI-bound adversaries, meaning that they have access to the user interface as an authenticated user and they’re actually not technologically sophisticated. They exploit the technology by interacting with standard user interfaces.

And that’s important to understand because what it means is that there’s a very strong sociotechnical aspect to abuse. And it’s not necessarily that they are sophisticated or getting access to information through other types of tools.

Ethan Zuckerman:

So this isn’t about breaking encryption. This isn’t about putting sophisticated backdoors into someone’s operating system. It’s as simple as the fact that in cases of intimate partner violence, the intimate partner often has direct access, the ability to log into a tool that the person who’s being affected is using. Give us a concrete example of how this might play out. How might this play out for someone who is exiting an abusive relationship but finds themselves targeted through the technology that they’re using?

Diana Freed:

Absolutely. So in my research with survivors over the last number of years, and this is all published in peer-reviewed journals and conferences, what we’ve learned is that there are patterns of behavior. And what happens sometimes is that an abuser may actually purchase the device for a survivor at the beginning of the relationship.

They might set that device up. They might have access over time to any iCloud account, Gmail account, or other email. They’re logging on and setting that up, unbeknownst to the survivor, so that they maintain access to those accounts and the information in them. And when someone leaves the relationship, they might have been unaware during the relationship that this was going on. It sometimes actually increases when they escape, but the person still has access.

And the concern in these cases is always disconnecting the abuser from the technology and when to safely do this, because the issues in intimate partner violence, with or without technology, are always around coercion and control. Technology is now an extension, used as a tool of control and coercion. So that’s an example of how someone does it.

And we see that they also sometimes will, especially at the time of ending a relationship, get a device for a child and gift a device at that point in time. So even if they are not directly tracking the other parent, when there are children in a shared custody situation, the child’s device is now being used as a way to surveil another individual, because it was gifted to the child.

Ethan Zuckerman:

But it is really interesting to think about these tools built into things, this idea that an iPad telling you where it is as an anti-theft feature could end up turning into stalker technology. You’ve seen this happen in the real world. Can you talk about how your work with survivors of IPV has informed your research in this space?

Diana Freed:

Sure. So a lot of what I do now, and where I spend my time, is providing trainings to survivors, to advocates, to clinicians, in terms of just how information flows and how to protect oneself on technology. I do this with governments. It’s all service work.

And what I find is that a lot of people don’t necessarily recognize all the different ways that information flows and the technologies that they need to be aware of or think about. We just interact with so many different things during the day, whether it be a car or, like you’re saying, a GPS tracker on a dog, or family apps like Life360 that some people use to keep people safe. And inadvertently, it’s giving information about another individual, and it’s not always recognized.

And so it’s really about understanding the, you know, basic ideas: just understanding the apps that one has on their phone in any relationship and the things that are shared.

And people might remember the shared bank account, but they might not remember the shared food account or other things. And so when they leave the relationship, or the relationship turns unsafe and they’re trying to protect themselves, they are not always aware of how, you know, settings are configured on their phone, how to protect themselves, or when they need to delete an application. And we see this also in payment apps, right, in terms of people making public payments on Venmo when they intend to keep that discreet.

And so I think it’s just understanding who can see what you are doing and who you’re sharing information with. Companies have taken somewhat of a step to try to make this a little bit more overt, but it still becomes an issue, like we’ve seen with AirTags and other types of devices over time. There are these ways that it kind of creeps into the IPV context, and a tool that’s well-meaning now becomes a tool of surveillance.

Ethan Zuckerman:

You gave a talk at the CHI conference, the SIGCHI conference, in 2018 called “A Stalker’s Paradise: How Intimate Partner Abusers Exploit Technology.” I think for a lot of people who study cybersecurity, this was a really novel way of thinking about things. After your talk at UMass, I had one of those conversations with a fellow faculty member over in computer science. And it’s one of these conversations that you sort of only have in academe, which is to say it’s kind of stupid. But he came out of it essentially saying, brilliant research, so wonderful, but is it computer science? Isn’t this really sociology? Isn’t this really something else?

And I sort of fumbled for a little bit and I finally ended up saying, look, you’re a cryptologist, you’re used to thinking about Alice and Bob and Eve the eavesdropper. And usually Eve the eavesdropper works for some secret government agency and is cracking codes or tapping wires or something else. In this case, Alice and Eve dated for eight years and then had a terrible breakup. But Eve knows everything about Alice. Eve knows Alice’s mother’s maiden name because Eve knows Alice’s mother. Eve knows what road Alice grew up on. All of those security questions are going to be compromised.

We think of Eve as the KGB agent or the NSA agent. We rarely think about Eve as dating Alice. And the truth is, this matters immensely for intimate partner violence, but not just for intimate partner violence. It matters for security within an organization, where you have someone who leaves an organization and goes rogue but still might know critical bits of information.

How do we bridge between the incredibly important sort of one-on-one work you’re doing, providing workshops for survivors of intimate partner violence and getting the field as a whole to understand that your work challenges everything we know about cybersecurity?

Diana Freed:

Well, as we mentioned in the beginning, my work deals with intimate partner violence. It deals with technology-facilitated abuse in the context of children, where the perpetrator might be someone known or unknown, and there’s a lot of account impersonation. I also study elder abuse, where it might be a family member or caregiver. And in all these different contexts, often people have information.

And I think that what we need to really understand is, as new systems are being introduced, or as we’re thinking about security and privacy and how people understand the systems that we’re using, we need to have transparency in system design that allows people to easily understand information flow and who has access to their information while they’re using the technologies, and also how to disconnect and protect themselves.

It’s very much involved in terms of system design and helping people. So that’s why it’s sociotechnical work: there is a societal implication, but it’s also very much grounded in computer science and in technology, to help people understand the systems and also to design improved systems that allow for safety by design.

Ethan Zuckerman:

One of the things that’s been interesting, and by interesting I mean terrifying, to watch is that deepfakes haven’t been super successful yet in terms of convincing us that Joe Biden said something silly or Donald Trump said something silly.

But anyone who’s spending any time online is encountering dozens of ads for sites that promise to undress anybody. And clearly we are on the crest of a wave of AI-generated non-consensual intimate imagery. Is that something you’ve seen in your work with people who are experiencing intimate partner violence, whether that’s non-consensual intimate imagery from the release of photos taken during a relationship, or this notion of creating novel AI-generated imagery as a pathway towards harassment?

Diana Freed:

Yes, I’ve seen it with adults and I’ve seen it with children, teenagers, and I’ve encountered it through my work with different organizations. I’m currently conducting research in this space, and it presents the same types of challenges that NCII, non-consensual intimate images, present in terms of getting these images down. What I hear survivors encounter now is sometimes pushback as well, “It’s not really you, so you don’t need to worry about it,” or other people not understanding it, and it only adds to the trauma and the difficulty.

But ultimately the same methods to get this type of content taken down are being used: notifying the company, notifying the platform, informing police. There are laws being passed, and deepfake technologies are being regulated within a similar context to NCII, so I think it’s being addressed and recognized as a problem.

What’s different is the frequency with which it’s surfacing among young people, and I think that’s something that is a bit more pervasive in terms of what we’re hearing. We’ve seen celebrities recently experience this, government officials experience this, and there have been public cases that have been covered whereby children are now falling prey to this, where they have never consented to any image but it’s distributed through their school, and it’s presenting new challenges in terms of who created this.

The tools are not very sophisticated or hard to access, and I think that there’s some catch-up being done in terms of digital literacy programs and what’s acceptable to teach to or about, because this is ramping up very quickly and it’s unfortunately easy to create this type of deepfake or synthetic imagery of someone.

Ethan Zuckerman:

I have a 14-year-old son, and so I am familiar with what an incredibly challenging environment middle schools and high schools end up being, and it is not hard to imagine what sort of harm deepfake non-consensual imagery could do in a school setting.

There’s a level of complication on this as well, which is that I think by most interpretations that deepfake imagery also qualifies as child sexual abuse material despite the fact that it may be AI-generated, and you and I both know from our work that working with child sexual abuse material is incredibly challenging legally. There are very few groups, mostly NCMEC, the National Center for Missing and Exploited Children, who are actually able to work with that content. Our friends over at the Stanford Internet Observatory have a new report out on the danger of NCMEC being overwhelmed by deepfakes.

But you note that this is not the only challenge that youth are navigating, and that being a young person online at this point actually presents all sorts of attack surfaces for abuse that we have to consider. Give us a sense of that landscape that you shared in your 2023 CHI paper on understanding the digital safety experiences of youth in the US.

Diana Freed:

So what that paper surfaced was really the different types of threat models that youth experience today and how broad the perpetrators are. In that paper we discuss everything from sextortion and extortion to different types of digital harms that youth experience, known things like cyberbullying and misinformation, and all the different contexts in which youth experience them.

But what was very poignant was just speaking with the 36 youth that were involved in that study, as well as the 65 adults. What we learned is that sometimes the perpetrator might be a peer that the child goes to school with, but they can also be an adult that the child knows, somebody that the child is well aware of.

And at the same time, at-risk youth might also be experiencing harms because they’ve engaged with someone online who could be appearing as a child. The perpetrator in these cases can be known or unknown to the child, so it adds a level of complexity. In intimate partner violence, the perpetrator is someone who is familiar, usually an ex-partner, someone involved.

We quite often don’t know who the perpetrator is in these cases with youth, and it’s a much more complex issue. Part of working with kids is understanding: do they know this person in the real world, how did they meet this person, what did they share, and how many different apps do they know this person on? Because what we also see with kids is that things move across platforms, so it’s not a specific harm happening just on one app.

A lot of times perpetrators will, and this is from the research, move a child from a more public space where they might first engage to apps that they might consider a little bit more private, and conversations move between different spaces. So we shouldn’t think of this type of harm as happening on one specific app that we only need to watch out for. It can happen in many different places.

Ethan Zuckerman:

Right. The situation of sextortion is something that I think many people were not aware of until maybe the last two or three years. Now we’re seeing patterns where someone might be approached on Instagram and then talked into moving over to a platform like Telegram, and situations where someone is asked for intimate imagery and then that’s used to blackmail them into producing more intimate imagery. These are just horrific stories that we’re sort of starting to hear. How prepared are the platforms that are being used for this to understand their responsibilities?

I understand that putting “responsible” and “Telegram” in the same sentence is not necessarily something that parses. And how prepared is law enforcement to deal with this when you’re working with these cases, when you’re trying to help someone, a youth or a parent, who’s experiencing and trying to unravel sextortion?

How responsible are the platforms with this and does law enforcement understand what’s going on or is this novel territory for them?

Diana Freed:

So I think the platforms do recognize it’s a problem at this point, and different platforms are taking different measures. That’s one component of it. With law enforcement, it really varies; a lot of times this presents to local law enforcement. And so I think it really depends upon what different departments have in terms of their preparation or their understanding to engage with technology.

And sometimes these cases get moved up into different areas of safe harbor and child protection, and it gets very involved depending upon what’s happening. I think an important part of this is how we respond to children and young people experiencing this and what kind of safety we put in place.

Sometimes there is unfortunately blame put on the child or children will communicate that they are concerned about telling an adult about what happened because of the stigma associated with an action they might have taken. And the action being maybe they thought they were sharing an image with somebody that appeared to be another youth. And despite whatever judgment there is about that, we need to create safe spaces for children to be able to talk to an adult about this and not react punitively and help provide tools and understanding and education around this.

I think some of the challenge is that it’s happening at earlier ages, where we haven’t traditionally spoken to young children about these types of harms. We have to figure out a way to incorporate this into digital literacy education at the appropriate time, because it is happening sometimes in middle school, at ages younger than we want to think about this happening to. We need to be able to talk to kids about not sharing this type of content.

It’s difficult to think about having conversations with very young people about this, but it seems to be a reality that we need to evaluate carefully depending upon the school systems or the interviews we have.

Ethan Zuckerman:

This takes us in some ways to one of the pieces that I’m tremendously grateful that you have put out which is a tech talk on securing devices. You’re not just reporting on this stuff, you’ve also been sort of training people on how to think about locking down the devices that they’re using at a moment where they’re going through a breakup or divorce where there’s the risk of stalking or intimate partner violence. You just made reference to this fact that we have to rethink how we do digital literacy and try to sort of bring these things into the curriculum.

Talk to me a little bit about the relationship between your research and the sort of practical outputs and the sort of advice that you’re producing. Do you want to point people towards some of the work that you’ve been doing if they’re listening to this and sort of feeling like, “Oh my God, I need to get a handle on what’s going on in my own digital life?”

Diana Freed:

Sure. In terms of the work that I’m doing, I generally provide, along with colleagues, trainings to different organizations, schools, or corporations. They’re free and they’re available, and we let people see the topics that they’re concerned about and cover everything from AI to financial safety, really helping people understand what to do in these cases.

I’ve done some recent tech talks with the National Network to End Domestic Violence, and those are available online. I have one coming up on AI, and I’ll be speaking at the National Conference this summer in August on AI and domestic violence.

A lot of what I do is also updating the talks constantly because things are changing, but really the essence of these talks is to help people feel empowered with their own technology. Whether it’s a data breach that they experience, what are the actions you can take, what are the tools you can use, how do you protect your own data online — so very general security and privacy that I think are relevant to all of us today and also relevant to survivors, and then talking about that sometimes specifically to kids.

A lot of the talks or trainings that I do, I don’t always record online, just because things change very quickly. There’s always a concern that someone watches something and follows it, but it was from a few months ago and things have changed. So that’s the reason the talks don’t exist in volume online right now. But the trainings are available.

Ethan Zuckerman:

Diana, to the extent that you’re comfortable, talk to me a little bit about how this became the focus of your research. Obviously these are some of the most important issues out there. I think this is a vastly understudied field. I am grateful every day that you are taking this on, but there’s an enormous amount of psychic weight involved with this. You’re dealing with some of the most challenging behaviors on the internet and people who are going through enormous amounts of pain and struggle.

What brought this into focus for you? What got you involved with this?

Diana Freed:

So I have a background in psychology, and I was fortunate to train at Cornell Tech, a new campus at the time in New York City. One of the focuses of Cornell Tech was to do applied work, to help the city of New York, get involved with organizations, and really look at public interest technologies. In that capacity, myself and collaborators became engaged with the New York City Mayor’s Office to End Domestic and Gender-Based Violence, and they brought up this issue that they were having with survivors talking about their concerns about being tracked and surveilled. You would hear things early on like “someone always knows where I am.” Just hearing about these different levels.

And this became the focus of my research at Cornell, and I moved from adult intimate partner violence to focusing on child violence, interpersonal violence using digital technologies, because I was hearing so much in the adult work about how children were being used in these situations and the types of harms happening to them.

And since that time I’ve now also expanded my work into caregiving facilities and older adults. A lot of times the threat models are somewhat similar in these spaces, but people are dealing with different issues.

I find the work very rewarding, and at times it is very challenging to hear the harms that people have experienced. I am trauma-informed trained, as are many of the colleagues that I work with, which helps, but I personally am very passionate about the work that I do, so I don’t necessarily experience the difficulties of it, I think.

It’s always self-selecting to go into service-type fields, and I consider my role as a researcher to be somewhat of a dual role. I focus a lot on public interest work, as you know, and this is just a very important area. I think my goal going forward is to engage with more people in computer science and help provide trainings for other people to engage and provide the services that I do, because I think it’s very important and we need more and more people on the ground doing this kind of work.

Ethan Zuckerman:

I think, beyond encouraging more people on the ground to do this work, it’s also about getting people who are designing systems to design from the perspective of how do we protect elders, how do we protect youth, how do we protect survivors of intimate partner violence.

When I was doing a lot of work on anonymity, I spent a while working with people who were building Tor, including Roger Dingledine, to talk about what the actual live scenarios are for a pro-democracy activist in Vietnam or a dissident journalist in Ethiopia. In the long run I consider it some of the most important work that I’ve ever gotten to do, so I understand what you’re saying about how rewarding this can end up being.

A lot of your work at the moment is about youth. One of the things that I teach when I teach students about internet policy is that everybody loves to shape their preferred internet policies in terms of protecting youth.

We are seeing a wave of policies from a TikTok ban to the Kids Online Safety Act that are demanding very, very large changes within the social media environment in the United States intended to protect kids. Any reflections on whether these bills are addressing the sort of harms that you’re seeing or are they addressing something else sort of under the headlines of child protection?

Diana Freed:

So these are very challenging questions. I think that there are good intentions, people trying to address difficult problems that have happened in society, and there is a call to do something, right? We need to do something. It’s just that we need to look closely at how the policies are designed, as you’re alluding to, to make sure that they are not limiting protections or freedom of speech, or creating a false sense of safety in situations where it’s not actually going to protect against the threats that people experience.

And then also understanding clearly in situations where companies might be held liable. Is it reasonable that companies should be held liable for these things? And the policies are constantly being redrafted and reproduced. And I think that it’s just very important to focus on the mental health right now of youth, what we can do to support them, what we can do to educate them and protect them and design safe systems.

I just say that we need to really understand what the regulation will accomplish and whether it aligns with the threat models, whether or not it’s reasonable, and make sure people understand what these things will protect against. Because sometimes it’s easy for us to want to think that it’s protective, but it might not actually align with what’s happening in some of the real environments, and it might not protect in those cases.

So having that type of clarity and having people on the ground who have experienced these harms as well as having the different voices in terms of crafting these different regulations is important.

Ethan Zuckerman:

Diana, thank you so much for being with us. This is really just a wonderful overview of a really complicated, challenging field that it’s hard to really get your head around. But you are so practical and so rooted in the lived experience of people doing this work. I think it’s just something that researchers around technology and society should really aspire to.

And like I said, you are one of the people that I most admire in this field, and I’m just really grateful for the work that you’re doing. And I am insanely jealous of Brown that they get to have you there and that we weren’t able to lure you over to UMass. But I wish you nothing but the best of luck.

Diana Freed:

Thank you so much. It’s been an honor to speak with you and thank you for having me.

Ethan Zuckerman:

I’m Ethan Zuckerman. She’s Diana Freed. I hope you enjoyed this episode. If you did, you may want to go back and look at some of the other conversations we’ve had about related issues. There’s a great episode with Catherine D’Ignazio talking about data and feminicide, which relates to these issues of intimate partner violence.

We had a two-part episode with Brian Levine who does work on child safety. Those are also things you might want to look at.

Thank you for being generous with allowing our listeners to reach out to you. And again, thank you for all the work that you’re doing.

Diana Freed:

Thank you very much.

