The former director of one of the world's most secure Web browsers talks about holding harassers accountable while protecting the digital privacy of the everyday citizen.

Open the door to a brick building in old-downtown Seattle, climb the stairs, turn left into suite M101, labeled "Seattle Publishing," and you won't find any editors or writers in the publishing sense. What you will find is a small group of people toiling away to make sure one of the world's most secure Web browsers—called Tor—stays safe and current.

An application that can access the regular Internet as well as the dark Web, Tor protects the transport of data and helps users roam online anonymously and untracked, wrapping their interactions in onion-like layers of protection and routing them in a way that provides more autonomy over personal information than Chrome or Internet Explorer or Firefox. In an era when digital privacy and security seem ever more eroded, such a tool can prove invaluable to people's ability to communicate without being watched.

At the back of the suite, at a desk behind the beanbag chair, catty-corner to a cache of promotional stickers and the espresso machine, I found Shari Steele, who has served as executive director of the Tor Project for the past three years. (Note: Steele stepped down last month and was succeeded by Isabela Bagueros.) Upon my arrival, she offered me an espresso, gesturing to the machine, and then shuttled us both into a conference room, where Steele, wearing a flannel shirt, leaned back, crossed her legs, and prepared to tell the story of taking over Tor in late 2015—and of the problems she inherited.

"Young women never stick around," someone warned her early on. "Wait, what?" she, an older woman, thought. It was one of the first hints that on her way to protecting vulnerable people online, Steele would first have to figure out how to protect the vulnerable people on Tor's payroll, and then lead them through a harassment scandal that preceded the #MeToo movement by two years.

But first, a little more about Tor. Tor's aim is to keep secret who you are and what you're doing on the Internet. When you type in http://www.disney.com, instead of sending a signal directly from your (identifiable) computer to Disney's servers, Tor sends your request—encrypted, so it can't be read—through a series of "relays": computers around the world whose owners volunteer some of their machines' power to this purpose. So your request goes to Mike's computer, then to Tara's computer, then to Schmiddy's computer, and only then to Disney. All Disney sees is Schmiddy's computer, and because it also sees that you arrived via Tor, it knows Schmiddy didn't really type its URL. Tracing the signal back to you is not impossible, but it is extremely difficult, involving complex statistical analysis.
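The layered routing described above can be sketched in miniature. This is only an illustrative toy, not Tor's actual protocol: base64 encoding stands in for real encryption, and the relay names are borrowed from the example above.

```python
import base64

def wrap(message, relays):
    # Build the "onion": add layers in reverse order, so the first
    # relay in the path peels the outermost layer.
    data = message.encode()
    for relay in reversed(relays):
        data = base64.b64encode(relay.encode() + b"|" + data)
    return data

def route(data, relays):
    # Each relay peels exactly one layer, learning only its own label
    # and the blob to pass along -- never the original sender's request.
    for expected in relays:
        decoded = base64.b64decode(data)
        hop, _, data = decoded.partition(b"|")
        assert hop.decode() == expected
    return data.decode()

relays = ["Mike", "Tara", "Schmiddy"]
packet = wrap("GET http://www.disney.com", relays)
print(route(packet, relays))  # only the last relay recovers the request
```

In real onion routing each layer is encrypted with a different relay's public key, so no single relay can see both who sent the request and where it is going; the toy above captures only the peel-one-layer-per-hop structure.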

On top of that, Tor can access a unique kind of URL that ends in .onion instead of .com or .gov. Onion URLs, often random strings of letters and numbers, can anonymously connect users to parts of the Internet that search engines don't index. It's where people stash assassin services, sure, but also where they host forums for discussing political action.

Steele, who has worked in digital rights and protections for decades, took over at Tor partly with the idea that she could make this secure browser more accessible for the everyday citizen, and not just for hacktivists or the information security—or "infosec"—community, a group of people actively concerned with and working to improve the privacy and protection of the online world.

Jacob Appelbaum speaking at a conference in Germany.

But soon after Steele stepped in, she began to hear stories about a Tor Project employee named Jacob Appelbaum.

Appelbaum was one of the program's superstar developers. With his flop of hair and his plastic-framed glasses, his outspoken association with WikiLeaks and Julian Assange, his coding chops, and his flashy public persona, Appelbaum had become a key figure for the Tor Project. But just before Tor's semiannual meeting, Steele's first, she caught wind of secondhand allegations against him: Women were accusing Appelbaum of bullying, of harassment, of assault.

And so Steele's first tasks were to hold Appelbaum accountable, to protect the accusers, to provide a fair assessment of what was actually going on, and to ensure that the Tor Project didn't foster an environment that allowed harassment to flourish. (Appelbaum denied all accusations of criminal sexual misconduct. "Inevitably, there may have been moments in my professional or private life when I may have inadvertently hurt or offended others' feelings," he wrote in a social media post in June of 2016. "Whenever I was aware of these instances, I have, and will continue to, apologize to the friends and colleagues in question.")

Onion Break

What was it like when you joined the Tor Project and began to hear rumors about Appelbaum?

Everybody knew that Tor was mismanaged. And it was badly in need of someone to come in and be a strong leader. But I didn't really understand the social dynamics. I couldn't figure out exactly what was going on or who was at fault, so I asked Jake [Appelbaum] to sit out the meeting. I was just trying to figure out what was fact—what was happening. And it turns out that was a really worthwhile thing for me to have done, because there were several women who came to me during that meeting and told me their stories about things that had happened with Jake. I don't think that would have happened if I hadn't asked him not to come. There was a level of trust being built without my even consciously realizing it.

It was during that time, hearing so many stories, that I started to come to the conclusion that he had behaved badly and that this was a pattern. During that meeting, I realized that the Tor board and the leadership at Tor weren't asking enough questions about Appelbaum and the women's experiences. Strong leadership requires you to not just hear these stories and go, "Well, that's just the way it is." You have to be the one to take responsibility, because you are the leadership within the organization.

I was feeling sick to my stomach hearing all of this stuff and not knowing what to do with it. After the meeting, within a couple of weeks, I heard from a woman who said that she was raped by Jake. So I contacted him and said, "I need you to resign, and I need you to do it quickly."

He did, on May 25th, 2016. Soon after, Steele announced Appelbaum's departure, addressed the variety of allegations (although, for legal reasons, none of them in detail), and promised the community both an investigation into his behavior and a course correction for the organization. In the midst of these announcements, a new website—www.jacobappelbaum.net—went live. It detailed anonymous allegations from eight people in the infosec community, ranging from sexual harassment to sexual assault to bullying to plagiarism.

What did you think when you saw the site? And do you think the anonymity of the accusers helped the situation?

In a way, it freed me up, because I could refer to their stories by their pseudonymous names. I was proud of them. I was proud of them for finding their voice and finding each other and being able to take comfort in that. I think that a lot of times survivors have been severely disempowered, and that's how sexual harassment and assault get to be pervasive—because nobody has the power to be able to come forward and say, "This happened to me." The fact that they got together gave them power.

Other workforces—in Hollywood, Capitol Hill, and New York media, for instance—have gained that same power since 2016, when you began investigating Appelbaum. Many of the publicized solutions to harassment or assault, though, have simply been, "Get him out of here." But if you don't change the soil composition, won't the same bad apples continue to grow?

For a single person to be able to run rampant in an organization or within a community, there has to be a culture of allowing that to happen. And it starts with the leadership of the organization not stepping in and doing what needs to be done. Everybody kind of allows it to happen. People develop a whisper network, talking about what's going on but not dealing with it directly, feeling unempowered. So, yes, at Tor, there was a central figure in all this, but he was as much a symptom as he was a problem. And he was a problem.

A thing I heard from people in this community that I never accepted was, "Well, was it during worktime?" Who cares if it was during worktime? Hi, I'm a mass murderer, but I don't do it during worktime.

So you began to figure out how to alter the growth medium, to replicate, within the organization's form, some of the principles that underlie its function: Make the Tor Project—not just the Tor browser—safe for, equitable toward, and open to diverse groups. But how?

There was a whole lot of healing that needed to go on with this community. There was a lot of figuring out how much damage was done. How do people who knew but didn't do anything deal with their guilt? How do people who didn't know but go, "Gosh, I should have seen these signs," deal with their guilt? How do the people who survived this feel like they can continue to thrive and move forward and become strong and active contributors and feel comfortable hanging out in this community?

A lot needed to be done beyond just telling [Appelbaum] that he couldn't be part of Tor anymore. And that's a continuing thing. From now on, our Tor meetings will have a community health component—to give people an opportunity to share how they're feeling about things and get help for whatever is causing them pain at that time.

You really need to hear what's going on, and be honest about what's going on in your environment. Fire someone if necessary. Bring in more education. Let people know that certain behaviors are not acceptable behaviors. And come up with some policies so people know that they can talk about issues if they need to.

After a professional investigator hired by the group found that people inside and outside the organization "experienced unwanted sexually aggressive behavior from Appelbaum," the Tor Project announced changes that aimed to forestall future harassment and to appropriately address it if it happened. Several policies that had been drafted were released: an anti-harassment policy, a conflicts-of-interest policy, procedures for submitting complaints, and an internal complaint review process.

The Tor Project has 45 paid employees, but thousands more are part of its extended community, voluntarily running relay nodes and developing new applications. The organization does not have direct authority over all of them. Nevertheless, the people of the extended Tor community decided to take the task upon themselves. Alison Macrina, a Tor contributor and Library Freedom Project founder, has helped guide this ongoing effort, which includes a public social contract, a community council for resolving disputes, and joining forces with other infosec efforts—like Protect Our Spaces, which demands codes of conduct from conferences and events, and organizes boycotts against those that have "known sexual predators in attendance."

There are, of course, still problems in the wider infosec community—still harassers, still discrimination, still tilt where the playing field should be level. But the Tor leadership has tried to do, within infosec, what the Tor Project hopes its browser does: Help people come together, communicate, and make the world a safer and better place.

Tools like Tor allow people to stay anonymous, and that principle is a core value of the infosec community. In addressing harassment in that world, how do you balance accuser anonymity and public accusation?

You have to. You have to balance it. From my perspective as the head of an organization, victims who want to remain anonymous are my No. 1 priority. That anonymity is extremely important for safety reasons. Many times, people who are harassers also have other tools in their arsenal to shut down people, by ostracizing them or by taking revenge on them or otherwise harming them. You have to be sensitive to the fact that the person who has done harm can continue to do harm. And so a lot of the dealing with it internally was really talking about patterns of behavior.

And externally, you have Tor, this tool that allows people to have secure and private interactions online, but the other side of that coin is that bad stuff can also happen in that space. How do you and how does the Tor Project think about that tradeoff?

I had certain things I wanted to accomplish when I came on, and one of them was to rehabilitate our reputation. There was a time before I came on when people believed Tor was just for the dark Web. We've been working on trying to fight that reputation, to separate the two. Yeah, there are some bad guys using Tor. But the people who are working on it, we're not doing it for the bad guys. We're doing it for the good guys.

The reason that Tor exists is to save lives. We have instances of Tor being used where communicating is dangerous, and being able to tell your story could actually cause you to be killed. That's what it was created for. Tor is, at the center, the infrastructure for the Internet freedom movement. It is absolutely essential technology.

The fact that there are uses for it that are nefarious? That's just collateral. We're willing to accept it because the need is so very important. OK, you do your thing. We're not policing it. We're just creating the tools so people can do their work—activists, journalists, and just regular people. The majority of communications that are happening over Tor are not dark-Web kinds of things. It's really becoming much more normalized—and, for certain countries, much more essential.

We've been talking about how to normalize it more. The reality is, the entire Internet of Things should be on the dark Web. You shouldn't be able to communicate with your car over the general Internet. It's been shown that they get hacked. All you have to do is kind of know what the formula is. You can hack into your own car. You can hack into somebody else's car.

Of course general data—from your phone, your social media accounts—is also insecure. And the ways that companies use that data have been in the spotlight recently, like when Cambridge Analytica, Ltd., obtained information about Facebook users and used it in political campaigns. That revelation led to a lot of public outrage, as well as to a congressional hearing with Mark Zuckerberg. Do you think that scandal changed the average person's relationship to privacy and made them think about it more?

I think it made them think about it more, and I think it hasn't changed anything. It's actually kind of depressing. We've done studies for years and years, where if you ask someone a generic question—"Do you care about privacy?"—the answer is always yes. And then if you ask, "Are you using grocery store cards to get discounted prices?" they don't care about that.

The fascinating thing about Cambridge Analytica is that this kind of stuff has been going on for years. Cambridge Analytica just all of a sudden became something the public was interested in. And everyone here is kind of scratching their heads going, "I don't know why this is the one that made everybody outraged, but OK." So it's become part of the public conversation, but I don't think people really want anyone to make it so they can't use Facebook anymore. Maybe some people are becoming more educated. I guess I'm kind of pessimistic that it's going to change very much.

Why do you think that is?

Part of it is that people don't actually really care. They care on a very generic level, but they'd rather get that discount at the grocery store, or they'd rather be able to use Facebook for free. And if that means their data is going to get sold, then that still feels free to them. I also think people feel powerless. I don't think they feel like they have a lot of control. They could move to another app, but that app's probably going to do the same thing.

Digital harassment and surveillance often target groups that are already marginalized: radical political groups, whistleblowers, human rights activists, religious minorities, victims of domestic violence, people with stalkers who want to track their locations. Since the advent of the Internet—but especially since social media companies became giants and smartphones became ubiquitous—tech companies, run and staffed largely by white men, have wielded more and more power over personal data. They collect it and analyze it and sell it in opaque ways that exceed users' expectations and sometimes go beyond the boundaries of user agreements. Alexa may be listening all the time, not just when you tell her to; apps like AccuWeather track your location without your permission or your knowledge; if you put a picture on Facebook, Facebook owns it—unless you delete the content or delete your account, and even then it may live on the site for some time after.

And one problem for disadvantaged groups is that more and more of that data collection feeds into surveillance. Smart algorithms can now learn, say, to identify cats by looking at a million pictures of cats. That's a simple example, but if you zoom out, the same technological process can "social sort" people into groups. Depending on who develops those algorithms, and which data sets the programs learn from, the results can misunderstand gender identity, be deeply racist, or reveal something about you that you don't want the world to know—that you're queer, or that you're pregnant.

Given the current state of online privacy, who do you think should be using Tor?

Tor is for anyone who needs anonymity and who needs secure communications that aren't going to be attacked by others, including nation-states. A lot of times, that means activists in countries where the government is actively going after people's communications or limiting their ability to communicate. But it also includes people in the United States who are sitting in a coffee shop and want to do a credit card transaction and don't want everybody in the coffee shop to be able to suck down their information. Tor is useful any time you're in a situation where you're on an insecure network and you want to be able to do a transaction.

Or if you want to look things up that you don't want anyone to know you're looking up. For example, you want to know about ISIS—not because you plan on joining ISIS, but because they're in the news all the time and you want to know what they're all about. You don't want to get flagged as an ISIS sympathizer because you aren't an ISIS sympathizer. You're just interested in finding out. Tor can help you with those kinds of searches. It can help you look into gay meet-ups in your area if you haven't come out, if you're a teenager and uncomfortable with your parents finding out. There's all sorts of reasons that people want to have communications that are anonymous but that aren't illegal and aren't nefarious. It feels to me like people don't really understand how pervasive the surveillance is.

Onion Break

Steele stepped down in November of 2018, having led the Tor Project through a complicated time. Much of her tenure was focused on making Tor more user friendly, particularly for people in less technologically advanced countries, and on releasing its new mobile platform for Android users, so people can communicate securely from their phones—the only devices with Internet access that many people have, especially in developing countries. Steele now sits on the Tor Project's board of directors, where she remains committed to giving people security online to enable expression and disable discrimination, creating privacy in a world where, increasingly, it seems like someone is always watching. 

Author: Sarah Scoles
Sarah Scoles is a Denver-based freelance science writer, a contributing editor at Popular Science, and the author of the book Making Contact: Jill Tarter and the Search for Extraterrestrial Intelligence.

Editor: Jennifer Sahn
Researchers: Emily Moon & Jack Herrera
Copy Editor: Leah Angstman
Illustrator: Ian Hurley
