Did Google and GoDaddy Set a Dangerous Precedent by Dropping a Neo-Nazi Website? - Pacific Standard

A digital civil liberties lawyer weighs in.
A man makes a slashing motion across his throat toward counter-protesters as he marches with other white nationalists and neo-Nazis during the Unite the Right rally on August 12th, 2017, in Charlottesville, Virginia.

On Sunday, the domain registration service GoDaddy announced that it would no longer provide a domain name to the white supremacist website the Daily Stormer, citing a terms-of-service violation. The announcement came in response to outcry over an article the Daily Stormer published mocking a counter-protester killed on Saturday at the Unite the Right rally in Charlottesville, Virginia. Shortly after, the Daily Stormer re-registered its domain with Google. Google then followed suit, citing a terms-of-service violation of its own in dropping the site.

GoDaddy's decision comes at a particularly fraught moment in the debate over whether freedom of speech can be reconciled with attempts to quell hateful discourse and actions. Additionally, with the Internet becoming the preferred mode of public discourse, abusive trolling and rampant falsehoods have led some to call for increased accountability from Internet service providers and social media companies for the content they host and support. The central question of this debate continues to be: Is freedom worth its consequences?

For the Electronic Frontier Foundation, a non-profit group that has fought unwaveringly for Web freedoms since 1990, the answer to that question is a resounding yes. Pacific Standard spoke to Nate Cardozo, senior staff attorney at the Electronic Frontier Foundation, to discuss GoDaddy, the monitoring of digital hate speech, and whether our current laws are equipped to hold Internet service companies accountable for the content they enable.

How do you view the GoDaddy decision?

Cutting off the Daily Stormer's domain is a purely symbolic move. GoDaddy is, of course, free to do so. They're not required to host anyone; they're allowed to cut off service to anyone they want for essentially any reason, and cutting off the Daily Stormer is well within their rights. But it's symbolic. The Daily Stormer is going to find another domain provider. Or, if they can't find anyone to register dailystormer.com, they'll find a different top-level domain to register under, so it's not as if the Daily Stormer is going to go away.

Preventing people from reaching the Daily Stormer's website does nothing to actually combat the ideas. There's the old, famous saying that the remedy for bad speech is more speech—it's not silencing the bad speech. Hate speech is legal in the United States. And people are going to continue to express themselves in awful ways, and cutting off the domain name isn't helpful for the dialogue.

The Daily Stormer's logo.

If kicking the Daily Stormer off its domain name is a purely symbolic move, is it still a substantive threat to free digital speech?

At this point it's not a particularly bad threat to free speech—the registrar market is broad enough that the Daily Stormer is going to find somewhere else. But if you look at other contexts, contexts that aren't so distributed—payment processing being the most obvious example—there's only a handful of payment processors in the United States, and when payment processors cut off a client, as Visa, Mastercard, and PayPal all did to WikiLeaks a few years ago, that actually does have a real-world effect. It made it next to impossible for WikiLeaks to raise money. That does pose problems for free speech. So although in this particular instance with domain names, this corporate enforcement of norms isn't going to have much effect on people's ability to reach the Daily Stormer, the idea that the information we can receive is moderated—not by the government, not by courts, not even by the court of public opinion, but by a handful of companies—should be troubling to all of us.

Airbnb removed some of its users who tried to book housing for the Unite the Right rally. Do you view that similarly, as a threat to online speech?

That's more complex. If GoDaddy had been the Daily Stormer's hosting provider and not just the domain registrar, I would actually feel significantly differently. You don't have a responsibility to host activities that go against your core beliefs. Hosting [a website] is like building a house. And if GoDaddy doesn't want the Daily Stormer living in its house, I have a lot more sympathy for that. Similarly, if Airbnb doesn't want people living in its houses, literally, then I have a lot more sympathy for that.

If both are essential to the Daily Stormer's operation, why distinguish between the domain name and the hosting infrastructure?

The way the domain system works is that the registrars are really just resellers for the registries, so that's all GoDaddy was doing here. The domain name registrar isn't actually providing anything other than resale. So it's not as if the Daily Stormer was using GoDaddy's servers; GoDaddy was just the agent that recorded the site as the owner of dailystormer.com. That's just a line item in a database; it's not a substantive service.

GoDaddy said they booted the Daily Stormer because the type of article they posted "could incite additional violence, which violates our terms of service." Isn't that different than kicking someone off for their viewpoint?

I think those are actually pretty similar, because GoDaddy, as far as I understand it, was using "incitement" in the plain-English sense and not in the legal sense. To be incitement in the legal sense, there has to be an immediate threat of violence. Saying "I wish someone would kill the president" isn't incitement. It's more like, "You there, with the hammer, standing next to Trump, hit him in the head." That's incitement. "Let's go beat up some antifa protesters" isn't incitement if there isn't a crowd standing in front of a bunch of antifa protesters. It has to be much more immediate than the speech that was on the Daily Stormer.

Public reaction to GoDaddy's decision has been largely supportive. Do you think the political tide has started to turn against Section 230 of the Communications Decency Act, which prevents Internet services from being held accountable for the content they host or support?

That's a perfect segue, because Section 230 does two different things: It immunizes service providers from most types of actions against them for the speech of their users, but it also immunizes service providers for the editorial judgments they make. And here, actually, GoDaddy is relying on Section 230 to shut off the Daily Stormer—230 immunizes GoDaddy for that editorial decision in a very important way. So no, I don't think this shows a tide against CDA 230.

Nate Cardozo.

So GoDaddy was able to kick the Daily Stormer off its domain name because of Section 230?

That was actually the original point of 230. Section 230 was passed as part of the Communications Decency Act, which would have made porn as we know it illegal on the Internet. And one of the points was that, if service providers were going to be required to police the content of their sites, Congress wanted to immunize them for the decisions they made, so that they couldn't be second-guessed. 230 is the only part of the CDA that survived, so service providers are certainly not required to police their content the way the Communications Decency Act contemplated, but they are immunized when they choose to do so. So CDA 230 is amazingly versatile in that respect.

Given the kind of hateful trolling and fake news that have become prevalent during recent years, is there justification for more robust accountability of these online providers of domain names or hosting infrastructure?

Any attempt to hold service providers responsible is absolutely bound to backfire. In the marketplace of ideas, we need exposure to all sorts of ideas. Good ones, bad ones, fake ones—all of them are valuable in their own way. The reader is the only one whose judgment matters. If we hold infrastructure providers responsible for fake news, then we're saying we're happy with the current state of human knowledge and don't need to try to advance it in any way. A good deal of scientific theories that are now accepted started out as fringe. And if we were to hold infrastructure providers accountable for hosting scientific theories that turned out not to be true, I don't see how we, as a free and open society, could progress. The concept of holding infrastructure providers responsible for fake news or bad ideas or hateful ideas is a quick recipe for disaster.

Our experience on the Internet is shaped by algorithmic and financial forces that elevate certain ideas and forms of content above others—these forces are not neutral. So it seems like the "marketplace of ideas" is not really "free."

That may well be true, but again, the remedy for bad speech is more speech, not shutting off the bad speech.

The exponential growth of the Internet has resulted in vastly more speech. Has this growth only resulted in better speech?

It hasn't only resulted in better speech; it's resulted in more speech of every kind. But that's not necessarily a bad thing. The problems in Charlottesville were not problems of speech; they were problems of violence. And I think the notion that the violence was caused by speech and speech alone is silly; the violence was caused by people committing acts of violence.

So the best way to prevent hate online is just more speech against hate?

Well, there are all sorts of ways to prevent hate online. One of them would be trying to increase media literacy among the general public. Another would be for platforms to step up their moderation practices. YouTube is doing something pretty innovative with hate speech. They're not shutting it down; they're doing other things. They're de-promoting it in search results and demonetizing it, but it's all still there. And I think taking YouTube's lead, as opposed to GoDaddy's, is a useful example.

This interview has been edited for length and clarity.
