The Electronic Frontier Foundation (EFF), an organisation
whose aims and work I broadly support, has published a blog post entitled
“Fighting Neo-Nazis and the Future of Free
Expression.”
The authors argue forcefully that GoDaddy, Google, and Cloudflare all made a
mistake when they rejected the Daily Stormer from their platforms, and that
their actions have set a dangerous precedent.
I think EFF is wrong on this issue.
EFF’s post makes a “slippery slope” argument: steps taken by companies to remove
neo-Nazi publications from the internet will lead to censorship of groups whose
views are not widely considered abhorrent. The example given is Black Lives
Matter. The implication is that the authors of
the post can find no relevant moral or legal distinction between the publishers
of the Daily Stormer and the organisers of Black Lives Matter.
To restate: the post’s authors imply they cannot see a clear line that would
allow them to distinguish an organisation whose aim is to perpetrate violence
against groups identified by race and religion from one which aims to defend
a group identified by race from violence, and do not believe that private
commercial enterprises should seek to define one. Either that, or they do not
believe that such a distinction is relevant to the discussion.
This is, in a word, ridiculous.
First, let’s get the discussion of “free speech” out of the way. Despite
frequent references to legal free speech rights within EFF’s blog post,
including their belief
… that no one—not the government and not private commercial
enterprises—should decide who gets to speak and who doesn’t
this discussion does not in practice revolve around a legal issue. Being kicked
off a private commercial platform does not infringe a US citizen’s free speech
rights. GoDaddy, Google, and Cloudflare all presumably believe that they are
fully within their legal rights to exclude neo-Nazis from their platforms, or
they would not have taken the action that they did.
Since there doesn’t appear to be a legal argument to make, EFF is relying on an
instrumental argument for why these companies should not have acted as they did.
The authors imply that normalising the banning of neo-Nazis will inevitably lead
to bans on Black Lives Matter activists. But in what way is this inevitable,
given the clear moral distinction between groups that seek to deprive others of
their constitutionally protected rights through violence, and those that do not?
Displaying Nazi symbols or performing a Nazi salute in public is illegal in
Germany, but this law does not seem to have led, inevitably or otherwise, to a
crackdown on other political speech.
It is simplistic in the extreme to believe that such an action inevitably opens
the door to discrimination of a different kind. In the UK, for example, the
Equality Act 2010
would render many kinds of corporate discrimination illegal, because race and
religious belief are “protected characteristics” under that law. Denying service
to neo-Nazis, on the other hand, remains legal as far as I can tell: belief in
the racial inferiority of other humans is not a “protected characteristic.”
But there is a more important point here. Even if this were somehow a legal
issue (and perhaps there are jurisdictions where it is), I would make this
stronger claim: if the law criminalises the “de-platforming” of neo-Nazis by a
corporation, then the law is wrong and must be changed.
Real Nazis came to power in 1933 by legal means, and much (but not all) of the
vile discrimination they perpetrated over the following years had a legal basis.
The fact that something is legal doesn’t make it right. The fact that
something is illegal doesn’t make it wrong. If there is a serious risk that
the law will allow neo-Nazis to use these platforms to amplify their voices,
seize power and achieve their hateful aims, then the law must change.
EFF’s post also notes that the actions taken by GoDaddy, Google, and Cloudflare
did not follow an established process or policy. On this point, I can agree.
These companies (and others like them) should, at a minimum, ensure that they
have “Terms of Service” which make clear that they will not allow their
platforms to be used to spread intolerance and hate, or to promote violence.
They should have clear internal processes to handle reports of content which
violates these terms.
In summary, I think EFF needs to reconsider their position on this issue, and
establish whether they are confusing questions of what is legal with questions
of what is right. They also need to consider whether the “slippery slope”
argument is, in fact, valid. Are we really so simple-minded that we cannot
distinguish between the speech of Black Lives Matter and the speech of
neo-Nazis?