Saturday 19 August 2017

We've crossed the free speech Rubicon

After decades of operating in the shadows of the web, online racists are now the focus of intense, deserved anger and sudden action by the entities that once blithely hosted them.

While the internet's gatekeepers (those who host content, services, product offers, and so on) have never offered even a tacit endorsement, neo-Nazis, white supremacists, and racists have long spread their message of hate through an ever-growing network of websites, social media accounts, videos, and newsletters.
But after the riots and deadly attack in Charlottesville, Virginia, on Saturday, the gatekeepers are finally closing the gates.
Airbnb banned white supremacists from using its platform, OKCupid booted a white supremacist off its matchmaking service, Spotify banned hate music from its catalogue, and PayPal and Apple Pay will no longer let anyone use their payment services to sell paraphernalia associated with far-right hate groups.
The wave of stricter digital policies on hate sites and services began, for the most part, with GoDaddy's and Google's decisions on Monday to remove the Daily Stormer, a neo-Nazi website that had published an abhorrent article about Heather Heyer, the victim of the Charlottesville attack, from their web-hosting services. Cloudflare, a CDN that handles the traffic of countless websites, followed suit, pushing all of the site's traffic off its servers as well.
It's worth praising GoDaddy, Apple, Spotify, and others for drawing a hard red line on hate. I want to applaud all of them, loudly and with vigor.
But it's hard to deny that this moment could mark an important shift in our digital existence and signal the potential end of broad-based free-speech online.
Defending the Daily Stormer's right — or, heck, the right of any hate speech — to exist is a complicated and unpopular task. They publish despicable things. But having witnessed the full history of the modern internet, I know online hate isn't some social media cancer that only recently metastasized on more traditional websites — it's been there from the start.
Neo-Nazis identified the internet as a powerful tool for disseminating hate early in the web's existence. In 1996, The New York Times interviewed George Burdi, whom it described as a racist and "archetype of the forward-looking neo-Nazi." The then 25-year-old record producer had turned to the internet to spread his message of white supremacy and looked forward to the spreading influence of the then-nascent online platform.
"We have big plans for the internet," Burdi told The Times, "It's uncontrollable. It's beautiful, uncensored."
And just as online hate speech has survived on the internet for decades, attempts to censor the web's worst impulses are as old as dial-up.
At the same time, the very freedom that Burdi prized was already under constant attack from both sides. In the early 1990s, the Jewish Defense League demanded that America Online, then a de facto gateway to the internet for millions of Americans, monitor neo-Nazi recruiters on its network. Simultaneously, big New York investment banks were suing Prodigy (RIP) for allowing allegedly libelous statements about them to appear on the online service.
For hate groups, the internet was the long-hoped-for accelerant to spread their message. Stopping this in its tracks still seems smart, just and right. 
And though we've talked for decades about the need to balance free speech against moderating hate, we've never acted as we are today to eradicate such hate from the internet. That's what makes this moment so spectacularly singular.
It's the right of Google, GoDaddy, Spotify, Apple, and the others, as private businesses, to decide who can and cannot operate on their services. And in this case, what the Daily Stormer's founder wrote is incontestably hate speech, which is commonly listed as a violation of platforms' terms of service.
But will the litmus test always be so clear?
Acting to remove a website from systems designed to host millions of websites raises some important questions about hate content and other broad-based online systems and services:
  • Should Google, GoDaddy, Amazon, Squarespace, and Cloudflare begin a systematic sweep to remove all neo-Nazi-leaning websites?
  • Can Apple and Google pore over hundreds of thousands of apps to ferret out any that might be hosting neo-Nazi sympathizers?
  • Should Etsy look at all its third-party vendors to ensure that none are selling Nazi or white supremacist paraphernalia?
  • Should Amazon and Netflix remove films depicting Nazis and racists?
  • Should iTunes, Google Play, and Amazon Prime Music scrub their music libraries of songs that sound like they support hate or violence?
I'm thinking: yes, of course. But then I begin to wonder where, exactly, we draw the line. And does removing these sites and this content actually help?
GoDaddy didn't stamp out the Daily Stormer by removing it; the site simply switched to Google's servers. When Google did the same, did it stamp out some small portion of neo-Nazism? Of course not, any more than Twitter deleting troll accounts stops anyone from being an online troll. Trolls find their way back to the platform, even if it's under a different guise or identity.
Google, GoDaddy, OKCupid, and Spotify's reasonable choices are, potentially, the top edge of a very slippery slope. The further these online services go in policing the internet for hate, the more often they will face nuanced choices about what is pure hate and what is reasonable rhetoric.
And while we know what hate is, it's harder to map the vast gulf between love and abusive hostility. There's a world of ideology in the spectrum between them.
Others are, naturally, asking the same questions.
On Thursday, the Electronic Frontier Foundation acknowledged how "deeply fraught with emotion" this situation is, but pondered how removing these sites and this content might affect the future of free expression.
"We must also recognize that on the internet, any tactic used now to silence neo-Nazis will soon be used against others, including people whose opinions we agree with," wrote the EFF authors.
The reality is that there are simply no easy answers here, no quick way to wave away online hate.
Make no mistake: What happened in Charlottesville devastated me. I've visited that beautiful Main Street often enough to know its rhythm and contours. I know the corner where Heather Heyer died.
Which makes it especially difficult to accept that some of the actions social and digital services are now taking in the face of this hate and violence could lead us in the wrong direction.
Freedom of speech is a deeply held American value, one that has propelled our democracy forward as much as it has exposed us to things we find objectionable. But it's a value that protects all speech, not just what rational, caring, loving, intelligent people say and agree with.
I've often thought that the First Amendment protected hate speech and Nazi marches so that they could be subjected to the brilliant illuminating power of truth and reason. Nazi sympathizers in regalia always look ludicrous in the sun. Let them march, as is their First Amendment right, so we can shout and challenge them, as is our First Amendment right.
However, what's crystal clear in real life can be murkier online.
My fear is that when you delete hate-fueled accounts, sites, and content, you simply push the hate further underground, and to other platforms and avenues that are, perhaps, friendlier and more accepting of such hatred, where it will continue to fester and grow. 

Hate doesn't need the light of day to flourish. It's a cruel mass that feeds on the darkness of ignorance. Silencing these people won’t make them disappear. The harsh light of truth and visibility, however, can be the world's great disinfectant. 

http://mashable.com/2017/08/18/crossing-a-rubicon-of-free-speech/?utm_cid=hp-r-3#TGeyJO1RjaqD
