What were Facebook staff doing while a homicidal maniac was planning his deadly assault on two mosques in Christchurch, New Zealand?
I’ll tell you. They were conducting focus groups and consulting academics and civil rights watchdogs on whether to bundle white nationalism under the same umbrella as white supremacy and other hate groups – it took them three months!
In a post on its website, titled “Standing Against Hate,” the company explains its about-face on white nationalism:
“Over the past three months our conversations with members of civil society and academics who are experts in race relations around the world have confirmed that white nationalism and white separatism cannot be meaningfully separated from white supremacy and organized hate groups … It’s clear that these concepts are deeply linked to organized hate groups and have no place on our services.”
Really? Well, it took you long enough, Zuckerberg!
The statement continues:
“Our own review of hate figures and organizations – as defined by our Dangerous Individuals & Organizations policy – further revealed the overlap between white nationalism and white separatism and white supremacy.”
The company claims it will begin banning content that “praises or represents white nationalism and white separatism” on its Facebook and Instagram platforms.
But by the time the ban goes live, it will be nearly three weeks since the shootings – that’s not acceptable.
It’s great they’ve finally decided to act, but it’s too little – and 50 lives too late.
Especially when you think that within days of the attacks, New Zealand Prime Minister Jacinda Ardern had banned the sale of military-grade assault rifles.
That’s real political action and leadership for you (Take note, America! How many more of your children need to die before you ban yours?).
And how many people accidentally viewed the killer’s livestream as it was beamed around the world?
How many children mistakenly believed they were watching a stream of a first-person video game, as the terrorist calmly and methodically recorded his slaughter of innocents from his head-mounted camera and shared it with friends?
How is it that Facebook executives did not act more quickly to halt the poisonous ideologies they allowed to fester on their platform?
Facebook’s statement attempts to offer an explanation:
“We didn’t originally apply the same rationale to expressions of white nationalism and white separatism because we were thinking about broader concepts of nationalism and separatism — things like American pride and Basque separatism, which are an important part of people’s identity.”
Clearly overwhelmed, the company says it is resorting to artificial intelligence and machine learning to police its platform:
“We have improved our ability to use machine learning and artificial intelligence to find material from terrorist groups. Last fall, we started using similar tools to extend our efforts to a range of hate groups globally, including white supremacists. We’re making progress, but we know we have a lot more work to do.”
Facebook admits it needs to get better at removing hate speech from its platform, which is a positive development. But how much of a difference will redirecting people who search for extremist ideas to Life After Hate – a website founded by “former violent extremists” – really make?
Moreover, in an age where anyone can share pretty much anything across a plethora of social media platforms, what can really be done?
Here’s one step: delete your Facebook account – permanently.