NEW YORK: On August 25, Hannah Gittings watched in horror as her friend Anthony Huber was fatally shot during a demonstration in Kenosha, Wisconsin, against the police shooting of a Black man. The events turned violent after an extreme-right militia group called the Kenosha Guard called on its Facebook followers to "protect" the city, and a 17-year-old who answered the call opened fire on Huber with a semi-automatic rifle. Gittings blamed Facebook for failing to take down what seemed to be a clear incitement to violence.
The page "was left up and not only left up, it was deemed not threatening, not a danger when they're clearly people blatantly inciting violence, saying they're going to shoot Black people," Gittings told a news conference organized by the activist group Avaaz. The tragic incident highlighted concerns that social networks such as Facebook are being used to foment real-world violence with little or no control by the platforms.
Facebook and other social platforms, which are also often used to organize peaceful events and pro-democracy movements, have been condemned for failing to stop a range of abusive and hateful content including organized violence such as the massacre of the Rohingya minority in Myanmar and the beheading of French schoolteacher Samuel Paty near Paris. A Facebook spokesperson, queried by AFP, said, "We remain vigilant when it comes to policing hate speech, calls for violence, and misinformation."
The company said that since August it has identified over 600 militarized social movements and removed their pages or accounts, as part of an effort that took down 22.1 million posts containing "hate speech." "We always know there is more to do, which is why we're constantly working to improve our technology and tighten our policies when necessary to keep dangerous content off our platform," the company said.
Falling short
But critics say Facebook still falls short on many occasions. Executives "need to understand that what happens on Facebook doesn't just stay on Facebook," said Joyce Jones, a mayoral candidate in Alabama who had to fight Facebook rumors that dogged her campaign. "It goes home with us, it goes to the grocery store with us, it goes to our jobs. Our children are affected."
Social platforms have also been criticized for doing too little to stop deadly misinformation about the coronavirus and in some cases even amplifying hoaxes and false information through algorithms designed to boost engagement. Kristin Urquiza, whose father died from Covid-19 this year, saw his memorial page on Facebook flooded with abusive comments from people minimizing the health crisis or questioning the use of face masks. "Facebook may not have pulled the trigger, but Facebook did drive the getaway car," she said at the Avaaz conference.
Critics of Facebook and other social networks argue they should be held accountable for violence organized on their platforms, calling for reforms of a law which shields internet services from liability for content posted by third parties. But some analysts argue that the platforms can't bear full responsibility for the deeper social problems which have led to extremism and violence in the streets. Mark Potok, a fellow with The Centre for the Analysis of the Radical Right, said social networks such as Facebook have been useful to extremist groups for "very quick mobilizations" in Kenosha and other places.
But he added that "there's an inclination to say it's all about social media, and that's not true." Potok said militants have found ways to organize with or without Facebook, and that many extremists are now gravitating to fringe platforms with little or no moderation. "I don't think these platforms will be able to police all the extremist content out there," he said. "There are such enormous numbers being put up every day. I doubt these companies can eradicate their influence." - AFP