WASHINGTON: Facebook came under fresh criticism Tuesday for its hands-off approach to political speech, as a group of employees and US lawmakers called on the social network to fact-check politicians spreading misinformation. A letter from employees urged the company to crack down on "civic misinformation," saying the spread of debunked claims is a "threat to what FB stands for."
"We strongly object to this policy as it stands. It doesn't protect voices, but instead allows politicians to weaponize our platform by targeting people who believe that content posted by political figures is trustworthy," said the letter first obtained by The New York Times, which said more than 250 employees had endorsed it.
At the same time, US lawmakers critical of Facebook stepped up their calls to revisit its policy, which exempts politicians' comments and paid ads on the platform from fact-checking - an issue that became heated after President Donald Trump ran online ads making what some called "provably false" claims. "Facebook's new ads policy allows politicians to run demonstrably false advertising on its platform. I don't think that's right," said Senator Mark Warner, a Virginia Democrat, who added that he had sent a letter to Facebook chief executive Mark Zuckerberg calling on him "to reverse this decision."
Other Democrats joined the effort, welcoming the letter from Facebook employees. "Being a politician shouldn't be a license to lie - especially to spread hatred. If Facebook employees get it so should Zuckerberg," tweeted Senator Richard Blumenthal. Those comments were echoed by Senator and presidential hopeful Elizabeth Warren, who tweeted: "Facebook's own employees know just how dangerous their policy allowing politicians to lie in political ads will be for our democracy. Mark Zuckerberg should listen to them - and I applaud their brave efforts to hold their own company accountable."
Some Democrats have challenged Facebook's policy by running deliberately false ads of their own, and one California entrepreneur announced he would run for governor of the state so that his ads would be exempt from fact-checking.
Not an arbiter
Facebook said in response to an AFP query about the controversy that the social media giant's culture was "built on openness so we appreciate our employees voicing their thoughts on this important topic." "We remain committed to not censoring political speech, and we will continue exploring additional steps we can take to bring increased transparency to political ads," it said in a statement. Zuckerberg earlier this month articulated Facebook's policy, saying it is based on a long tradition of allowing free expression.
"I don't think most people want to live in a world where you can only post things that tech companies judge to be 100 percent true," he said. The policy at Facebook, and a similar approach from other platforms such as Twitter, creates a challenge for online firms seeking to avoid the role of being an "arbiter" of truth and entering the fray of politics.
Nina Jankowicz, a disinformation fellow at the Wilson Center, a Washington think tank, said Facebook has a unique responsibility to root out false information because of its scale and its ability to allow advertisers to use "microtargeting" of users. "If they don't want to be involved in fact-checking, they shouldn't take that content," Jankowicz said. "They have become the world's biggest gatekeeper and they need to shoulder that responsibility."
Contentious decisions
Other analysts say the question is more complex, with varying standards across media: broadcast television, for example, is not allowed to reject specific political ads, cable outlets may, and internet platforms are not subject to specific regulations. Shannon McGregor, a University of Utah professor specializing in political communication, said Facebook would run into a thicket of problems if it sought to verify ads and other statements from candidates and politicians. The current standard "leaves them open to criticism, but that is the policy that makes the most sense," McGregor said.
"I don't think we want private companies like Facebook and Twitter making decisions on what speech is allowed or not allowed." Darrell West, head of the Brookings Institution's Center for Technology Innovation, said social media firms are struggling for the right guidelines to handle these controversies. "Technology firms increasingly are being asked to police national conversations because of their central communications role," West said. "This puts them in the middle of many contentious decisions (with) no real guidelines on how to make decisions."- AFP