In Sri Lanka and Myanmar, Facebook kept up posts that it had been warned contributed to violence. In India, activists have urged the company to combat posts by political figures targeting Muslims. And in Ethiopia, groups pleaded for the social network to block hate speech after hundreds were killed in ethnic violence inflamed by social media.
“The offline troubles that rocked the country are fully visible on the online space,” activists, civil society groups and journalists in Ethiopia wrote in an open letter last year.
For years, Facebook and Twitter have largely rebuffed calls to remove hate speech or other comments made by public figures and government officials that civil society groups and activists said risked inciting violence. The companies stuck to policies, driven by American ideals of free speech, that give such figures more leeway to use their platforms to communicate.
But last week, Facebook and Twitter cut off President Donald Trump from their platforms for inciting a crowd that attacked the US Capitol. Those decisions have angered human rights groups and activists, who are now urging the companies to apply their policies evenly, particularly in smaller countries where the platforms dominate communications.
“When I saw what the platforms did with Trump, I thought, 'You should have done this before and you should do this consistently in other countries around the world,'” said Javier Pallero, policy director at Access Now, a human rights group involved in the Ethiopia letter. “Around the world, we are at the mercy of when they decide to act.”
“Sometimes they act very late,” he added, “and sometimes they act not at all.”
David Kaye, a law professor and former United Nations monitor for freedom of expression, said political figures in India, the Philippines, Brazil and elsewhere deserved scrutiny for their behavior online. But he said the actions against Trump raised difficult questions about how the power of American internet companies was applied, and if their actions set a new precedent to more aggressively police speech around the world.
“The question going forward is whether this is a new kind of standard they intend to apply for leaders worldwide, and do they have the resources to do it,” Kaye said. “There is going to be a real increase in demand to do this elsewhere in the world.”
Facebook, which also owns Instagram and WhatsApp, is the world’s largest social network, with more than 2.7 billion monthly users; more than 90 percent of them live outside the United States. The company declined to comment, but has said the actions against Trump stem from his violation of existing rules and do not represent a new global policy.
“Our policies are applied to everyone,” Sheryl Sandberg, Facebook’s chief operating officer, said in a recent interview with Reuters. “The policy is that you can’t incite violence, you can’t be part of inciting violence.”
Twitter, which has about 190 million daily users globally, said its rules for world leaders were not new. Twitter said that when it reviews posts that could incite violence, the context of the events is crucial.
“Offline harm as a result of online speech is demonstrably real, and what drives our policy and enforcement above all,” Jack Dorsey, Twitter’s CEO, said in a post Wednesday. Yet, he said, the decision “sets a precedent I feel is dangerous: the power an individual or corporation has over a part of the global public conversation.”
There are signs that Facebook and Twitter have begun acting more assertively. After the Capitol attack, Twitter updated its policies to say it would permanently suspend the accounts of repeat offenders of its rules on political content. Facebook took action against a number of accounts outside the United States, including deleting the account of a state-run media outlet in Iran and shutting down government-run accounts in Uganda, where there has been violence ahead of elections. Facebook said the takedowns were unrelated to the Trump decision.
Many activists singled out Facebook because of its global influence and its failure to apply rules uniformly. They said that in many countries it lacked the cultural understanding to identify when posts might incite violence. Too often, they said, Facebook and other social media companies do not act even when they receive warnings.
In 2019 in Slovakia, Facebook did not take down posts by a member of parliament who was convicted by a court and stripped of his parliamentary seat for incitement and racist comments. In Cambodia, Human Rights Watch said the company was slow to act on the involvement of government officials in a social media campaign to smear a prominent Buddhist monk championing human rights. In the Philippines, President Rodrigo Duterte has used Facebook to target journalists and other critics.
After a wave of violence, Ethiopian activists said Facebook was being used to incite violence and encourage discrimination.
“The truth is, despite good intentions, these companies do not guarantee uniform application or enforcement of their rules,” said Agustina Del Campo, director of the Center for Studies on Freedom of Expression at the University of Palermo in Buenos Aires, Argentina. “And oftentimes, when they attempt it, they lack the context and understanding needed.”
In many countries, there’s a perception that Facebook bases its actions on its business interests more than on human rights. In India, the country with the most Facebook users, the company has been accused of not policing anti-Muslim content from political figures for fear of upsetting the government of Prime Minister Narendra Modi and his ruling party.
“Developments in our countries aren’t addressed seriously,” said Mishi Choudhary, a technology lawyer and founder of the Software Freedom Law Center, a digital rights group in India. “Any takedown of content raises the questions of free expression, but incitement of violence or using a platform for dangerous speech is not a free speech matter, but a matter of democracy, law and order.”
But even as many activists urged Facebook and Twitter to be more proactive in protecting human rights, they expressed anger about the power the companies have to control speech and sway public opinion.
Some also warned that the actions against Trump would cause a backlash, with political leaders in some countries taking steps to prevent social media companies from censoring speech.
Government officials in France and Germany raised alarms over banning Trump’s accounts, questioning whether private companies should be able to unilaterally silence a democratically elected leader. A draft law under consideration for the 27-nation European Union would put new rules around the content moderation policies of the biggest social networks.
Barbora Bukovská, the senior director for law and policy at Article 19, a digital rights group, said the risk was particularly pronounced in countries whose leaders have a history of using social media to stoke division. She said the events in Washington provided momentum in Poland for a draft law from the ruling right-wing nationalist party that would fine social media companies for taking down content that is not explicitly illegal, which could allow more targeting of LGBTQ people.
“These decisions on Trump were the right decisions, but there are broader issues beyond Trump,” Bukovská said.
By Adam Satariano
c.2021 The New York Times Company