An independent report commissioned by Meta Platforms Inc. found that the company's content moderation showed bias against Palestinians during the 2021 Israel-Palestine crisis.
The report, released today, said that while the social media giant was moderating its platform during the conflict between Israel and the militant Palestinian group Hamas, it unfairly removed content posted by Palestinians, in breach of their right to free expression. The report also said that Meta, then Facebook, had enforced its moderation policies more harshly against Arabic speakers than against Hebrew speakers.
The report, conducted by the consulting firm Business for Social Responsibility, confirmed what many have long suspected: that Meta has not moderated fairly in this conflict-torn part of the world. The company had over-enforced its moderation when it came to Palestinians and under-enforced when it came to Israelis – or at least people posting in Arabic versus Hebrew.
“The BSR report confirms Meta’s censorship has violated the #Palestinian right to freedom of expression among other human rights through its greater over-enforcement of Arabic content compared to Hebrew, which was largely under-moderated,” tweeted the Arab Center for the Advancement of Social Media.
More than 260 Palestinians were killed during the two-week conflict, including 66 children. More than 1,900 Palestinians were injured. At least 13 Israelis were killed, and 200 Israelis were injured. As shells rained down and buildings were partly destroyed, people took to social media to show what was happening. Many thousands of people were displaced in the chaos.
Meta was at first applauded for not deleting such content but was later questioned for removing posts that showed devastation on the Palestinian side. The content was genuine, and Meta had to explain itself. The company blamed its algorithm but also attributed some removals to “human error.” BSR reported that Meta’s contractors had wrongly labeled such content as related to terrorism.
But that wasn’t all. In many instances Meta appeared to favor content written in Hebrew, and again the algorithm was blamed. The report found no evidence that anyone at Meta acted out of animus toward any group on the basis of race, ethnicity, language or religion, noting that the company has “employees representing different viewpoints, nationalities, races, ethnicities, and religions relevant to this conflict.”
The report highlighted areas where Meta has shown “good practice” but recommended that it change some of its policies where conflicts are concerned. Meta said it has accepted the recommendations and will act accordingly in the future.
“BSR’s report is a critically important step forward for us and our work on human rights,” said the company. “Global events are dynamic, and so the ways in which we address safety, security, and freedom of expression need to be dynamic too. Human rights assessments like these are an important way we can continue to improve our products, policies and processes.”