MEE Staff
Middle East Eye / October 15, 2021
The tech giant will appoint an external body to determine whether the company suppressed Arabic and Hebrew posts.
Facebook will allow an independent body to launch an investigation into content moderation of Arabic and Hebrew posts after the tech giant was accused of removing and suppressing pro-Palestine content.
“We have partnered with a non-profit organization expert in business and human rights, BSR, to conduct human rights due diligence of Facebook’s impacts during May-June’s intensified violence in Israel and Palestine,” Facebook said in a statement on Friday.
“BSR will examine relevant internal Facebook sources and engage with affected stakeholders. We will implement the board’s recommendation in our due diligence, defining and prioritizing all salient human rights issues according to the guidance of the UN Guiding Principles on Business and Human Rights.”
The tech company added that it would publicly communicate the results of the investigation in 2022. With offices around the world, BSR styles itself as an “organization of sustainable business experts” that works with big business to “create a just and sustainable world”.
The announcement comes after Facebook’s Oversight Board released a report earlier this year, calling for an independent body to investigate claims of content suppression relating to Israel-Palestine.
Activists and rights groups had accused the social media giant, which also owns Instagram and WhatsApp, of censoring Palestinians and their supporters following the removal of pro-Palestinian posts.
Nearly 200 Facebook staff members also accused the company's systems of unfairly taking down or down-ranking pro-Palestine content before and during Israel’s latest offensive on Gaza.
Responding to the criticism
Following the criticism, Facebook’s Oversight Board released a report and called for an independent review into alleged bias in the moderation of Palestinian and Israeli posts.
The report focused on one particular post that moderators took down and later reinstated – an Al Jazeera Arabic story about the Hamas-affiliated Izz al-Din al-Qassam Brigades that an Egyptian user had reposted with the comment “Ooh” – but offered recommendations that have wider implications for the moderation of Palestinian and Israeli content.
For the independent review into alleged bias, the board said the reviewer should not be “associated with either side of the Israeli-Palestinian conflict”, and should examine both human and automated content moderation in Arabic and Hebrew.
One major concern among digital advocates is the degree to which Facebook is removing Palestinian content at the request of both the Israeli government, including the justice ministry’s cyber unit, and a highly organized network of volunteers who report pro-Palestinian content.
The board took up this question, asking Facebook during its investigation whether the company had received official or unofficial requests from Israel to remove the content in April and May.
The company responded that it hadn’t received “a valid legal request” from a government authority in the case of one particular post on which the board’s report focused. Facebook “declined to provide the remaining information requested by the board”.