Meta discriminated against Palestinians in May 2021 war, report finds

After backlash over the over- and under-enforcement of Arabic content, consultants were asked to conduct a thorough examination of Meta's 'human rights impact on the rights of Palestinian users'

Oshrit Gan-El, Hadas Bar-Ad
Meta Platforms (formerly Facebook) published a report it commissioned from an external group, which found that the company adversely affected the rights of Palestinian users and harmed their freedom of expression and assembly during the May 2021 war in Gaza.

Meta asked Business for Social Responsibility (BSR), an independent organization with expertise in human rights, "to conduct a due diligence exercise into the impact of our policies and processes in Israel and Palestine during the May 2021 escalation, including an examination of whether these policies and processes were applied without bias," the company said.
Riots during Operation Guardian of the Walls (Photo: AFP, Shutterstock)
    BSR made 21 specific recommendations as a result of its due diligence, of which Meta committed to implementing 10, partly implementing four, and assessing the feasibility of another six.
Critics had accused Meta, on the one hand, of censoring and silencing certain users and perspectives, and, on the other, of insufficient moderation that left inciteful content on its platforms during the fighting.
    “Meta’s actions in May 2021 appear to have had an adverse human rights impact … on the rights of Palestinian users to freedom of expression, freedom of assembly, political participation, and non-discrimination, and therefore on the ability of Palestinians to share information and insights about their experiences as they occurred,” the report reads.
Meta (Photo: Reuters)
    It emphasized that "Meta’s actions during the events of May 2021 cannot be viewed in isolation but must be understood in the context of the ongoing conflict in Israel and Palestine," and that its "role is not to arbitrate this conflict, but rather to generate and enforce policies to mitigate the risk that its platforms aggravate it by silencing voices, reinforcing power asymmetries, or allowing the spread of content that incites violence."
In its official response, Meta committed to closely monitor "potential over- and under-enforcement" after the data reviewed indicated greater over-enforcement of Arabic content.
The company explained that some of the blocked hashtags had been found to violate more than one Meta policy, but said it would make public the amount of content removed at the request of governments even when no laws were broken.
    One key example highlighted by BSR was that "#AlAqsa" was added to a hashtag block list by an employee in Meta’s Outsourced Services, resulting in #AlAqsa being hidden from search results. This censored many posts referring to the Al-Aqsa Mosque, one of the holiest sites in Islam, which were not necessarily offensive.
Al-Aqsa Mosque (Photo: Shutterstock)
Nonetheless, BSR attributed Meta's conduct to "unintentional bias" and said it "found no evidence of racial, ethnic, nationality or religious animus in governing teams."
The report also noted that antisemitic content was not always removed from the platforms, but because Meta's systems do not differentiate between different types of hateful speech, the company could not provide official data on how such content was dealt with.
BSR said that in some cases content was removed without cause, while in other cases it remained on the platforms because of a lack of enforcement. It noted that Arabic-language content was more often removed and Hebrew-language content was subject to less enforcement, although content in both languages was subject to both over- and under-enforcement.
Among the reasons BSR gave for the over-enforcement were Meta's legal obligations as an American company regarding designated foreign terror organizations, and moderation tools that were not necessarily sensitive to differences in Arabic dialects, which may have led to over-enforcement of Arabic-language content. The report also noted a shortage of Arabic-speaking moderators able to discern such differences.
Miranda Sissons, Meta's Director of Human Rights, said the company has broad-reaching policies regarding dangerous organizations, states and events, as well as on incitement, hate speech, violence, and online bullying and harassment.
But in response to a question from Ynet, Sissons said Meta must also contend with conflicting regulations and international standards.