Meta said that in the lead-up to the outbreak of violence on May 10, a “global technical glitch” occurred, preventing users from re-sharing posts, including posts about Israel and Palestine. Miranda Sissons, global human rights director for Meta, said that this was “not intentional or targeted but a global error that affected tens of millions of people.”
Shortly after post re-sharing was blocked, the Al-Aqsa hashtag, which pertained to a mosque at the centre of the crisis, was also blocked by a content reviewer, Meta said. Sissons noted that the reviewer who made the error is “also human”, and that the hashtag block was lifted once the company was made aware of the issue.
There was an under-enforcement of Hebrew content and an over-enforcement of Arabic content throughout the crisis. Palestinian journalists reported that their WhatsApp accounts had been blocked, which again was explained as unintentional and rectified after Meta was notified.
Arabic content also received violations at a significantly higher rate than Hebrew content, which could be attributed to Meta’s policies around legal obligations relating to designated foreign terrorist organisations, the company said.
“We are a US company that has to comply with US law,” Meta said in its own report.
Users were also given false strikes after posts were wrongly removed for violating content policies, leading to significantly lower visibility and engagement. The human rights consequences were severe: rights including freedom of expression and freedom of association were curtailed, with journalists and activists especially impacted, the BSR report found.