Facebook’s self-styled ‘oversight’ board selects first cases, most dealing with hate speech

A Facebook-funded body that the tech giant set up to distance itself from tricky and potentially reputation-damaging content moderation decisions has announced the first bundle of cases it will consider.

In a press release on its website, the Facebook Oversight Board (FOB) says it sifted through more than 20,000 submissions before settling on six cases — one of which was referred to it directly by Facebook.

The six cases it’s chosen to start with are:

Facebook submission: 2020-006-FB-FBR

A case from France in which a user posted a video and accompanying text to a COVID-19 Facebook group. The post relates to claims about the French agency that regulates health products “purportedly refusing authorisation for use of hydroxychloroquine and azithromycin against COVID-19, but authorising promotional mail for remdesivir”, with the user criticizing the lack of a health strategy in France and stating that “[Didier] Raoult’s cure” is being used elsewhere to save lives. Facebook says it removed the content for violating its policy on violence and incitement. The video in question garnered at least 50,000 views and 1,000 shares.

The FOB says Facebook indicated in its referral that this case “presents an example of the challenges faced when addressing the risk of offline harm that can be caused by misinformation about the COVID-19 pandemic”.

User submissions:

Of the five user submissions the FOB selected, the majority (three cases) relate to hate speech takedowns.

One case apiece relates to Facebook’s nudity and adult content policy and to its policy on dangerous individuals and organizations.

See below for the Board’s descriptions of the five user-submitted cases:

  • 2020-001-FB-UA: A user posted a screenshot of two tweets by former Malaysian Prime Minister, Dr Mahathir Mohamad, in which the former Prime Minister stated that “Muslims have a right to be angry and kill millions of French people for the massacres of the past” and “[b]ut by and large the Muslims have not applied the ‘eye for an eye’ law. Muslims don’t. The French shouldn’t. Instead the French should teach their people to respect other people’s feelings.” The user did not add a caption alongside the screenshots. Facebook removed the post for violating its policy on hate speech. The user indicated in their appeal to the Oversight Board that they wanted to raise awareness of the former Prime Minister’s “horrible words”.
  • 2020-002-FB-UA: A user posted two well-known photos of a deceased child lying fully clothed on a beach at the water’s edge. The accompanying text (in Burmese) asks why there is no retaliation against China for its treatment of Uyghur Muslims, in contrast to the recent killings in France relating to cartoons. The post also refers to the Syrian refugee crisis. Facebook removed the content for violating its hate speech policy. The user indicated in their appeal to the Oversight Board that the post was meant to disagree with people who think that the killer is right and to emphasise that human lives matter more than religious ideologies.

  • 2020-003-FB-UA: A user posted alleged historical photos showing churches in Baku, Azerbaijan, with accompanying text stating that Baku was built by Armenians and asking where the churches have gone. The user stated that Armenians are restoring mosques on their land because it is part of their history. The user said that the “т.а.з.и.к.и” are destroying churches and have no history. The user stated that they are against “Azerbaijani aggression” and “vandalism”. The content was removed for violating Facebook’s hate speech policy. The user indicated in their appeal to the Oversight Board that their intention was to demonstrate the destruction of cultural and religious monuments.

  • 2020-004-IG-UA: A user in Brazil posted a picture on Instagram with a title in Portuguese indicating that it was to raise awareness of signs of breast cancer. Eight photographs within the picture showed breast cancer symptoms with corresponding explanations of the symptoms underneath. Five of the photographs included visible and uncovered female nipples. The remaining three photographs included female breasts, with the nipples either out of shot or covered by a hand. Facebook removed the post for violating its policy on adult nudity and sexual activity. The post has a pink background, and the user indicated in a statement to the Oversight Board that it was shared as part of the national “Pink October” campaign for the prevention of breast cancer.

  • 2020-005-FB-UA: A user in the US was prompted by Facebook’s “On This Day” function to reshare a “memory” in the form of a post that the user made two years ago. The user reshared the content. The post (in English) is an alleged quote from Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany, on the need to appeal to emotions and instincts, instead of intellect, and on the unimportance of truth. Facebook removed the content for violating its policy on dangerous individuals and organisations. The user indicated in their appeal to the Oversight Board that the quote is important as the user considers the current US presidency to be following a fascist model.

Public comments on the cases can be submitted via the FOB’s website — but only for seven days (closing at 8:00 Eastern Standard Time on Tuesday, December 8, 2020).

The FOB says it “expects” to decide on each case — and “for Facebook to have acted on this decision” — within 90 days. So the first ‘results’ from the FOB, which only began reviewing cases in October, are almost certainly not going to land before 2021.

Panels of five FOB members — including at least one from the region “implicated in the content” — will be responsible for deciding whether the specific pieces of content in question should stay down or be put back up.

Facebook’s outsourcing of a fantastically tiny subset of content moderation considerations to a subset of its so-called ‘Oversight Board’ has attracted plenty of criticism (including inspiring a mirrored unofficial entity that dubs itself the Real Oversight Board) — and no little cynicism.

Not least because it’s entirely funded by Facebook; structured as Facebook intended it to be structured; and with members chosen via a system devised by Facebook.

If it’s radical change you’re looking for, the FOB is not it.

Nor does the entity have any power to change Facebook policy — it can only issue recommendations (which Facebook can choose to entirely ignore).

Its remit does not extend to being able to investigate how Facebook’s attention-seeking business model influences the types of content being amplified or depressed by its algorithms, either.

And the narrow focus on content takedowns — rather than content that’s already allowed on the social network — skews its purview, as we’ve pointed out before.

So you won’t find the board asking tough questions about why hate groups continue to flourish and recruit on Facebook, for example, or robustly interrogating how much succour its algorithmic amplification has gifted to the anti-vaxx movement. By design, the FOB is focused on symptoms, not on the nation-sized platform ill that is Facebook itself. Outsourcing a fantastically tiny subset of content moderation decisions can’t signify anything else.

With this Facebook-commissioned pantomime of accountability, the tech giant will be hoping to generate a helpful pipeline of distracting publicity — focused around specific and ‘nuanced’ content decisions — deflecting plainer but harder-hitting questions about the exploitative and abusive nature of Facebook’s business itself, and the lawfulness of its mass surveillance of Internet users, as lawmakers around the world grapple with how to rein in tech giants.

The company wants the FOB to reframe discussion about the culture wars (and worse) that Facebook’s business model fuels as a societal problem — pushing a self-serving ‘fix’ for algorithmically fuelled societal division in the form of a few hand-picked professionals opining on individual pieces of content, leaving it free to continue defining the shape of the attention economy on a global scale. 
