Facebook’s Oversight Board tackles hate speech and nudity in first cases

Facebook’s independent Oversight Board has announced the first cases on which it will deliberate, following the opening of user appeals in October.

First announced in 2018, the Board will have the final word on what content is permitted on Facebook and its subsidiary Instagram.

It was created in response to criticism from various groups, particularly with regard to Facebook’s handling of the Moscow-backed disinformation campaign in the run-up to the 2016 US presidential election.

Amongst the first tranche of cases being assessed by the Board – whose members include a former Danish Prime Minister alongside academics from respected law schools – is an appeal by an Instagram user whose photos showed female breasts in a post about breast cancer symptoms.

The content was originally removed for violating the social network’s rules on nudity, although Instagram later decided it had removed the post “in error” and reinstated it.

Since it first started taking cases, the Board has received more than 20,000 appeals from users, but has said it will need to prioritise cases “that have the potential to affect lots of users around the world”.

Other cases being considered include screenshots, posted to Facebook, of tweets by Dr Mahathir Mohamad, the former Malaysian Prime Minister, in which he said: “Muslims have a right to be angry and kill millions of French people for the massacres of the past”. This was originally removed for violating the policy on hate speech.

Another hate speech removal under consideration involves a post of two well-known photos of a deceased child lying fully clothed on a beach at the water’s edge. The accompanying text (in Burmese) asks why there is no retaliation against China for its treatment of Uighur Muslims, in contrast to the killings in France relating to cartoons.

The Board has removed any personally identifiable content from the original posts in order to protect the posters’ anonymity. Each case has been assigned to a five-member panel, including at least one member from the region the content relates to.

The Board expects to decide each case, and for Facebook to then act on that decision, within 90 days.

Facebook said: “Any decision they make on the content will be binding and we welcome any policy guidance related to it.”

The social network has been forced to clamp down on a raft of posts this year which spread misinformation related to the coronavirus pandemic.