Facebook’s Oversight Board Has Spoken. But It Hasn’t Solved Much

The Facebook Oversight Board issued its first five decisions Thursday. The rulings are well thought out and show that the board members, charged with reviewing Facebook decisions to remove content and making recommendations on Facebook policies, take their job seriously. More than anything, though, they show the futility of moderating content across networks with more than 3 billion users—nearly half the people on earth.

The cases involve posts in five languages, and often, subtleties of meaning and interpretation. Two touch on deep-seated global conflicts: China’s oppression of Uighur Muslims and the ongoing border war between Armenia and Azerbaijan. We’ve long known that the vast majority—now approaching 90 percent—of Facebook users are outside the US, but the breadth of these cases drives home the magnitude of Facebook’s challenge.

Facebook has touted automation as one solution to that challenge, but these cases also highlight the shortcomings of algorithms. In one, Facebook’s automated systems removed an Instagram post in Portuguese from a user in Brazil showing bare breasts and nipples. But the post was an effort to raise awareness about breast cancer, an exception to Facebook’s general policy against nudity, and an issue that has bedeviled Facebook for a decade. To its credit, Facebook restored the post before the Oversight Board heard the case; but it still underscores problems with letting algorithms do the work. In the other case, involving a quote purportedly from Nazi propaganda chief Joseph Goebbels, Facebook’s memory feature had actually recommended that the user recirculate a post from two years earlier. The older post had presumably been allowed to remain, raising questions about the consistency of Facebook’s standards for reviewing content.

Facebook announced the creation of the board in 2018, after years of criticism about its role in fomenting ethnic hatred, political misinformation, and other evils. It took almost two years to assemble the 20 members, whose rulings on specific pieces of Facebook content are supposed to be binding.

In a statement Thursday, Monika Bickert, Facebook’s vice president for content policy, said the company would follow the board’s decisions to restore four items, including the Instagram post from Brazil. The board also suggested changes in Facebook policies, which the company is supposed to reply to within 30 days. Bickert said the recommendations “will have a lasting impact on how we structure our policies.”

In one case, though, she left some doubt. The board recommended that Facebook inform users when their content is removed by an algorithm, and allow for appeals. Bickert said the company expects to take longer than 30 days to respond to this recommendation.

Thursday’s cases may have been relatively easy ones. Coming soon: the politically fraught decision of whether to restore Donald Trump’s account, which is sure to anger a bloc of Facebook users (and employees) no matter how it is decided. Facebook punted that decision to the board last week.

Taken together, the cases decided Thursday reveal the scale of Facebook’s challenge. Social media management company Social Report estimated in 2018 that Facebook users post 55 million status updates and 350 million photos every day; they send 9 million messages an hour and share 3 million links.

A decision on any one of those posts can be enormously complex. In October, a user in Myanmar, writing in Burmese, posted photographs of a Syrian Kurdish child who drowned attempting to reach Europe in 2015, and contrasted the reaction to the photo to what the user said was a “lack of response by Muslims generally to the treatment of Uighur Muslims in China.”
