Facebook responds to Guardian’s revelations on content moderation

Society is still “figuring out” what is acceptable and what is harmful to share online, and Facebook “can play an important part of that conversation”, said Monika Bickert, the company’s head of global policy management. Her statement follows The Guardian’s publication yesterday of internal documents revealing how the company moderates issues such as hate speech, terrorism and self-harm on its platform.

The leaked documents included internal training manuals, spreadsheets and flowcharts. The paper said that new challenges such as “revenge porn” had overwhelmed Facebook’s moderators, who often have just seconds to make a decision. The social media company reviews more than 6.5 million reports of potentially fake accounts a week, the newspaper added. Many of the company’s content moderators are said to have concerns about the inconsistency and peculiar nature of some of its policies.

The policies on sexual content, for example, are the most complex and confusing, the Guardian reported. The newspaper also gave the example of a Facebook policy that allows people to live-stream attempts to self-harm because the company “doesn’t want to censor or punish people in distress”. Facebook moderators were recently told to “escalate” to senior managers any content related to 13 Reasons Why, the Netflix original drama series based on the suicide of a high school student, because the company feared it could inspire copycat behaviour.

Updated 16.05: European Union ministers have approved plans to make social media companies such as Facebook, Twitter and Google’s YouTube tackle hate speech on their platforms, the first legislation at EU level on the issue. Companies will have to take measures to prevent the proliferation of hate speech, incitement to hatred and content justifying terrorism, including establishing mechanisms for users to flag such content. The proposals still need to be agreed with the European Parliament before becoming law, but the Parliament’s legislators have indicated support.

In a blog post today, Facebook’s Monika Bickert said: “Reviewing online material on a global scale is challenging and essential. As the person in charge of doing this work for Facebook, I want to explain how and where we draw the line. On an average day, more than a billion people use Facebook. They share posts in dozens of languages: everything from photos to live videos. A very small percentage of those will be reported to us for investigation. The range of issues is broad – from bullying and hate speech to terrorism – and complex. Designing policies that both keep people safe and enable them to share freely means understanding emerging social issues and the way they manifest themselves online, and being able to respond quickly to millions of reports a week from people all over the world.

“For our reviewers, there is another hurdle: understanding context. It’s hard to judge the intent behind one post, or the risk implied in another. Someone posts a graphic video of a terrorist attack. Will it inspire people to emulate the violence, or speak out against it? Someone posts a joke about suicide. Are they just being themselves, or is it a cry for help? In the UK, being critical of the monarchy might be acceptable. In some parts of the world it will get you a jail sentence. Laws can provide guidance, but often what’s acceptable is more about norms and expectations. New ways to tell stories and share images can bring these tensions to the surface faster than ever.

“We aim to keep our site safe. We don’t always share the details of our policies, because we don’t want to encourage people to find workarounds – but we do publish our Community Standards, which set out what is and isn’t allowed on Facebook, and why. Our standards change over time. We are in constant dialogue with experts and local organisations, on everything from child safety to terrorism to human rights. Sometimes this means our policies can seem counter-intuitive. As the Guardian reported, experts in self-harm advised us that it can be better to leave live videos of self-harm running so that people can be alerted to help, but to take them down afterwards to prevent copycats. When a girl in Georgia, USA, attempted suicide on Facebook Live two weeks ago, her friends were able to notify police, who managed to reach her in time.”

Read Monika Bickert’s post.