Social media companies are heavily criticised today by a House of Commons committee for failing to do enough to remove illegal and extremist material posted on their sites, and for not preventing it appearing in the first place.
The moderation policies of platforms such as Twitter, YouTube and Facebook have been under scrutiny after high-profile cases in which violent or abusive material was posted online and, in some instances, not removed even after the companies were notified. The committee’s report said it had found repeated examples of extremist material, including posts from banned jihadist and neo-Nazi groups, not being removed even after it had been reported.
“Social media companies’ failure to deal with illegal and dangerous material online is a disgrace,” said Yvette Cooper, chairwoman of parliament’s Home Affairs Select Committee. “They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse. Yet repeatedly they have failed to do so. It is shameful.”
The committee said the UK Government needed to strengthen the laws on publishing such material, and called on social media companies to pay towards the cost of policing online content and to publicly report details of their moderation efforts.
Responding to the report, the Government said it expected to see early and effective action from social media companies to develop the tools needed to identify and remove ‘terrorist propaganda’. “We have made it very clear that we will not tolerate the internet being used as a place for terrorists to promote their vile views, or use social media platforms to weaponise the most vulnerable people in our communities,” interior minister Amber Rudd said.