Facebook not willing to delete videos of violent death, self-harm and more

22 May 2017

With Facebook under pressure from world leaders to scrub controversial content, The Guardian newspaper has revealed the internal guidelines Facebook employees use to decide what to delete.

Facebook will allow users to live-stream self-harm attempts because it ''doesn't want to censor or punish people in distress who are attempting suicide'', according to the leaked documents.

The footage will be removed ''once there's no longer an opportunity to help the person'' – unless the incident was especially newsworthy.

One document says: ''Removing self-harm content from the site may hinder users' ability to get real-world help from their real-life communities.

''Users post self-destructive content as a cry for help, and removing it may prevent that cry for help from getting through. This is the principle we applied to suicidal posts over a year ago at the advice of Lifeline and Samaritans, and we now want to extend it to other content types on the platform.''

Defending the social network's policy of leaving some suicide footage on the site, Facebook's head of global policy management, Monika Bickert, said: ''We occasionally see particular moments or public events that are part of a broader public conversation that warrant leaving this content on our platform.

''We work with publishers and other experts to help us understand what those moments are. For example, on September 11, 2001, bystanders shared videos of the people who jumped from the Twin Towers. Had those been live-streamed on Facebook, that might have been a moment in which we would not have removed the content, both during and after the broadcast.''