Facebook said it will tighten its grip on content relating to suicide and self-harm in an effort to make the platform and its sister-site, Instagram, safer.
In a blog post, Facebook announced several policy changes that will affect how content relating to self-harm and suicide is treated once posted to its platform.
The company says it will 'no longer allow graphic cutting images to avoid unintentionally promoting or triggering self-harm.'
That policy will apply 'even when someone is seeking support or expressing themselves to aid their recovery,' the company said.
The new policy will also encompass images of healed self-inflicted cuts, which the company says it will temper with a 'sensitivity screen' that users must click through to access the underlying content.
Likewise, Instagram will begin to deprioritize content depicting self-harm, removing it from the Explore tab and excluding it from the company's suggestion algorithm.
To help promote healthy dialogue on suicide and self-harm, Facebook says it will also direct users to guidelines developed by the National Centre of Excellence in Youth Mental Health, ORYGEN, when they search for content relating to those topics.
The guidelines are meant to 'provide support to those who might be responding to suicide-related content posted by others or for those who might want to share their own feelings and experiences with suicidal thoughts, feelings or behaviors,' said Facebook.
According to Facebook, the changes come as the result of input from mental health professionals and experts in the field of suicide prevention.
In February, Facebook...