The new guidelines set out how to safely manage self-harm and suicide content online
On World Suicide Prevention Day, Samaritans publishes guidelines to support sites and platforms hosting user-generated content on how to safely manage self-harm and suicide content and keep vulnerable people safe online.
The guidelines have been developed by Samaritans in collaboration with government, the world’s leading technology and social media companies including Facebook, Instagram, Google, YouTube, Twitter and Pinterest, academics and third sector organisations to help tackle the issue of harmful online content, whilst improving access to content that can be a source of support.
The suicide prevention charity is the first to bring together major global online companies operating in the UK to develop guidance that looks specifically at content relating to suicide and self-harm.
The internet is an integral part of our lives and an invaluable resource, but the challenge is making sure that online content is shared safely, particularly for those who are vulnerable. Whilst we have seen steps in the right direction over the last 18 months, we still think there is further to go, and we need all platforms and sites to take this seriously. The guidance isn’t just for large global companies: any company that hosts user-generated content needs to think about how this type of content may appear on their platform and how they will respond to it.

These industry standards will enable more effective removal of harmful content, whilst also improving access to and engagement with content that can be a source of support. We want all sites and platforms to recognise that self-harm and suicide content has the potential to cause serious harm, to understand what that means for their platform, to take action to minimise access to harmful content, and to create more opportunities for support in the online environment.

Global companies can also use these guidelines more broadly across the other countries they operate in and apply them to their global policies rather than just their UK ones.
Samaritans’ Assistant Director of Research and Influencing, Jacqui Morrissey
Research from Samaritans and the University of Bristol* found that young people use the online environment to inform self-harm and suicide attempts, showing an urgent need to protect vulnerable people and help them to access support. The study found that at least a quarter of young people who present to hospital after self-harming with suicidal intent had used the internet in connection with their attempt.
Tara Hopkins, Head of Public Policy at Instagram EMEA said: “Suicide and self-harm are complex and can have devastating consequences, so we want to do everything we can to keep people who use our platforms safe. We try to strike the delicate balance of giving people the space to talk about mental health, while protecting others from being exposed to harmful content. That’s why we’re grateful to partners like Samaritans for helping us develop our existing content policies, and welcome these new guidelines to help the tech industry address these issues as sensitively and effectively as possible.”
Sarah Butterfield Bromma, Head of Policy at Pinterest said: “We want Pinterest to be a place for inspiration and that means we need to be deliberate about creating a safe and positive space for our users. We’ve worked on improving our policies around self-harm content and providing compassionate support for those in need. Working closely with experts such as Samaritans helps us to find even more ways that we can connect people in distress with supportive resources and learn more about how we might improve our policies and enforcement.”
Mental health and suicide prevention minister, Nadine Dorries said: “We have seen over the last few months how social media has the power to bring us together and connect us. However, all too often those who are vulnerable can easily access damaging content that promotes self-harm and suicide.
“Online platforms hosting user-generated content have a serious responsibility to help tackle the issue of harmful online content, while also working to improve access to content that can offer support, and I would urge them to take immediate action by implementing this incredibly helpful guidance.”
Samaritans’ expertise in suicide prevention has informed the guidelines, which are made up of best practice principles to help platforms create safer online spaces for vulnerable people, allowing users to access the benefits of the online environment whilst minimising the potential for harm.
Samaritans is also launching an online harms advisory service, providing specialist support and advice for all sites and platforms hosting user-generated content on managing self-harm and suicide content on their platforms.
The guidelines and advisory service form part of Samaritans’ Online Harms Programme, a three-year funded programme of work led by Samaritans in collaboration with the Department of Health and Social Care, Facebook, Instagram, Google, YouTube, Twitter and Pinterest.
The programme also includes a research and insight workstream, which aims to understand more about what makes self-harm and suicide content harmful and for whom, along with the development of resources for staying safe online.
For more information or interview requests, please contact the Samaritans Press Team at [email protected] or 020 8394 8300.
Notes to editors
*Biddle et al., 2016
The guidelines have been developed in collaboration with tech platforms, government, academics and third sector organisations.
The guidelines will be continuously updated to reflect emerging themes and technologies.
Samaritans’ Online Harms Advisory Service is available Monday to Friday from 9am to 5pm.
Read our guidelines on how to safely manage self-harm and suicide content online
Visit our new guidelines for the technology industry here