Safe design and content management in online suicide and self-harm prevention
1 in 5 people will have suicidal thoughts at some point in their lives.¹
This statistic highlights the urgent need for intervention, and with the average person in the UK spending over six hours online daily², digital platforms are in a unique position to help with suicide prevention.
With online regulation like the Online Safety Act and Digital Services Act becoming global priorities, it's more evident than ever that online services can either cause harm or offer essential support. Samaritans' Online Excellence Programme is committed to making digital environments suicide and self-harm safe by offering evidence-based insights to guide platforms in balancing user engagement with wellbeing.
Explore our guidance
Our suite of evidence-based guidance, designed in collaboration with people with lived experience of suicide and self-harm, industry experts, the Government, and academics, provides clear strategies for creating safer, more supportive platforms. These resources are designed to help digital platforms go beyond compliance and focus on best practice.
![Meeting](https://media.samaritans.org/images/headway-5QgIuuBxKwM-unsplash.2e16d0ba.fill-460x350.jpg)
Platforms have the power, capability and responsibility to ensure their users are safe and well, and this can make a massive difference to individuals who may well be seeking support on their platform: they can help those people, or they can make them feel worse.
Person with lived experience
Customised Advice and Training
We offer bespoke advice and tailored training solutions through our free, confidential Online Safety Advisory service.
We specialise in helping organisations align their policies and processes with current best practice, minimising risk while maximising user support. Our advisory service also connects you with people who have lived experience, providing meaningful insights into how those affected by suicide and self-harm navigate the internet.
Examples of our work
Policy reviews
Reviewing policies and communication materials to ensure best practice.
Managing content
Guidance on how to navigate the 'grey area' of legal but harmful content.
Supporting vulnerable users
Providing insights into why people might be posting harmful content and strategies for supporting them.
Collaboration
Working with platforms to produce educational materials and user-empowerment tools that help users keep themselves and others safe.
Training and resources
Developing resources and training for staff, including content moderators, policy, and trust and safety professionals.
References
- McManus S, Bebbington P, Jenkins R, Brugha T. (eds.) (2016). Mental health and wellbeing in England: Adult psychiatric morbidity survey 2014.
- We Are Social & Meltwater (2024). Digital 2024: United Kingdom. Retrieved from https://datareportal.com/reports/digital-2024-united-kingdom