More than three-quarters of people surveyed first saw self-harm content online when they were 14 or younger, with some as young as 10
Social media sites are still not doing enough to tackle self-harm content being pushed to users, says Samaritans.
The warning comes as new research from the charity and Swansea University found 83% of social media users surveyed were recommended self-harm content on their personalised feeds, such as Instagram’s ‘explore’ and TikTok’s ‘for you’ pages, without searching for it.
Worryingly, 76% said they had harmed themselves more severely because of viewing self-harm content online. While it’s vital that the internet is made a safer space for people of all ages, the research highlighted that more than three-quarters of participants saw self-harm content online for the first time aged 14 or younger.
Samaritans CEO Julie Bentley said: “We would never stand for people pushing this kind of material uninvited through our letterbox, so why should we accept it happening online? Social media sites are simply not doing enough to protect people from seeing clearly harmful content, and they need to take it more seriously.
“People are not in control of what they see because sites aren’t making changes to stop this content being pushed to them, and that is dangerous. Sites need to put in more controls, as well as better signposting and improved age restrictions.”
The research is part of Samaritans’ Online Excellence programme, which provides industry guidance to platforms and seeks to better understand the impact of self-harm and suicide content on people who use online spaces. Almost 5,300 people took part in the survey, and 5,000 of those had experience of self-harm and suicide.
When asked about changes they would like to see, 88% said they want more control over the content they see on social media, and 83% said that content-specific trigger warnings such as ‘self-harm’ or ‘suicide’, rather than a generic ‘sensitive’ content warning, would have a more positive impact on them.
Since 2019, some sites have changed their policies on self-harm and suicide, introducing image blurring, restrictions on posting, signposting, and help-seeking messages. Whilst improvements have been made, there is still a long way to go.
Julie added: “The Online Safety Bill must become law as soon as possible to reduce access to all harmful content across all sites regardless of their size and, critically, make sure that this is tackled for both children and adults. We’re waiting anxiously for the Bill to return to the House of Commons after numerous delays, but there is nothing stopping platforms from making changes now.
“The internet moves much quicker than any legislation so platforms shouldn’t wait for this to become law before making vital changes that could save lives.”
Professor Ann John, co-lead on the research at Swansea University, said: “While our study cannot claim to represent the whole population’s experience of this content since only those interested would have responded to our requests, many of the themes point clearly to ways social media platforms can improve. People want more control over the content they view, ways to ensure children meet age requirements and co-produced safety features and policies. That all seems very doable.”
Research has shown that the internet can be an invaluable source of support, hosting helpful resources and ways for people to connect with others who share their experiences, so they feel less alone.
Samaritans want social media sites to create safe spaces where helpful conversations can take place, by giving people guidance on how to talk about these topics online in a safe way, whilst protecting both themselves and others.
The charity is calling for all sites and platforms to give people more control over the content they see, ensure that suicide and self-harm content is never pushed to users and improve the support available.
The preliminary report findings can be found here, with the full research due to be published by Swansea University in the coming months.