This page provides guidance for companies hosting user-generated content on reducing access to harmful self-harm and suicide content on their platforms. Online discussion of self-harm and suicide plays an important role in helping many users feel accepted, seek help and recover. However, it also carries risks by presenting opportunities to access graphic content, details of methods of harm and content that glorifies or promotes self-harm and suicide. Sites and platforms must proactively limit access to harmful content using safe and sensitive approaches, and establish measures to protect users from harm.
Harmful self-harm and suicide content encourages users to hurt themselves or provides information about how to do so. All sites and platforms must take a proactive role in reducing access to potentially harmful content on their site. While one piece of content may not be harmful in isolation, engaging with high volumes of self-harm and suicide content may have a substantial impact on user wellbeing.
For detailed information on what constitutes harmful content and the risks associated with self-harm and suicide content, see our information page: Understanding self-harm and suicide content.
Sites should consider:
- Developing inbuilt safety functions allowing users to proactively block content based on certain themes, users, keywords and tags
- Ensuring site algorithms don’t push self-harm and suicide content towards users. For example, sites and platforms that make suggestions based on previous browsing should disable this functionality for self-harm and suicide content
- Reviewing autocomplete searches for terms and phrases relating to self-harm and suicide. Autocomplete searches should be turned off for harmful searches, such as methods of harm and associated equipment
- Reviewing automated suggestions for terms relating to self-harm and suicide, such as related topics or tags
- Prioritising helpful content in search results and burying harmful content further down search result rankings, making it more difficult to access and reducing the likelihood of it being seen. This could include having pop-ups or fixed boxes containing support content
- Blocking harmful search terms relating to self-harm and suicide and redirecting users to support options (a minimal sketch of this approach follows this list). This is particularly important for terms that are known to be harmful, such as those relating to methods of suicide, or searches for websites that are known to host harmful content
- Using age or sensitivity content warnings. This could include alerts over content that either check a user’s age or warn users that the content may be distressing because it mentions self-harm or suicide
- Reviewing link functionality. If users are posting links to external sites hosting harmful content, platforms should consider removing link functionality or adding safety messages to disrupt the user journey
- Developing parental controls allowing parents to filter the content their child accesses
- Incorporating nudge techniques that encourage users to navigate away from content that could be harmful (a sketch of a time-based nudge follows this list). More information about nudge techniques can be found in the Information Commissioner’s Office’s (ICO) Age Appropriate Design Code. Examples of nudge techniques include:
- Pop-up messages when users are actively seeking self-harm and suicide content, requiring them to pause, read about support services, and click again in order to continue browsing
- Warning messages when users have been browsing potentially harmful content for a long time, suggesting they take a break
- Features that allow users to limit the time spent browsing
- Prompts reminding users to report content that worries them
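To illustrate the search-blocking and signposting point above, the sketch below shows one way a platform might intercept known harmful search terms and surface support options instead of results. It is a minimal illustration under stated assumptions, not a recommended implementation: the `BLOCKED_TERMS` set, `SUPPORT_RESOURCES` list, `SearchResponse` class and `handle_search` function are hypothetical names invented for this example, and a real blocklist would be developed and regularly reviewed with clinical and lived-experience expertise rather than hard-coded.

```python
# Minimal sketch: block known harmful search terms and signpost support.
# All names, terms and messages here are illustrative placeholders.
from dataclasses import dataclass, field

# Placeholder terms standing in for a properly maintained, expert-reviewed blocklist.
BLOCKED_TERMS = {"placeholder harmful term", "another placeholder term"}

# Placeholder support messaging; real messaging would point to appropriate local services.
SUPPORT_RESOURCES = ["If you are struggling, support is available - see our support page."]


@dataclass
class SearchResponse:
    blocked: bool
    results: list = field(default_factory=list)
    support_messages: list = field(default_factory=list)


def handle_search(query: str, run_search) -> SearchResponse:
    """Return support messaging instead of results for blocked queries."""
    normalised = query.lower().strip()
    if any(term in normalised for term in BLOCKED_TERMS):
        # Blocked query: no results, only signposting to support options.
        return SearchResponse(blocked=True, support_messages=SUPPORT_RESOURCES)
    # Non-blocked query: return results, but still attach support messaging.
    return SearchResponse(
        blocked=False,
        results=run_search(normalised),
        support_messages=SUPPORT_RESOURCES,
    )
```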
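Similarly, the nudge techniques listed above can be as simple as tracking how long a user has been viewing sensitive content and prompting them to pause. The sketch below assumes a hypothetical `SensitiveBrowsingSession` helper and an illustrative 20-minute threshold; the actual threshold, wording and delivery would need to be decided with safeguarding expertise.

```python
# Minimal sketch of a time-based nudge: after sustained viewing of sensitive
# content, prompt the user to take a break and point to support options.
# The threshold and wording are illustrative assumptions, not recommendations.
import time

BREAK_PROMPT_AFTER_SECONDS = 20 * 60  # illustrative threshold only


class SensitiveBrowsingSession:
    def __init__(self):
        self.started_at = None

    def record_view(self, content_is_sensitive: bool):
        """Return a nudge message when sensitive browsing passes the threshold."""
        if not content_is_sensitive:
            self.started_at = None  # reset once the user moves on
            return None
        if self.started_at is None:
            self.started_at = time.monotonic()
        elapsed = time.monotonic() - self.started_at
        if elapsed >= BREAK_PROMPT_AFTER_SECONDS:
            self.started_at = time.monotonic()  # avoid repeating the prompt immediately
            return (
                "You've been viewing this kind of content for a while. "
                "Would you like to take a break? Support options are available."
            )
        return None
```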
There will be times when it is necessary for platforms to remove user-generated content from their site because it has the potential to cause harm and breaks community guidelines.
When doing this, consideration should be given both to users who have viewed the content and to the user who posted it. For the person posting, it could be the first time they have reached out for support or tried to let people know what they’re experiencing. Removing their content may cause significant distress and feelings of stigmatisation, and can limit the support they receive from other users. For these reasons, companies must use safe and sensitive approaches when removing content.
Tips for removing content safely:
- Contact the user to explain why their content has been removed, making reference to the community guidelines. Including a link to the guidelines can remind the user of the rules of the platform
- Empathise. Use a kind, empathetic and non-judgemental tone. Remember the user might be really struggling and in need of support
- Provide guidance on how they can edit their content to post safely, for example by removing details of methods of harm, so they can continue to post
- Provide signposting to relevant support options
Pausing membership and closing accounts
If users repeatedly break community guidelines on posting self-harm and suicide content, companies may decide to pause their membership or close their account. When doing this, companies should be mindful that removing access to an online community could withdraw a vital source of support for the user. It could also push the user towards less moderated spaces for support, which may host more harmful content.
If pausing a user’s membership or closing an account, companies should:
- Consider whether it would resolve the issue. If users can simply open a new account, withdrawing access might not be the answer
- Remember that the user could be experiencing high levels of distress. Try to use a supportive and understanding tone and approach in communications
- Provide a warning first. Consider providing a warning explaining how they have broken the community guidelines and the importance of these rules in keeping users safe. Guidance should also be provided on how the user can post about their experiences in a safe way
- Explain why. Give the user a clear explanation of why their account is being paused or closed, referring to the community guidelines
- Provide signposting to relevant support options
- Tell the user what happens next such as when their membership will be reviewed or information about how to appeal the decision
Download our information sheet for approaches to reducing user exposure to harmful self-harm and suicide content.