Download document: Implementing user-friendly reporting processes for self-harm and suicide content (PDF, 138.1 KB)
This page provides guidance for all sites and platforms hosting user-generated content on implementing user-friendly reporting processes. All sites and platforms should ensure that users accessing their site can easily report concerning self-harm and suicide content, and the behaviour of other users that worries them. This helps platforms limit the potential for harm by identifying and responding to content promptly.
Note: This section refers to ‘reporting’ as this term is widely understood. However, language around reporting should be considered carefully for self-harm and suicide content: ‘report’ can suggest wrongdoing and users may worry about ‘reporting’ a person in distress for fear of getting them into trouble. This may cause delays in vulnerable users receiving help. Alternative language could include ‘flagging’, ‘raising concerns’ or ‘I’m worried’.
Sites and platforms should provide clear and accessible information to users about reporting concerning content on their site, including suicide and self-harm content. This should be clearly displayed to new users, and existing users should be regularly reminded of it to encourage them to make reports. All information should be in plain language, and changes or updates should be communicated transparently. This encourages users to report material that concerns them proactively and responsibly, and emphasises that inappropriate content is taken seriously.

This information should include:

- an explanation of what constitutes inappropriate content, and what action will be taken if content breaking community rules is found on the platform
- why it is important to report concerning content, how to make a report and what happens afterwards
Reporting processes will differ depending on the size, purpose and functionality of a site or platform.
As a minimum, all sites and platforms hosting user-generated content should have a dedicated email address or reporting form that users can access to flag concerns about self-harm and suicide content or user behaviour.
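As an illustration only (the guidance above does not prescribe any particular implementation), a minimal intake for such a reporting form might capture what is being flagged and the reporter's concern, then return a reference the reporter can quote when following up. All names, categories and the reference format here are hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from itertools import count

# Hypothetical concern categories; a real platform should define its own,
# using non-judgemental wording such as "I'm worried about someone".
CATEGORIES = {"self_harm_content", "suicide_content", "worried_about_user"}

_ref_counter = count(1)  # simple in-memory reference sequence (illustrative)

@dataclass
class Report:
    """A single user report about self-harm or suicide content."""
    category: str
    content_url: str
    description: str
    reference: str
    received_at: str

def submit_report(category: str, content_url: str, description: str) -> Report:
    """Validate and record a report, returning it with a reference number."""
    if category not in CATEGORIES:
        raise ValueError(f"unknown category: {category!r}")
    return Report(
        category=category,
        content_url=content_url,
        description=description,
        reference=f"SR-{next(_ref_counter):06d}",
        received_at=datetime.now(timezone.utc).isoformat(),
    )
```

In practice the same intake could sit behind both the reporting form and the dedicated email address, so that every concern enters one review queue regardless of how it was raised.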
Larger sites and platforms should consider having more sophisticated reporting functions, such as:
Effectively reviewing user reports about suicide and self-harm and quickly removing harmful content will limit its audience, reducing its impact. Promptly responding to reports that indicate a user has harmed themselves or is in urgent need of help means information can be quickly passed to appropriate support.
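One way to support prompt responses (purely illustrative; the guidance does not mandate any particular mechanism) is to triage the review queue so that reports indicating imminent risk are surfaced before routine content reports, with arrival order preserved within each urgency level. The urgency labels below are assumptions:

```python
import heapq

# Hypothetical urgency levels: lower number = reviewed first.
URGENCY = {"imminent_risk": 0, "harmful_content": 1, "other_concern": 2}

class ReviewQueue:
    """Priority queue surfacing the most urgent reports first,
    falling back to first-in-first-out within the same urgency."""

    def __init__(self) -> None:
        self._heap: list[tuple[int, int, str]] = []
        self._seq = 0  # tie-breaker that preserves arrival order

    def add(self, urgency: str, report_id: str) -> None:
        heapq.heappush(self._heap, (URGENCY[urgency], self._seq, report_id))
        self._seq += 1

    def next_report(self) -> str:
        return heapq.heappop(self._heap)[2]

queue = ReviewQueue()
queue.add("harmful_content", "R1")
queue.add("imminent_risk", "R2")
queue.add("other_concern", "R3")
print(queue.next_report())  # the imminent-risk report R2 comes out first
```

However the queue is implemented, imminent-risk reports should also trigger the escalation route described below, not just faster moderation review.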
To ensure user reports are effectively reviewed and responded to, sites must have:
At a minimum, sites and platforms must provide the following to all users who have reported content about self-harm or suicide:
If a report indicates a user is at risk of imminent harm, the following should be provided to the person making the report:
Download our information sheet about implementing user-friendly reporting on sites and platforms: Implementing user-friendly reporting processes for self-harm and suicide content (PDF, 138.1 KB)