
Three-quarters of social media users see self-harm content online by age 14

A survey of more than 5,000 users found that 83% had self-harm content recommended to them on their personalised feeds

9th November 2022

Three-quarters of social media users surveyed had seen self-harm content for the first time by the age of 14, new research has found.

The research, conducted jointly by the Samaritans and Swansea University, also found that 83% of social media users were recommended self-harm content on their personalised feeds, such as Instagram’s “explore” and TikTok’s “for you” pages, without searching for it.

The findings were drawn from a national survey designed to assess views and experiences of the messaging and safety of social media platforms in relation to self-harm and suicide. It was completed by 5,294 people aged between 16 and 84, with an average age of 18.9 years.

Of those surveyed, 87% reported having self-harmed, with 211 reporting that they had never harmed themselves and 45 preferring not to answer.

The researchers also conducted focus groups with 10 social media users and in-depth interviews with 17.

When asked about the impact of seeing or sharing self-harm content online, over half of survey respondents reported that it depended on their mood at the time. More than a third reported a worsening of mood, and only 2% said that it improved their mood. Of those who responded to the survey, 77% said they had self-harmed in the same or similar ways “sometimes” or “often” after viewing self-harm imagery, while 76% said they had self-harmed more severely “sometimes” or “often” because of viewing self-harm content online.

Julie Bentley, the CEO of the Samaritans, said: “We would never stand for people pushing this kind of material uninvited through our letterbox so why should we accept it happening online? Social media sites are simply not doing enough to protect people from seeing clearly harmful content and they need to take it more seriously.

“People are not in control of what they want to see because sites aren’t making changes to stop this content being pushed to them and that is dangerous. Sites need to put in more controls, as well as better signposting and improved age restrictions.”

Content-specific warnings needed

The research is part of Samaritans’ Online Excellence programme, which aims to provide industry guidance to social media platforms and better understand the impact of self-harm and suicide content on people who use online spaces.

When asked about the changes they would like to see, 88% of respondents said they wanted more control over the content they see on social media, and 83% said that content-specific trigger warnings, such as “self-harm” or “suicide”, rather than a generic “sensitive” content warning, would have a more positive impact on them.

Although some sites have, since 2019, introduced measures such as blurring images, restricting posting and signposting sources of support, the Samaritans said there was still a long way to go. Bentley said: “The Online Safety Bill must become law as soon as possible to reduce access to all harmful content across all sites regardless of their size, and critically, make sure that this is tackled for both children and adults. We’re waiting anxiously for the Bill to return to the House of Commons after numerous delays, but there is nothing stopping platforms from making changes now.”

FCC Insight

We need to exercise some caution about the findings of the survey, as the revelation that 87% of respondents had self-harmed suggests a self-selecting sample. Even allowing for that, however, the findings are still troubling. Children and young people are being exposed to self-harm content online, even if they don’t actively seek it out. As the suicide of 14-year-old Molly Russell shows, teenagers are a particularly vulnerable group who are susceptible to the influence of content that actively promotes self-harm. Social media companies need to recognise the responsibility they have for hosting – and pushing – this kind of content to young people, and to take measures to stamp it out.