Social media sites ‘not doing enough’ to prevent spread of self-harm content

A survey of social media users by Samaritans and Swansea University found that 83% of those asked had been recommended self-harm content without searching for it.
The survey showed that 76% of those who had seen self-harm or suicide content said they went on to harm themselves more severely because of it (PA)

Social media platforms are not doing enough to prevent users, in particular young people, from seeing and being affected by self-harm and suicide content, a new study says.

The researchers noted that the study used a social media campaign to encourage people to take an online survey, and that this may have affected the outcome, as people with experience of self-harm and suicide would have been more likely to take part. Even so, they said the findings highlighted how damaging such content can be, particularly to vulnerable young people.

The survey showed that 76% of those who had seen self-harm or suicide content said they went on to harm themselves more severely because of it.

In addition, it found that three-quarters of those who took part had seen self-harm content online for the first time aged 14 or younger, with the charity urging the platforms to do more now to protect their users rather than waiting for regulation to be forced upon them.

The vast majority of those asked (88%) said they wanted more control over filtering the content they see on social media, while 83% said they believe that more specific trigger warnings, such as using terms like self-harm or suicide within content warnings, would be helpful to them.

“We would never stand for people pushing this kind of material uninvited through our letterbox, so why should we accept it happening online?” Samaritans chief executive Julie Bentley said.

“Social media sites are simply not doing enough to protect people from seeing clearly harmful content and they need to take it more seriously.

“People are not in control of what they want to see because sites aren’t making changes to stop this content being pushed to them and that is dangerous.

“Sites need to put in more controls, as well as better signposting and improved age restrictions.”

Professor Ann John, from Swansea University and co-lead on the study, said more research on the subject was needed to get a clearer picture of the national impact of such content but said it was clearly damaging to many people.

“While our study cannot claim to represent the whole population’s experience of this content since only those interested would have responded to our requests, many of the themes point clearly to ways social media platforms can improve,” she said.

“People want more control over the content they view, ways to ensure children meet age requirements and co-produced safety features and policies. That all seems very doable.”
