Instagram brings enhanced self-harm content detection tools to the UK

New moderation tools are able to proactively spot self-harm content and automatically make it less visible in the app

Instagram is introducing new technology to its app in Europe that is able to better identify suicide and self-harm content which breaks the app’s rules.

The new moderation tools are able to more proactively spot self-harm content and automatically make it less visible in the app, and in some cases remove it completely after 24 hours if the machine learning is confident it breaks the site’s rules.

The feature is already used on Facebook and Instagram outside of the EU, where it includes an additional layer: once spotted, posts are referred to human reviewers, who can then take further action such as connecting the poster to local help organisations and, in the most severe cases, calling emergency services.

However, Instagram confirmed these referral aspects are not yet ready to be introduced to Europe because of data privacy considerations linked to the General Data Protection Regulation (GDPR).

The social media giant said it hoped it would be able to introduce the full set of tools in the future.

Instagram’s public policy director in Europe, Tara Hopkins, said: “In the EU at the moment, we can only use that mix of sophisticated technology and human review element if a post is reported to us directly by a member of the community.”

She said that because in a small number of cases an assessment would be made by a human reviewer on whether to send additional resources to a user, this could be considered by regulators to be a “mental health assessment” and therefore a part of special category data, which receives greater protection under GDPR.

Ms Hopkins said the company was in discussions with the Irish Data Protection Commission (IDPC) – Facebook’s lead regulator in the EU – and others over the tools and a potential introduction in the future.

“There are ongoing conversations that have been very constructive and there’s a huge amount of sympathy for what we’re trying to achieve and that balancing act of privacy and the safety of our users,” she said.

In a blog post announcing the update, Instagram boss Adam Mosseri said it was an “important step” but that the company wants to do “a lot more”.

He said not having the full capabilities in place in the EU meant it was “harder for us to remove more harmful content, and connect people to local organisations and emergency services”.

He added that the firm was in discussions with regulators and governments about “how best to bring this technology to the EU, while recognising their privacy considerations”.

Facebook and Instagram are among the social media platforms to come under scrutiny for their handling of suicide and self-harm material, particularly its impact on vulnerable users such as young people.

In September, Facebook and its family of apps were among the companies to agree to guidelines published by Samaritans in an effort to set industry standards on how to handle the issue.

Ms Hopkins said Instagram was trying to balance its policies on self-harm content by also “allowing space for admission” by people who have considered self-harm.

“It’s okay to admit that and we want there to be a space on Instagram and Facebook for that admission,” she said.

“We’re told by experts that can help to destigmatise issues around suicide. It’s a balancing act and we’re trying to get to the right spot where we’re able to provide that kind of platform in that space, while also keeping people safe from seeing this kind of content if they’re vulnerable.”

If you or someone you know has been affected by mental health issues you can contact:

Samaritans - 116 123, text 087 2609090 or email jo@samaritans.ie 

Pieta (Suicide & Self-harm) - 1800 247 247 or 01 623 5606 

Aware (Depression, Bi-Polar Disorder & Anxiety) - 1800 80 48 48 

Grow (Mental Health support & Recovery) - 1890 474 474 

Bodywhys (Eating Disorders Associations of Ireland) - 1890 200 444 

Childline (for under 18s) - 1800 66 66 66
