Amnesty International claims Tiktok's business model 'poses danger' to children

Amnesty's research suggests that the social media platform Tiktok is recommending videos to young users, including content focused on mental health struggles or romanticising or normalising suicide. Picture: iStock

The business model of Tiktok is “inherently abusive” and “poses a danger” to children using the platform, a damning new report has said.

The research from Amnesty International, published today, suggests that the China-owned social media site’s ‘For You’ feed risks pushing young people towards harmful mental health content.

Between three and 20 minutes into Amnesty’s manual research, more than half of the videos in the feed were related to mental health struggles, and within the space of an hour multiple recommended videos romanticised, normalised, or encouraged suicide, according to the findings.

Further research, using automated accounts, showed that after five to six hours on the platform, almost one in two videos were mental-health related and potentially harmful.

Amnesty International researcher Lisa Dittmer warns that Tiktok's 'addictive design practices' are 'designed to keep users engaged for as long as possible'. Picture: Mastodon

“The findings expose Tiktok’s manipulative and addictive design practices, which are designed to keep users engaged for as long as possible,” said researcher Lisa Dittmer.

Amnesty said the platform’s “invasive” data collection poses a danger to young users, and that Tiktok applies protections unequally to users around the world.

'Rabbit holes' of harmful content

The research, conducted with the US-based Algorithmic Transparency Institute and AI Forensics, found that children and young people who watched mental health-related content on the ‘For You’ page were drawn into “rabbit holes” of potentially harmful content.

The ‘For You’ feed’s algorithm serves users new content and is highly personalised, based on the content they have already viewed.

Amnesty said that focus group discussions, interviews, and simulations of children’s Tiktok accounts in Kenya, the Philippines, and the US found it was “addictive by design”. One participant in the Philippines said: 

When I watch a sad video that I could relate to, suddenly my whole ‘For You’ page is sad and I’m in ‘Sadtok’. It affects how I’m feeling. 

Amnesty said it is designed to keep users hooked on the platform to collect “ever more data about them”.

“Tiktok then uses this data to create profiles of users and draw inferences about them, which allows it to cluster users in groups to target them with highly personalized content to keep them engaged,” it said.

“These groups and categories are also made available to advertisers so that they can target users with personalised ads.”

In response to Amnesty’s findings, Tiktok pointed to its community guidelines, which set out content banned on its platform, including anything showing or promoting suicide or self-harm. It also said it is developing a “company-wide human rights due diligence process”.

In September, Tánaiste Micheál Martin said the primary school curriculum must be overhauled to include education on the dangers of Tiktok and similar platforms.

He said the findings of a probe from Ireland’s data watchdog into Tiktok’s approach to privacy settings for children were “very serious” and “concerning”.

The Government has taken steps to ban Tiktok on all Government devices, following the example of other countries worldwide.
