
US government agencies have launched an investigation into TikTok's handling of child sexual abuse material.

The fight against sexual predators has been a persistent challenge for social media platforms. TikTok's young user base is particularly vulnerable.

The US Department of Homeland Security is investigating TikTok's handling of child sexual abuse material.

The Department of Justice is also reviewing how a particular privacy feature of TikTok is being exploited by predators, said a person familiar with the case.

The investigations will shed light on how TikTok copes with the stream of content generated by more than 1 billion users. ByteDance, the platform's Chinese-owned parent company, employs more than 10,000 human moderators worldwide.

Photo by Daily Sabah

Business is booming

Insider Intelligence forecasts that TikTok's advertising revenue will be $11.6 billion this year, triple last year's $3.9 billion.

Mark Zuckerberg, CEO of Meta, blamed the popularity of TikTok among young people for the slowdown in interest in Facebook and Instagram.

However, Meta has more experience dealing with problematic material: it employs around 15,000 moderators worldwide and also uses automated systems to screen posts.

Between 2019 and 2021, the number of child exploitation investigations involving TikTok by homeland security agencies increased sevenfold.

Social media networks use matching technology built on a database of images collected by the National Center for Missing and Exploited Children (NCMEC), the centralised organisation to which companies are legally obliged to report child abuse material.
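The matching described above can be illustrated with a minimal sketch. Real systems use perceptual fingerprints such as PhotoDNA rather than plain cryptographic hashes, and the hash values, function names and clearinghouse feed below are hypothetical; only the lookup pattern (fingerprint the upload, check it against a known-bad set, report on a match) reflects the mechanism the article describes.

```python
import hashlib

# Placeholder for a set of fingerprints supplied by a clearinghouse
# such as NCMEC. The value below is simply sha256(b"test"), used so
# the example is self-contained; it is not a real database entry.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Stand-in for a content fingerprint (real systems use perceptual hashes)."""
    return hashlib.sha256(data).hexdigest()

def should_report(upload: bytes) -> bool:
    """True if the upload matches a known fingerprint and must be reported."""
    return fingerprint(upload) in KNOWN_HASHES
```

A perceptual hash is used in practice because, unlike SHA-256, it still matches after resizing, re-encoding or minor edits to the image.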

Last year, the short-video platform reported nearly 155,000 videos, while Instagram, which also has more than 1 billion users, filed nearly 3.4 million reports. TikTok received no takedown requests from NCMEC last year, unlike rivals Facebook, Instagram and YouTube.

"TikTok has zero tolerance for child sexual abuse material," the company said. "If we detect any attempt to post, obtain or distribute [child sexual abuse material], we will remove the content, ban accounts and devices, report it to NCMEC immediately and contact law enforcement as necessary."

Homeland Security, however, claimed that international companies like TikTok were less motivated to cooperate with US law enforcement. "We want [social media companies] to be proactive in making sure that children are not being exploited and abused on your sites - and I can't say that they are, and I can say that many US companies are," a department official added.

TikTok said it removed 96 percent of content that violated its minor-safety policies before anyone viewed it. Under these guidelines, videos of underage drinking and smoking accounted for most of the removals.

They slip through the filter

One pattern that the Financial Times verified with law enforcement and child safety groups involves content traded through private accounts, whose passwords are then shared with victims and other predators. Keywords planted in public videos, usernames and biographies signal the scheme, while the illegal content itself is uploaded using the app's "Only Me" feature, which makes videos visible only to someone logged into the profile.

Child safety campaigner Seara Adair reported this trend to US law enforcement after she first flagged the content on TikTok and was told that one of the videos did not violate the guidelines. "TikTok keeps talking about the success of its artificial intelligence, but a clearly naked child slips through," Adair said.

"We are deeply committed to the safety and well-being of minors. We build youth safety into our policies, enabling privacy and security settings by default in teen accounts and limiting functionality by age," TikTok added.
