TikTok is still slow to respond to reports of sexually explicit messages allegedly sent by predators, according to a new BBC Panorama investigation. (Photo: GC Images)

Research by BBC Panorama has found that TikTok is slow to act against adults who engage children in sexually explicit conversations, even when those conversations are reported by users. The allegations echo similar claims made by a BBC investigation last year, suggesting the social media company is still failing to address concerns that its platform is helping child predators.

TikTok ‘slow’ to respond to predators

Working with a 23-year-old woman who produces TikTok videos for an internet search company, the investigative program created a fake profile for a 14-year-old girl. The 23-year-old’s pictures were edited to make her look younger, and her posts were hashtagged (e.g., “#schoollife”) to suggest she was below the age of consent.

Several older men soon began following the account. One 34-year-old sent her a sexually explicit message even after the ‘girl’ told him she was 14 years old.

The ‘girl’ then reported the user and his comments to TikTok. However, the social media company only took action after BBC Panorama contacted it, around four days after the initial report, and disclosed details of its investigation.

TikTok’s initial response to BBC Panorama was as follows: “A report on a user’s account or comments does not generally trigger a review of their direct messages.”

The social media company also stated that no action was taken as the report “was directed against the account in general and not the specific direct messages.”

After BBC Panorama completed its investigation, TikTok notified the BBC that the two accounts, and the devices used to operate them, had been permanently banned.

TikTok has also told me that it “works continuously to make TikTok a hostile environment for predatory behavior”. A company spokesperson says it is the only platform that turns off direct messaging for under-16s, that it only allows direct messaging between users aged 16 and over if they agree to follow each other, and that sharing pictures and videos via direct message is prohibited regardless of age.

“We are already looking at ways to improve the review of user reports,” the spokesperson added. “There’s no such thing as ‘job done’ when it comes to protecting young people from online harm. That’s why we work with industry experts, NGOs and online safety specialists, and invest in our technology, processes and people to continuously improve TikTok’s safety.”

Remote moderation

In the episode of BBC Panorama, which airs in the UK on BBC One, investigators also speak to a former content moderator who worked in TikTok’s London office. His job was to ensure that users complied with TikTok’s Terms of Use and Community Guidelines.

He says that during his time with the company, TikTok’s Chinese headquarters made the important content-moderation decisions, leaving him and his colleagues largely powerless to tackle issues such as sexual predators.

He told BBC Panorama: “It seemed like not much was being done to keep people safe. Beijing was reluctant to suspend accounts or take action against users who were abusive. They’d pretty much always get a temporary ban of some sort, like a week or so.”

He also says that he and the moderators on his team were unable to freeze accounts themselves, and had to ask TikTok’s Beijing office for permission to permanently suspend profiles.

In March, TikTok announced that it would no longer use Beijing-based moderators. The company told BBC Panorama that it is “investing heavily in automated moderation”, and that an “ever-growing team of experts” of more than 10,000 moderators across 20 countries reviews content and accounts and takes action against those that violate its policies.

Feedback loop

The former content moderator also told BBC Panorama that, while he was working at TikTok, the company’s algorithms were effective at serving suggestive content to sexual predators.

“The algorithm will feed you what you interact with. So if you see a lot of children dancing sexually and you interact with that, you will be shown more children dancing sexually,” he said.

To test this claim, BBC Panorama set up another fictional account, this time for a 36-year-old man. Whenever videos of young girls in school uniform were presented, ‘the man’ liked them and watched them to the end.

Within half an hour, Panorama reports, the account’s “For You” page was filled with pictures of underage teenagers.
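TikTok’s recommendation system is proprietary and its details are not public, but the dynamic the former moderator describes and Panorama observed can be illustrated with a toy, engagement-weighted recommender. The sketch below is a hypothetical simplification: the categories, weights and update rule are invented for illustration and are not drawn from TikTok’s actual algorithm.

```python
# Hypothetical sketch of an engagement-weighted feedback loop, not TikTok's
# actual (proprietary) recommendation system. It shows how repeatedly liking
# and finishing videos in one category can quickly dominate a "For You"-style feed.
import random
from collections import Counter

CATEGORIES = ["school_uniform", "cooking", "sports", "comedy", "music"]
weights = {c: 1.0 for c in CATEGORIES}  # start with a neutral feed

def recommend():
    """Pick the next video's category in proportion to the current weights."""
    total = sum(weights.values())
    return random.choices(CATEGORIES,
                          weights=[weights[c] / total for c in CATEGORIES])[0]

def record_engagement(category, liked, watch_fraction):
    """Boost a category when the viewer likes a clip or watches it to the end."""
    if liked:
        weights[category] += 1.0
    weights[category] += watch_fraction  # partial views still count a little

# Simulate a viewer who only engages with one category, as in Panorama's test.
feed_log = Counter()
for _ in range(200):
    cat = recommend()
    feed_log[cat] += 1
    engaged = (cat == "school_uniform")
    record_engagement(cat, liked=engaged, watch_fraction=1.0 if engaged else 0.2)

print(feed_log.most_common())  # the targeted category rapidly crowds out the rest
```

Run repeatedly, the simulated feed converges on the one category the viewer engages with, mirroring what Panorama saw within half an hour of use.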

TikTok states that its Community Guidelines make it clear that the sexualization of minors is not allowed on the platform, and that sexualized content is not eligible to appear in the For You feed. “We use a combination of technology and moderation teams to identify and remove content that violates our guidelines,” the spokesperson added.

It’s also worth noting that TikTok isn’t exactly unique in this regard. Similar complaints can be made about other social media platforms, with YouTube being another network accused of feeding potentially provocative content to potential predators. The problem is very difficult to solve because, even when platforms remove tags from videos, their algorithms still recommend videos that are frequently watched together.

If social media companies were more proactive in responding to complaints and removing potentially problematic accounts (or content), however, the problem could be significantly reduced. And with social media usage rising rapidly in the wake of the coronavirus pandemic, users, and their parents, must also remain vigilant.

This article has been updated to include a comment from TikTok.