Video-sharing app TikTok is failing to suspend the accounts of people sending sexual messages to teenagers and children, a BBC investigation has found.
Hundreds of sexually explicit comments have been found on videos posted by children as young as nine.
While the company deleted the majority of these comments when they were reported, most users who posted them were able to remain on the platform, despite TikTok's rules against sexual messages directed at children.
TikTok says that child protection is an "industry-wide challenge" and that promoting a "safe and positive app environment" remains the company's top priority.
England's Children's Commissioner Anne Longfield says she will request a meeting with TikTok to discuss the findings of the BBC's investigation.
TikTok says it has more than 500 million monthly active users around the world, but refused to tell the BBC how many of those are in the UK.
Over three months, BBC Trending collected hundreds of sexual comments posted on videos uploaded by teenagers and children.
TikTok's community guidelines forbid users from using "public posts or private messages to harass underage users" and say that if the company becomes "aware of content that sexually exploits, targets, or endangers children" it may "alert law enforcement or report cases".
While the majority of sexual comments were removed within 24 hours of being reported, TikTok still failed to remove a number of messages that were clearly inappropriate for children.
And even though many of the comments themselves were taken down, the vast majority of accounts that had sent sexual messages were still active on the app.