Video-sharing app TikTok has been slammed after failing to stop abusive members sending sexual and threatening messages to children.
The app, which allows people to act out film scenes and lipsync to songs, has become popular with youngsters and has more than 500 million users worldwide.
But a three-month investigation found most accounts that repeatedly sent inappropriate messages to young women and girls were allowed to remain on the app without punishment, despite the company deleting the posts when they were flagged up.
TikTok said user safety was its top priority and pledged to do more to protect children, adding it has a team going through thousands of comments every day.
It comes as Children’s Commissioner Anne Longfield has confirmed she will be seeking a meeting with TikTok to discuss child protection.
Although TikTok has a policy of only accepting users over 13, many children are still able to get around its age gate, with some as young as nine.
The BBC investigation found hundreds of sexual comments on videos uploaded by teenagers and children, many of whom were female.
The messages were reported and the majority were deleted, but some were allowed to remain.
Often the accounts behind the messages use pseudonyms to remain anonymous.
One parent, known only as Chris, told the BBC his 10-year-old son had been using TikTok without his knowledge and began receiving threats to ‘come and get him’.
He said: ‘If [my son] had started engaging in conversation, what could have been next?’
‘It’s disgusting. TikTok’s got a responsibility now and if people are getting on there and seeing messages like this, they should be contacting the police at the very least.’
Teenagers Emily Steers and Lauren Kearns, of Northamptonshire, both 15, run a channel with more than one million followers and say that while they enjoy receiving support from fans, they are also sent sexually themed comments.
Emily’s father Mark told the BBC: ‘It is a bit worrying. When they do catch people saying bad things or sexual things, they should have the power to block them or actually take them away straightaway.’
Ms Longfield said she wanted children to be able to enjoy TikTok but that the company had to ‘take its responsibilities seriously’.
MP Damian Collins, chairman of the House of Commons Digital, Culture, Media and Sport Committee, added TikTok and other online apps needed to have ‘robust age verification’ tools to stop their policies becoming ‘meaningless’.
TikTok was hit with a £4.3 million fine in the US earlier this year for gathering personal information on children aged under 13.
The app has also come under fire in India, where the government has been asked to ban it over concerns it will be used for ‘pornography’.
TikTok said it had more to do to remove and tackle abusive accounts but said it was constantly rolling out new technology to track and remove inappropriate content.
The firm said it had introduced a range of tools for users to keep their accounts private, turn comments off and filter out certain words.
A spokesman for the firm said safety was its ‘top priority’.
The spokesman said: ‘Promoting a safe and positive app environment for our users is our top priority.
‘We are pleased that the BBC investigation recognised that we have managed to remove the majority of the reported comments.
‘These findings reflect the progress of many of our more recent changes to fight against misuse; the work is never “done” on our end.
‘This is an industry-wide challenge, and we are committed to continuously enhancing our existing measures and introducing additional technical and moderation processes in our ongoing commitment to our users.
‘We deploy a combination of policies, technologies, and moderation strategies to detect and review problematic content, accounts, and implement appropriate penalties.
‘For example, we take escalating actions ranging from restricting certain features up to banning account access, based on the frequency and severity of the reported content.
‘In addition, we have multiple proactive approaches that look for potentially problematic behaviour and take action including terminating accounts that violate our Terms of Service.
‘We have a dedicated and growing team of human moderators to manually cross review tens of thousands of videos and accounts, and we constantly roll out internal training and processes to improve moderation accuracy and efficiency.
‘While these protections won’t catch all instances of misuse, we’re committed to improving and enhancing our protective measures, and we use learnings like these to continually hone our moderation efforts.
‘Age-based access is a topic that is important to many platforms, including TikTok.
‘Together with our industry peers, we participate in the conversation with experts and third-party organisations to explore future solutions to address this challenge.’