YouTube Child Protection Reporting Tool 'Failing', Say Volunteer Moderators
Trusted Flaggers on the site say there could be up to 100,000 predatory accounts leaving indecent comments on videos of young people
Part of YouTube's system for reporting sexualised comments left on children's videos has not been functioning correctly for more than a year, say volunteer moderators.
They say there could be up to 100,000 predatory accounts leaving indecent comments on videos. A BBC Trending investigation has discovered a flaw in a tool that enables the public to report abuse.
YouTube says it reviews the "vast majority" of reports within 24 hours.
It says it has no technical problems in its reporting mechanism and that it takes child abuse extremely seriously. On Wednesday, the company announced new measures to protect children on the site.
Users can use an online form to report potentially predatory accounts, and they are then asked to include links to relevant videos and comments. The reports then go to moderators - YouTube employees who review the material and have the power to delete it.
What Kind of Comments Are Being Left
However, sources told Trending that after members of the public submitted information on the form, the associated links might be missing from the report. YouTube employees could see that a particular account had been reported, but had no way of knowing which specific comments were being flagged.
With the help of a small group of Trusted Flaggers, BBC Trending identified 28 comments directed at children that were clearly against the site's guidelines.
Some of these are extremely sexually explicit. Others include the phone numbers of adults, or requests for videos to fulfil sexual fetishes. They were left on YouTube videos posted by young children, and they are exactly the kind of material that should be removed immediately under YouTube's own rules - and in many cases reported to the authorities.
The group of flaggers estimate that there are "between 50,000 to 100,000 active predatory accounts still on the platform".
The children in the videos appeared to be younger than 13 years old, the minimum age for registering an account on YouTube. The videos themselves did not have sexual themes, but showed children emulating their favourite YouTube stars by, for instance, reviewing toys or showing their "outfit of the day".
Over a period of several weeks, five of the comments were deleted, but no action was taken against the remaining 23 until BBC Trending contacted the company and provided a full list. All of the predatory accounts were then deleted within 24 hours.
YouTube has pledged to begin taking an "even more aggressive stance" against predatory behaviour following reports that paedophiles are operating on the site and evading its protection mechanisms.