YouTube Bans Malicious Insults And "Veiled Threats"
YouTube will no longer allow videos that "maliciously insult someone" based on "protected attributes" such as race, gender identity or sexuality.
The video-sharing platform will also ban "implied threats of violence" as part of its new harassment policy.
A row erupted in June after a prominent video-maker said he had been the target of abuse by another YouTube star. At the time, YouTube said its rules had not been broken. But it has now deleted many of the videos in question.
"Even if a single video doesn't cross the line, with our new harassment policy we can take a pattern of behaviour into account for enforcement," Neal Mohan, chief product officer at YouTube, told the BBC.
What does the updated policy say?
In addition to malicious insults, the new policy bans veiled or implied threats of violence.
YouTube said the new policy would apply to "everyone", including politicians and popular YouTube stars, as well as the general public.
Video-makers who consistently break the rules will have their ability to earn advertising revenue restricted, and may have videos deleted or their channel closed.
The company said there would be some exemptions from the new policy, including insults used in "scripted satire, stand-up comedy, or music".
Mr Mohan told the BBC that individual complaints would have to be judged on a case-by-case basis, with the context of each video being taken into account.
YouTube said it had consulted think tanks, video-makers, Google employees and other third parties to help inform the policy.