Instagram has pledged to remove images, drawings and even cartoons showing methods of self-harm or suicide.
The move is its latest response to the public outcry over the death of British teenager Molly Russell.
The 14-year-old killed herself in 2017 after viewing graphic content on the platform.
Molly's father has described the Facebook-owned app's commitment as "sincere" but said managers needed to act more swiftly.
Instagram's latest promise covers explicit drawings, cartoons and memes about suicide, in addition to any other method "promoting" self-harm.
It extends measures announced in February, which banned graphic images of self-harm and restricted those with suicidal themes. This included both stills and videos.
Instagram has been under pressure to act since Mr Russell said he believed the US-based service had been partly responsible for his daughter's death. After she died, Mr Russell found large amounts of graphic material about self-harm and suicide on her Instagram account. He also found similar content on her Pinterest account.
The 56-year-old went public in January this year.
The UK government, charities and the media were among those who subsequently called on Instagram and other technology companies to make changes.
'Lack of responsibility'
Instagram's latest announcement coincided with a visit by Mr Russell to Silicon Valley. During his visit, Florida-based internet safety campaigner and paediatrician Dr Free Hess showed him content still available on Instagram. It included graphic photographs, videos of self-harm and cartoons advocating suicide.
She said hashtags had helped lead young people to the content.
Instagram says it has doubled the amount of self-harm and suicide-related material it removes since the first quarter of 2019.
Between April and June this year, it said, it had removed 834,000 pieces of content, 77% of which had not been reported by users.
If you've been affected by self-harm, eating disorders or emotional distress, help and support is available via the BBC Action Line.