Facebook-owned app promises changes to better shield users from self-harm images.
Instagram will introduce “sensitivity screens” to hide images of self-harm, in an attempt to protect young people who use the site, the app’s head has announced.
Adam Mosseri, who took over Instagram after the app’s founders departed suddenly in 2018, has promised a series of changes following the death of British teen Molly Russell, whose parents believe she took her own life after being exposed to graphic images of self-harm and suicide on Instagram and Pinterest.
The Facebook-owned app already bans posts that promote or encourage suicide or self-harm, Mosseri said, but it faces two challenges: finding those posts so they can be taken down, and ensuring that users can still share imagery related to those topics as a form of self-expression without it amounting to incitement.
That includes “sensitivity screens” for images of self-harm, which blur the image behind them until the user explicitly indicates they want to view the graphic content. The company has also blocked images of cutting from appearing in search results, hashtags and account recommendations. Mosseri said the changes “will make it more difficult for people to see” those images.
Mosseri’s promise comes a week after Facebook was issued with an ultimatum by the health secretary, Matt Hancock: get better at protecting children on Instagram and Facebook’s main app, or face the force of the law.
If you feel emotionally distressed or suicidal please call Samaritans for help on 116 123 or email firstname.lastname@example.org.