Sites must assess content for sexual abuse and suicide risk or face fines of up to £17m.
Technology companies will be required to assess their sites for sexual abuse risks, prevent self-harm and pro-suicide content, and block children from broadcasting their location, after the publication of new rules for “age-appropriate design” in the sector.
The UK Information Commissioner’s Office, which was tasked with creating regulations to protect children online, will enforce the new rules from autumn 2021, after a one-year transition period. From then on, companies that break the rules can face sanctions comparable to those under GDPR, including fines of up to £17m or 4% of global turnover.
Companies whose services are likely to be accessed by a child will have to take account of 15 principles designed to ensure those services do not cause harm by default. These include: