Watchdog cracks down on tech firms that fail to protect children

Sites must assess content for sexual abuse and suicide risk or face fines of up to £17m.

Technology companies will be required to assess their sites for sexual abuse risks, prevent self-harm and pro-suicide content, and block children from broadcasting their location, after the publication of new rules for “age-appropriate design” in the sector.

The UK Information Commissioner’s Office, which was tasked with creating regulations to protect children online, will enforce the new rules from autumn 2021, after a one-year transition period. Companies that break the rules will then face sanctions comparable to those under the GDPR, including fines of up to £17m or 4% of global turnover.

Companies whose services are likely to be accessed by children will have to take account of 15 principles designed to ensure those services do not cause harm by default. These include:

  • a requirement to default privacy settings to high, unless there is a compelling reason not to;
  • orders to switch off geolocation by default, and to turn off visible location tracking at the end of every session;
  • a block on using “nudge techniques to lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections”;
  • a requirement on sites to uphold their stated terms, policies and community standards.