
Fines to be part of regulating social media, says Ofcom


Ofcom will not hesitate to impose fines on social media firms that fail to deal with harmful content, its new boss has said.


The watchdog would also consider temporarily suspending platforms in extreme cases of harm.

Chief executive Dame Melanie Dawes set out the powers Ofcom would use if, as expected, it is appointed as regulator. She was responding to questions from MPs on the Digital, Culture, Media and Sport Committee.

The government has not yet announced which body will be given the role of overseeing content on Facebook, Twitter, Snapchat and Google, although Ofcom is the most likely candidate.

The new regulator will have powers to impose penalties for failures to deal with a range of harmful content, including bullying, child abuse, terrorism and fake news. "Fines need to be part of the regime. These are extremely large companies with significant financial muscle," she told MPs.

France recently passed a law that requires social media platforms to remove hate speech and illegal content within one hour or face heavy fines.

New skills

Dame Melanie acknowledged that regulating Facebook, Twitter and Google would be a "big challenge" but promised that if Ofcom was appointed to the role, it would "shine a light" on the firms and "hold them to account".

She said it would require the recruitment of data analysts, and revealed that "someone from Google" had already been appointed.

But she acknowledged that finding the right skilled people would be a challenge, and it would be hard to persuade people to move from well-paid private sector jobs to a public sector role.

The DCMS committee had previously heard from social media companies on how they were dealing with online harm, and expressed frustration at the actions they had taken to deal with misinformation around Covid-19.

Dame Melanie was kinder in her assessment, borrowing an analogy from virologists when she said that the "R number" for viral misinformation had been kept down by social media firms with a range of new measures.

These included Facebook's decision to limit the amount of content that could be forwarded via WhatsApp, and Twitter's decision to limit the sharing of articles by users who had not read them.

Super-complaints

Several MPs wanted to know how Ofcom would keep children safe online, and in particular whether the regulator would monitor images of self-harm.

The case of 14-year-old Molly Russell, who killed herself after viewing self-harm images on Instagram, put the issue into sharp relief and helped convince the government that it needed to appoint a social media regulator.

Dame Melanie did not rule out the possibility of legal action against social media firms, including so-called super-complaints potentially launched with the help of charities such as the NSPCC.

She was also asked whether the watchdog would look at limiting the amount of time children were online, but said it would be difficult to provide a definitive figure on how much time was too much.

She also refused to be drawn on the issue of foreign interference on social networks, but she did say that anonymous accounts on Twitter, some of which could be Russian bots, were an area that needed to be examined.

Dame Melanie, who was appointed to her post three weeks before the UK went into lockdown, said that the pandemic had thrown up three major challenges for the regulator:

  • keeping mobile networks working as more people used online services
  • supporting vulnerable customers
  • the financial and commercial impact of the pandemic on public broadcasters


https://www.bbc.co.uk/news/technology-53149267