Online Safety Bill: What to expect

Image caption: Protecting children from online harm is one of the key pillars of the legislation (Getty Images)

A report from a joint committee of MPs and Lords recommending what should be included in the Online Safety Bill is due in the next few days.

The landmark legislation is one of the first attempts to lay down in law a set of rules about how online platforms should deal with content.

Its remit is huge - too wide for some. But others say it doesn't go far enough.

What is the bill designed to do?

The three things the bill sets out to do are:

  • prevent the spread of illegal content and activity such as images of child abuse, terrorist material and hate crimes, including racist abuse
  • protect children from harmful material
  • protect adults from legal - but harmful - content

There are an awful lot of challenges: should pornography sites use age verification? How should abuse from anonymous accounts be dealt with? And how can strong safeguards for free speech be maintained?

The legislation largely puts the onus on the tech giants to figure out the how, and empowers a regulator - Ofcom - to police whether they do a good enough job.

Firms that fail to comply with the new rules could face fines of up to £18m, or 10% of their annual global turnover, whichever is higher.
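To illustrate the scale of that penalty, here is a minimal sketch in Python of the "whichever is higher" rule described above; the turnover figure in the example is hypothetical, not drawn from the bill.

```python
# Sketch of the fine cap described above: the greater of a fixed £18m
# or 10% of annual global turnover. The turnover figure used below is
# purely hypothetical.
FIXED_CAP_GBP = 18_000_000
TURNOVER_SHARE = 0.10

def maximum_fine(annual_global_turnover_gbp: float) -> float:
    """Return the higher of the fixed £18m cap or 10% of turnover."""
    return max(FIXED_CAP_GBP, TURNOVER_SHARE * annual_global_turnover_gbp)

# Example: a firm with £50bn in global turnover could face up to £5bn.
print(f"£{maximum_fine(50_000_000_000):,.0f}")  # £5,000,000,000
```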

Where is the legislation currently?

Image caption: New Culture Secretary Nadine Dorries is determined to be tough on the tech giants (Getty Images)

Dealing with online content is something the government has been grappling with for years - originally under the name "Online Harms".

The renamed Online Safety Bill is still a draft, and has been scrutinised over several months by the joint parliamentary committee, which is due to lay out its recommendations any day now.

Then it will go back to the Department for Digital, Culture, Media and Sport (DCMS) for a final decision on what to include, with a view to bringing it before Parliament around March 2022.

Since its original draft, the DCMS has undergone big changes itself, with a new secretary of state - Nadine Dorries.

She appears to want to beef the bill up considerably - potentially reinstating the option for technology executives to face prison if they fail to remove what she called "harmful algorithms".

And she does not seem keen to give them the current two-year grace period to get their houses in order - telling the joint committee that six months is a better timeline.

Politicians have increasingly been targeted by trolls, and Ms Dorries has had personal experience of that - so it may be a priority for her.

Who will oversee it?

Image caption: Dame Melanie Dawes is ready for the job, she told delegates at Web Summit recently (Getty Images)

Ofcom is set to get a lot more power. It will be in charge of monitoring whether the tech giants are doing enough to prevent the spread of illegal content, such as images of child abuse, terrorist material and racist abuse.

But it will also have to make sure they are doing enough to limit "legal but harmful" content - a tricky category that no-one has entirely nailed down.

Ofcom chief executive Dame Melanie Dawes told the joint committee that the regulator was ready for its challenging new role, but warned that her team may be overwhelmed by complaints when the legislation is introduced.

She also conceded that Facebook, Google and Twitter would deploy teams of lawyers to fight it.

Despite those misgivings, Ofcom is putting a brave face on the challenges ahead. It told the BBC it is gearing up for the task by "acquiring new technology and data skills".

Tech firms, it said, would need to be "much more open with us, and their users" about how they plan to protect them from harmful content, and how they intend to quickly remove illegal posts.

What do the tech giants say?

Image caption: Jack Dorsey is no longer in charge at Twitter (Getty Images)

The big social media giants have undergone some massive changes since the legislation was announced: Facebook has decided to pursue the "metaverse" and changed its name, while Twitter has parted ways with founder Jack Dorsey.

But their position on the bill, so far, remains unchanged.

They are cautiously welcoming - but keen to reiterate the measures they have already put in place to protect users, and to stress how much money they have spent on platform safety.

Monika Bickert, vice-president of content policy at Meta, has said that Facebook wants governments to set new rules because "businesses like ours should not be making these decisions on our own".

In an opinion piece for the Telegraph, she acknowledged that the UK is leading the way: "While we won't agree with all the details, we're pleased the Online Safety Bill is moving forward."

What changes could be made?

Both Martin Lewis of MoneySavingExpert.com and the consumer group Which? have campaigned for the bill to cover online scam adverts, which they say cause huge financial and emotional harm to victims, and which they argue tech firms should be made to crack down on.

It seems unlikely that will find its way into the legislation - Nadine Dorries said recently that it needed its own bill.

The NSPCC, which has been one of the most vocal critics of the bill in its current form, has signed an open letter to Nadine Dorries, asking her to put children "at the heart" of the bill.

It has laid out a five-point plan to strengthen the legislation:

  • disrupt grooming pathways
  • tackle how offenders use social media to organise abuse
  • have a named manager responsible for children's safety
  • give more powers to combat abuse in private messaging
  • set up a statutory body to represent the interests of children

What are the other criticisms?

Image caption: Some believe that the bill damages free speech (Getty Images)

Index on Censorship and the Open Rights Group have warned that the notion of defining speech as "harmful" is dangerous in itself, and that outsourcing decisions about when to remove it to Silicon Valley firms is equally fraught.

Those words were echoed by lawyer Yair Cohen, from Cohen Davis Solicitors, who told the BBC: "Tech companies have now officially become the equivalent of owners of a potential crime scene who are also in charge of investigating the crime on their venue whilst also acting as both judges and executioners."

He said that Parliament was being "extremely lazy" in delegating those decisions to tech firms.

And he added that the simplest way for the legislation to make online abuse an unacceptable form of anti-social behaviour was for victims to be able to "unmask their abusers".

"Knowing that their identity could be easily unravelled would deter 90% of online abusers, most of whom are otherwise normative individuals," he said.