Up for discussion in the Guardian tech newsletter: Facebook and Google will be targets for Ofcom if the government passes the proposed legislation
The new online safety bill places a duty of care on firms such as Facebook and Google to protect users from harmful content – on pain of substantial fines.
You wouldn’t blame Ofcom for feeling daunted. The world, or at least the part of it that wants to clean up the internet, is watching the online safety bill, which the UK communications regulator will have to enforce. Hearings into the draft bill by a joint committee came to an end last week, and if you step back and look at what has come out of those sessions since September, it is clear that Ofcom has a job on its hands.
Quick primer: the bill covers tech firms that allow users to post their own content or to interact with one another. That means big fish such as Facebook, Twitter, Instagram, YouTube and Snapchat must comply, but so must commercial pornography sites such as OnlyFans. Search engines such as Google are also included.
The bill places a duty of care on those companies to protect users from harmful content – on pain of substantial fines levied by Ofcom. The duty of care is split into three parts: preventing the proliferation of illegal content and activity, such as child pornography, terrorist material and hate crimes (eg racial abuse); ensuring children are not exposed to harmful or inappropriate content; and, for the big players such as Facebook, Twitter and YouTube (described as “category 1” services), ensuring that adults are protected from legal but harmful content. That last category of content is to be defined by the culture secretary, after consultation with Ofcom, and then scrutinised by parliament before being enacted in secondary legislation.
The Ofcom chief executive, Dame Melanie Dawes, has warned of being “overwhelmed” by complaints from social media users and of having to deal with the “sheer legal weight” of big tech’s response to the act once it becomes law, which should happen around the end of next year.
The culture secretary, Nadine Dorries, rounded off the hearings with an appearance in which she proposed a number of changes to the legislation. But even the preceding sessions had underlined the complexities and gaps in the bill. It needs to be simpler – yet there is no doubt following Dorries’s appearance that it is going to be bigger.
The committee will publish its report on the bill by 10 December and Dorries has said she will look at the recommendations “very seriously indeed”. Here are some of the changes we can expect, or at least issues that the committee will address in its report, after the hearings.
A permanent joint committee will oversee the act
Dorries said a permanent committee of MPs and peers – modelled on the human rights joint committee – will be set up to conduct “ongoing scrutiny” of the landscape that the act will police and the role of the secretary of state and Ofcom in enforcing the bill. The body could also recommend when the secretary of state should deploy secondary powers under the bill, such as giving guidance on how Ofcom should exercise its powers.
There will be criminal sanctions for users and executives
Dorries is definitely gunning for tech executives, telling the Facebook founder, Mark Zuckerberg, and his communications chief, Nick Clegg, to steer clear of the metaverse and concentrate on the real world. Addressing the wider tech industry, Dorries said: “Remove your harmful algorithms today and you will not be subjected – named individuals – to criminal liability and prosecution.” Prosecution for failing to deal with algorithms that steer users towards harmful content is definitely not in the bill. As it stands, the bill contains a deferred power, exercisable after about two years, to impose criminal sanctions on executives who do not respond to information requests from Ofcom accurately and in a timely manner. Dorries is now talking about bringing in criminal sanctions within three to six months for a much broader offence: allowing platforms to guide users towards harmful content. Does that mean illegal content such as racist abuse, or murkier areas such as legal but harmful material?
For users, three new criminal offences will be introduced: sending messages or posts that “convey a threat of serious harm”; posting misinformation – “false communications” – intended to cause non-trivial emotional, psychological or physical harm; and sending posts or messages intended to cause harm without reasonable excuse.
Online advertising: in the bill or not?
In his appearance before the committee, the founder of MoneySavingExpert.com, Martin Lewis, urged the government to bring advertising within the scope of the bill as an area that should be regulated. “Scam ads are destroying people’s lives. People take their own lives on the back of being scammed, and it should go in the bill.” Ofcom’s Dawes suggested regulating ads alongside the Advertising Standards Authority, and the committee’s chair, the Conservative MP Damian Collins, is exercised about misleading political advertising. But Dorries stated firmly last week that advertising, particularly scam ads, would be too big an addition, saying: “It needs its own bill.” Nonetheless, don’t be surprised if the committee tries to get it in, or at least makes firm recommendations for dealing with advertising, whether in this bill or in separate legislation.
Increased investigatory powers for Ofcom
The information commissioner, Elizabeth Denham (Britain’s data regulator), said in her appearance that, as the bill stands, Ofcom does not have strong enough powers to properly audit tech firms. There was talk during the sessions of giving the regulator access to scrutinise algorithms and demand changes to them. Denham said she was able to “look under the bonnet” of tech firms as part of the age-appropriate design code, which requires websites and apps to take the “best interests” of their child users into account. She said Ofcom’s powers under the bill needed to be “bolstered by audit powers for the regulator to be able to look under the bonnet”.
Currently, the bill requires companies to submit details of how their services might expose users to harmful content – and how they will combat that risk. These risk assessments will inform codes of conduct for the platforms that Ofcom will enforce but the feeling on the committee is that the regulator needs more oomph. Dorries’s strong words about algorithms and criminal liability suggest that she agrees.
Tackling anonymous abuse
The former Manchester United and Leeds footballer Rio Ferdinand spoke scathingly about the failure to deal with anonymous abuse during his appearance in September. A total ban on anonymous social media accounts and posts is not coming, but expect some form of action. A recent opinion poll found that, among people who had suffered online abuse, 72% of that abuse came from anonymous accounts.
Clean up the Internet, a campaign group calling for increased civility and respect online, has called in its submission to the committee for action on anonymous trolls, and for the bill to require platforms to demonstrate to Ofcom that they have systems in place to deal with anonymity. It has urged social media platforms to give users the option of pre-emptively blocking any interaction with anonymous accounts, as well as making the verification status of users clearly visible. The group has also suggested that anonymous users register their identity with platforms, which could retain that information against the account – and reveal it to law enforcement if needed.
Source: The Guardian