Online harms and disinformation
The Online Safety Bill is a lengthy and complex piece of legislation, in part because it has to fill a gap in which no effective regulation currently exists, but also because the digital world cuts across huge areas of life. It is a wide-ranging attempt to address a variety of harms, to both individuals and society more broadly, arising from social media companies.
The Bishop of Oxford spoke in the second reading of the Bill in the House of Lords on Wednesday 1 February 2023.
My Lords, as a member of your Lordships’ Committee on Artificial Intelligence and a founding member of the Centre for Data Ethics and Innovation, I have followed the sometimes frustratingly slow progress of this Bill since the original White Paper. We have seen increasing evidence that many social media platforms are unwilling to acknowledge, let alone prevent, harms of the kind this vital Bill finally addresses. We know there is an all-too-porous frontier between the virtual world and the physical world. The resulting harms, as we have heard, damage real lives, real families and real children.
There is a growing list of priority harms, and now concern, as well as excitement, over new AIs like ChatGPT. They demonstrate, yet again, that technology has no inherent precautionary principle. Without systemic checks and balances, AI in every field develops faster than society can respond. We are, and forever will be, catching up with technology.
This Bill is very welcome, marking as it does a belated but important step towards rebalancing a complex but vital aspect of public life. I pay tribute to the government, and to civil servants, for their patient efforts to address a complex set of ethical and practical issues in a proportionate and broadly systemic way.
But the job is not yet done. I will concentrate on three particular areas of concern with the draft Bill:
Harms to adults
First, removal of the risk assessment regarding harms to adults is concerning. Surely, my Lords, every company has a basic moral duty to assess their product’s or service’s risk to customers. Removal can only undermine a risk-based approach to regulation. Can the Minister explain how conducting a risk assessment erodes or threatens freedom of speech?
Secretary of State’s powers
My second concern is the Secretary of State’s powers in relation to OFCOM. This country has a strong record of independence for its media regulators. Given legitimate concerns about protecting freedom of speech, clause 39 gives the Secretary of State uniquely sweeping powers to direct OFCOM on public policy unilaterally. Such powers undermine OFCOM’s independence whether or not they are exercised.
During Committee, I shall also want to engage with questions about how OFCOM’s proposed powers dovetail with those of other regulators, so that no harms fall through the net.
The third area of concern I wish to raise is this Bill’s provisions, or rather lack of provision, on disinformation of various kinds. I also have the privilege of serving on your Lordships’ Climate Scrutiny Committee. Climate disinformation and medical disinformation both inflict substantial harms on society and must be included in the user empowerment tools.
Other noble prelates will raise their own concerns at the forthcoming Committee. My friend the Right Reverend Prelate, the Bishop of Gloucester, believes:
“It is imperative that we prevent technology-facilitated domestic abuse, as well as bring in a code of practice to keep women and girls safe online. To help young people flourish, we should look at controlling algorithmically served content and restrictions on face- and body-editing apps, as well as improving media literacy overall.” The Right Reverend Prelate is unable to speak today but will follow these issues closely.
My Lords, this Bill is vital for the health of children and adults and the flourishing of society as a whole. I look forward to the progress being made in this House.