Kim Leadbeater – 2022 Speech on the Online Safety Bill
The speech made by Kim Leadbeater, the Labour MP for Batley and Spen, in the House of Commons on 5 December 2022.
I apologise for having left the debate for a short time; I had committed to speaking to a room full of young people about the importance of political education, which felt like the right thing to do, given the nature of the debate and the impact that the Bill will have on our young people.
I am extremely relieved that we are continuing to debate the Bill, despite the considerable delays that we have seen; as I mentioned in this House previously, it is long overdue. I acknowledge that it is still groundbreaking in its scope and extremely important, but we must now ensure that it works, particularly for children and vulnerable adults, and that it goes some way to cleaning up the internet for everyone by putting users first and holding platforms to account.
On new clause 53, I put on record my thanks to the Government for following through with their commitments to me in Committee to write Zach’s law in full into the Bill. My constituent Zach Eagling and his mum Clare came into Parliament a few weeks ago, and I know that hon. Members from both sides of the House were pleased to meet him to thank him for his incredible campaign to make the vile practice of epilepsy trolling completely illegal, with a maximum penalty of a five-year prison sentence. The inspirational Zach, his mum and the Epilepsy Society deserve enormous praise and credit for their incredible campaign, which will now protect the 600,000 people living with epilepsy in the UK. I am delighted to report that Zach and his mum have texted me to thank all hon. Members for their work on that.
I will raise three areas of particular concern with the parts of the Bill that we are focusing on. First, on director liability, the Bill includes stiff financial penalties for platforms, which I hope will force them to comply with these regulations, but until the directors of these companies are liable and accountable for ensuring that their platforms comply and treat the subject with the seriousness it requires, I do not believe that we will see the action needed to protect children and all internet users.
Ultimately, if platforms enforce their own terms and conditions, remove illegal content and comply with the legal but harmful regulations—as they consistently tell us that they will—they have nothing to worry about. When we hear the stories of harm committed online, however, and when we hear from the victims and their families about the devastation that it causes, we must be absolutely watertight in ensuring that those who manage and operate the platforms take every possible step to protect every user on their platform.
We must ensure that, to the directors of those companies, this is a personal commitment as part of their role and responsibility. As we saw with health and safety regulations, direct liability is the most effective way to ensure that companies implement such measures and are scrupulous in reviewing them. That is why I support new clause 17 and thank my right hon. Friend the Member for Barking (Dame Margaret Hodge) for her tireless and invaluable work on this subject.
Let me turn to media literacy, a subject that I raised repeatedly in Committee. I am deeply disappointed that the Government have removed the media literacy duty that they previously committed to introducing. Platforms can boast of all the safety tools they have to protect users, talk about them in meetings, publicise them in press releases and defend them during Committee hearings, but unless users know that those tools exist, know exactly how to use them and actually use them, their existence is pointless.
Ofcom recently found that more than a third of children aged eight to 17 said they had seen something “worrying or nasty” online in the past 12 months, but only a third of children knew how to use online reporting or flagging functions. Among adults, a third of internet users were unaware of the potential for inaccurate or biased information online, and just over a third made no appropriate checks before registering their personal details online. Clearly, far more needs to be done to ensure that internet users of all ages are aware of online dangers and of the tools available to keep them safe.
Although programmes such as Google’s “Be Internet Legends” assemblies are a great resource in schools (I was pleased to visit one at Park Road Junior Infant and Nursery School in Batley recently), we cannot rely on platforms to do this work themselves. We have had public information campaigns on the importance of wearing seatbelts and on the dangers of drink-driving and smoking, and the digital world is now one of the greatest dangers most people face in their daily lives. The public sector clearly has a role in warning of those dangers and promoting healthy digital habits.
Let me give one example from the territory of legal but harmful content, which Members have described as opaque, challenging and thorny. I agree with all those comments, but if platforms have a tool that switches off legal but harmful content, it strikes me as incredibly important that users know what that tool does; that is, they know what content they may be exposed to when such material is switched on, and they know exactly how to switch it off. Yet I have heard nothing from the Government since their announcement last week to suggest that they will take steps to ensure that this tool is easily accessible to users of all ages and digital abilities, and that is exactly why there is a need for a proper digital media literacy strategy.
I therefore support new clauses 29 and 30, tabled by my colleagues in the SNP, which would empower Ofcom to publish a strategy at least every three years that sets out the measures it is taking to promote media literacy among the public, including through educational initiatives and by ensuring that platforms take the steps needed to make their users aware of online safety tools.
Finally, I turn to the categorisation of platforms under part 7 of the Bill. I feel extremely strongly about this subject and agree with many of the comments made by the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright). The categorisation system set out in the Bill is not fit for purpose. I appreciate that categorisation is largely covered in part 3 and schedule 10, but amendment 159, which we will be discussing in Committee, and new clause 1, which we are discussing today, are important steps towards addressing the Government’s implausible position that the size of a platform equates to the level of risk it poses. As a number of witnesses stated in Committee, that is simply not the case.
It is completely irresponsible and narrow-minded to believe that there are no blind spots in which small, high-risk platforms can fester. I speak in particular about platforms hosting dangerous, extremist content, be it Islamist, right-wing, incel or any other. Such platforms, which may fall outside the scope of the Bill, will be allowed to continue to host extremist individuals and organisations, and their deeply dangerous material. I hope the Government will urgently reconsider that approach, as it risks inadvertently pushing people, including young people, towards greater harm online, whether to individuals or to society as a whole.
Although I am pleased that the Bill is back before us today, I am disappointed that aspects of it have been weakened since we last considered it. I urge the Government to consider closely the proposals we will vote on this evening, which would go a considerable way towards ensuring that the online world is a safer place for children and adults, works in the interests of users, and holds platforms accountable and responsible for protecting us all online.