Chris Philp – 2022 Statement on Online Safety
The statement made by Chris Philp, the Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport, in the House of Commons on 25 February 2022.
The Government recognise the impact that online abuse, including anonymous abuse, has on people and their online experience. Too many people experience online abuse, and protecting users is a priority for this Government.
The Online Safety Bill introduces vital new protections from online abuse. The legislation will require all companies in scope to manage effectively the risk of criminal abuse, including anonymous abuse, on user-to-user services. Companies will need to assess the functionality of anonymous and pseudonymous profiles and the role these play in enabling illegal abuse, and mitigate the risks associated with that functionality.
All services likely to be accessed by children will also have to put in place appropriate measures to protect children from cyber-bullying and other forms of abuse, whether anonymous or not.
Category 1 companies—those which are high risk and high reach—will also have to set out clearly what abusive content they accept on their platform for adults and have effective systems in place to enforce their terms and conditions.
The Government recognise the concerns raised during pre-legislative scrutiny of the Bill by the Joint Committee, alongside the Digital, Culture, Media and Sport Committee, the Petitions Committee and others, regarding the impact of online abuse and the need to give users more control over whom they interact with online, while protecting the right of individuals to remain anonymous if they choose. We thank the committees and campaigners for their scrutiny of the Online Safety Bill.
As a result, I am pleased to announce that we will strengthen the Online Safety Bill by adding two new duties on category 1 companies to provide adults with optional user verification and user empowerment tools.
The user verification duty will require category 1 companies to provide their adult users with an option to verify their identity. Ofcom will set out in guidance how companies can fulfil this new duty and the verification options they could use. In developing this guidance, Ofcom must ensure that the possible verification measures are accessible to vulnerable users, and must consult the Information Commissioner as well as vulnerable adult users and technical experts.
The user empowerment tools duty will require category 1 companies to provide tools that give adults more control over whom they interact with and over the legal content they see. Under the proposed new duty, where category 1 companies allow harmful content, they would have to provide users with tools to control the types of harmful content they see. This could include, for example, content discussing self-harm recovery, which may be tolerated on a category 1 service but which a particular user may not want to see.
In addition to the existing provisions in the Bill, the new duties will help provide robust protections for adults, including vulnerable adults, while protecting freedom of expression online.