Maria Miller – 2021 Speech on Digital Image Abuse
The speech made by Maria Miller, the Conservative MP for Basingstoke, in the House of Commons on 2 December 2021.
It is a great pleasure to speak in this Adjournment debate. There can be few things more harmful, traumatising or abusive than for an individual to have a nude or sexually explicit image shared without their consent with thousands or even millions of people online. It is a horrific invention of the online world and an act of sexual abuse because it is done without the consent of the victim.
Technology is being used every day to invent new and even more grotesque ways of inflicting abuse, particularly sexual violence, especially against women and girls. I have secured this debate on deepfake and nudification image abuse because they are yet more forms of abuse against women and girls, their impact is not understood, and they continue to be largely unrecognised, especially in law and the legal sanctions that are available. It is a great pleasure to see the Parliamentary Under-Secretary of State for Justice on the Front Bench, underlining the Government’s understanding of the need to address this issue.
For those who are unfamiliar with the term “deepfakes”, they are pornographic images that are created by merging existing pornographic content with the image of an individual—usually a woman—who has not given her consent. The resulting pornographic material is often violent, including illegal depictions of rape. In a similar way, nudification software takes everyday images—again, usually of women without their consent—and uses an extensive database of pornographic images to create a new image that makes it appear as though the original subject of the photo is nude.
The decision to create and share a deepfake or a nudified image is a highly sinister, predatory and sexualised act undertaken without the consent of the person involved. It has been a growing problem for the past 10 years, along with other forms of intimate image abuse. Reports of such abuse have grown by almost 90% in the past 12 months, coinciding—not coincidentally—with the lockdown, the pandemic and the changes in behaviour that are leaving many people at home for longer periods.
All forms of intimate image abuse have a significant and long-term impact on their victims, but I believe that deepfakes and nudification are particularly pernicious because the images are almost completely fabricated, causing psychological harm, anxiety, depression, post-traumatic stress disorder—the list goes on. Some people experience an impact on their physical health or damage to their relationships. There may also be damage to their financial situation because of the need to take time off work or perhaps to withdraw altogether from the online world, which we all know is a fundamental part of most people’s jobs in modern society. In some cases, there have been reports of self-harm and even attempted suicide among those who have been victims of this heinous act.
I would like to turn specifically to the impact on individuals. This horrific abuse can happen to absolutely anyone, as a constituent of the hon. Member for Sheffield Central (Paul Blomfield) discovered in 2019, when she learned that her image had been uploaded to a pornographic website: an ordinary image from her social media that had been manipulated with software to make it appear that she was featured in pornographic material. She was only alerted to the existence of the photos by an acquaintance, after the images had been in circulation for years. The original images were taken from her social media, including photographs from her pregnancy.
I commend the hon. Member’s constituent, because she has had the courage to speak out about something that many victims feel unable to speak about. We can understand her experience much more clearly when we hear her own words explaining how she felt. She said that the images were “chilling” and that she still experiences nightmares. Speaking of the experience, she said:
“Obviously, the underlying feeling was shock and I initially felt quite ashamed, as if I’d done something wrong. That was quite a difficult thing to overcome. And then for a while I got incredibly anxious about even leaving the house.”
That reaction is typical; it leaves many women frightened to seek the help that they need.
Another victim—I will call her Alana, although that is not her real name—was identified by Professor Clare McGlynn in her work on “Shattering Lives and Myths”, a report on the issue. Alana also had faked intimate images widely circulated without her consent. Her testimony is equally harrowing; I will quote from it, because her words are powerful and the Minister needs to hear them if he is to bring the right solutions to this place. She said:
“It has the power to ruin your life, and is an absolute nightmare, and it is such a level of violation…because you are violated not only by the perpetrator but you also feel violated because society doesn’t recognise your harm. It is a very isolating experience, it is a very degrading and demeaning experience, it is life-ruining.”
Those are words that we should all listen to as we move forward, hopefully, to some solutions.
At the moment, deepfakes are not against the law, and people who use nudification software are not recognised as sexually abusing others. Deepfakes have been a shocking development in violence against women online. Let us be clear: this technology is almost exclusively used to inflict violence against women. Indeed, the cyber research firm Sensity found that 96% of all deepfakes are pornographic and that all the pornographic deepfakes it detected—100%—targeted women.
Offline, non-consensual sexual acts are recognised in the criminal law through the crimes of sexual assault, sexual abuse and rape; the list goes on. Yet those responsible for developing and using technology in the online world and through artificial intelligence have been allowed to operate perniciously and with impunity, inflicting online sexual attacks on women and girls without criminal consequences. We cannot allow the online world to become a continuation of the offline world, in which women and girls experience yet further new forms of sexual abuse and violence. That is why we need a new law to criminalise the taking, making and sharing of nude and sexual images without consent, including deepfakes and nudification. Those, surely, are some of the worst forms of such activity.
This technology is no longer the preserve of movie CGI experts. Image manipulation can be incredibly technical, but nowadays creating content of this kind is dangerously easy. With the development of nudification apps that can be downloaded on to a phone, anyone can create an indecent image of somebody without their consent in seconds. Apps and websites like these are not hidden away, undiscovered, in the recesses of the dark net; they receive millions of visitors. In the first seven months of 2021, one nudifying app received a staggering 38 million hits. This service has an interesting slogan: it is to
“make all men’s dreams come true”.
I am sure that is not the case, because I know that many of my hon. Friends would find this as abhorrent as I do. The app allows users to undress thousands of women without their consent, and runs an “incentive program” for users who share the links to their deepfakes, so users who get clicks on their deepfakes can nudify more pictures faster. It is disgusting, but it is not against the law. We have to act.
Deepfakes are widely regarded by academics as the future of violence against women online, but the existing law is woefully behind and largely redundant. Section 33 of the Criminal Justice and Courts Act 2015, the so-called revenge porn legislation, in whose drafting I was involved, was a good step in the right direction, but it specifically excludes altered or photoshopped images and videos. There are further shortfalls in the current law because it does not adequately capture all motivations for non-consensually taking or sharing an intimate image. Although motivations such as sexual gratification and causing distress are covered, if the image being nudified was not originally private or intimate in nature, and if it was not shared directly with the individual in the photograph, law enforcement agencies can interpret the sharing as lacking any intention to harass or cause distress, even if the image is shared with thousands of people on the internet. That is clearly an absurdity that needs to be changed.
Threats to share images have now been included in the Domestic Abuse Act 2021, and the Government are to be applauded for making that change, but if no threat to share is made, there remains the potentially ridiculous scenario in which an image could be shared legally if the motivation for sharing it was a joke, because such motives are not recognised in the current law.
I hope I have explained why it is so critical for the Online Safety Bill to effectively mitigate violence against women and girls online by introducing new criminal offences—and I would say that they should be sex offences—of the taking, making and sharing of intimate images without consent. I know that the Online Safety Bill is very popular—we heard about that in the previous debate—but perhaps the Government should be thinking of a set of Bills to be introduced together, rather than trying to put everything into one Bill. A suite of Bills could tackle all the different issues, avoiding the risk of making one Bill so expansive that it becomes what is commonly known as a Christmas tree Bill. That is an innovative approach, which I am surprised the Government do not take more often when dealing with highly complex areas that are interrelated.
In the tackling violence against women and girls strategy, the Government have committed to rooting out offending online as well as offline. They cite the forthcoming Online Safety Bill as the instrument of their efforts to do this, but reform of the laws on intimate image abuse is not yet included in the Bill. This oversight needs to be addressed before the Bill comes back to this House for debate, which we hope will be in the very near future. The current law is not fit for purpose. It is a patchwork of different elements, based on defined motivations, that makes prosecutions more difficult and fails to recognise the nature and impact of image-based abuse online. If we have an Online Safety Bill that does not tackle the gaps in the criminal law, it will be a Bill that falls well short of what our constituents need.
The Law Commission has already developed a wide range of recommendations for legal reform in this area that are widely supported by industry stakeholders and experts, so I urge the Government to fast-track those recommendations through the Online Safety Bill, in recognition of the fact that we cannot wait any longer for legal reform. We need deepfakes and the use of nudification apps to be outlawed through a comprehensive new law that criminalises the making, taking and sharing of intimate sexual images without consent. This change is long overdue, and I know that this Minister understands that point. I look forward to hearing his response to the debate.