Criminalizing Nudifying Tools in England and Wales


Introduction

Discussions regarding the ethical use of artificial intelligence (AI) have increased rapidly in recent years. However, some uses of AI have not been considered to their fullest extent, and analysing the regulation of AI through a feminist lens is therefore crucial. One example, and the focus of this article, is the use of nudifying tools, which a British Member of Parliament called to be banned less than a year ago at the time of writing[1]. Nudifying tools, as the term is used here, do not refer to deepfake pornography in which one individual’s face is superimposed onto the body of a porn actor. Rather, they are defined here as AI tools which attempt to create an accurate naked image of an individual from an uploaded full-body, clothed image. While many of these tools are currently inaccurate at best, it is presumed that as AI systems improve, the accuracy of these tools (and therefore their capacity to cause harm) will vastly increase.

The criminal law in England and Wales is currently unequipped to deal with this type of computer misuse. The Sexual Offences Act 2003 makes no provision for the unauthorised use of sexual images (except where the sexual image is that of a child, which falls under section 1 of the Protection of Children Act 1978, as amended by section 45 of the Sexual Offences Act 2003). Before discussing the legislation of nudifying tools, it must be highlighted that the Law Commission has reported that of the 14,091 pornographic deepfakes identified online by Sensity, 100% of the victims were women, underlining the disproportionate victimisation of women. This article aims to highlight the current gaps in the law regarding deepfake pornography, specifically nudifying tools, and to propose a new legislative framework aimed at preventing this unique type of harm.


The Insufficiency of the Draft Online Safety Bill

As mentioned above, the current law does not sufficiently protect individuals (particularly women, who are most often the victims of sexual offences[2]) from online sexual exploitation. Indeed, while the UK’s draft Online Safety Bill does criminalise sending a photograph or film of genitals under clause 156[3], it does not expressly provide for the creation of a nude image without an individual’s consent. In fact, under the proposed law, the use of a nudifying tool would only fall within the criminal law if the artificially generated image were sent to another person with the intention that the recipient ‘will see the genitals and be caused alarm, distress or humiliation’, or if the sender ‘sends or gives such a photograph or film for the purpose of obtaining sexual gratification and is reckless as to whether B [the recipient] will be caused alarm, distress or humiliation.’ It is therefore submitted that the draft Online Safety Bill does not go far enough to protect those vulnerable to image abuse, where a nudified image can be generated easily through the use of AI nudifying tools.

However, positive advances are being made with regard to computer-generated images. Clause 156(5) explicitly includes images which are ‘made by computer graphics’, a phrase which the courts could interpret as encompassing images generated through artificial intelligence. Unfortunately, a Parliamentary Bill which would have explicitly criminalised the production of ‘digitally-altered images or videos in which an individual is depicted pornographically without their consent’ was withdrawn at its second reading[4].

A Consultation Paper published by the Law Commission in 2021 discusses the creation of deepfake pornography, but again focuses only on the potential use of ‘face-swapping’ and does not discuss nudifying tools, which aim to artificially undress the victim. As stated in the Commission’s report, ‘Deepfake porn has been argued to stem from men’s feelings of entitlement to women’s bodies, and is used to silence, shame and degrade women, reducing them to sex objects.’ Nudifying tools are almost certainly used for this same purpose: instead of face-swapping, the perpetrator aims to generate an accurate image of the victim’s naked body, likely for their own sexual gratification.


Criminalising Deepfake Pornography

To best protect women and other potential victims, it is suggested that a criminal offence be created covering the use of artificial intelligence to create pornographic visual content without the subject’s consent. While it may be possible to create a criminal offence sanctioning the creation of the nudifying tools themselves, this may violate human rights law, specifically the right to freedom of expression[5], and may also go too far in preventing innovation and the creation of art. What ought to be considered is whether the potential victim is aware of the image and whether, being aware, they have consented, or would have consented, to its creation. Therefore, the Sexual Offences Act 2003 should be amended to create a new section which reads:

A person (A) who intentionally uses digital means, including artificial intelligence, to generate a pornographic image of another person (B) without their knowledge and consent commits an offence and is liable –

(a) On summary conviction to imprisonment for a term not exceeding 12 months;

(b) On conviction on indictment to imprisonment for a term not exceeding 24 months.


Conclusion

A new offence should be created to criminalise the use of nudifying tools without the victim’s knowledge and consent. The current law in England and Wales is insufficient to protect women from the harms created by this specific type of online abuse. Existing laws may be interpreted by the courts to include the use of nudifying tools, but at the time of writing there is no case law determining this. In creating a new offence, the focus must be on the consent of the victim, so as not to interfere with the right to freedom of expression under the European Convention on Human Rights.

[1] Jane Wakefield, ‘MP Maria Miller wants AI “nudifying” tool banned’ BBC (4 August 2021) <https://www.bbc.co.uk/news/technology-57996910> accessed 24 March 2022.

[2] Office for National Statistics, ‘Sexual offences victim characteristics, England and Wales: year ending March 2020’ (Office for National Statistics 2021) <https://www.ons.gov.uk/peoplepopulationandcommunity/crimeandjustice/articles/sexualoffencesvictimcharacteristicsenglandandwales/march2020#toc> accessed 24 March 2022.

[3] Online Safety HC Bill (2021-22) 609 cl 156.

[4] UK Parliament, ‘Unsolicited Explicit Images and Deepfake Pornography Bill’ (UK Parliament, 18 March 2022) <https://bills.parliament.uk/bills/2921> accessed 24 March 2022.

[5] Convention for the Protection of Human Rights and Fundamental Freedoms (European Convention on Human Rights, as amended) (ECHR) art 10.
