UK Member of Parliament Maria Miller is urging fellow lawmakers to agree to ban AI-powered software capable of “undressing” people in photos. Miller wants the forthcoming Online Safety Bill to make the practice illegal.
Services leveraging artificial intelligence to undress women in photos are spreading rapidly on social media. Popular targets include photos of celebrities and Olympic athletes, according to the BBC.
Designed to ‘objectify and humiliate’ women
MP Maria Miller wants a parliamentary debate on whether the technology, part of the ‘deepfake’ family, needs to be banned.
“Parliament needs to have the opportunity to debate whether nude and sexually explicit images generated digitally without consent should be outlawed, and I believe if this were to happen the law would change,” she said.
“At the moment, making, taking or distributing without consent intimate sexual images online or through digital technology falls mostly outside of the law. It should be a sexual offence to distribute sexual images online without consent, reflecting the severity of the impact on people's lives,” Miller said.
The MP believes the technology is specifically “designed to objectify and humiliate women” and should be shut down. Even porn sites should be forced to proactively block the upload of such images, she said, claiming adult sites profit from their mass distribution.
Victims, she said, are “often traumatised and humiliated” by the struggle to have such images removed from the web.
Online Safety Bill
Miller wants the issue to be raised as part of the forthcoming Online Safety Bill, which aims to address a wide range of potentially harmful content, such as online trolling, illegal pornography, underage access to porn, and some forms of internet fraud.
The bill would force online platforms to take action against users uploading both illegal, and legal but harmful, content. Failing to do so would draw fines of up to £18 million, or 10% of their annual turnover, whichever is higher.
Deepfake technology
A nudifying service called DeepNude launched in 2019 and was subsequently withdrawn by its creators, who acknowledged that people would almost certainly misuse it. However, its source code was leaked, prompting others to create clones, some even more powerful than the original.
Today, several websites claim to be able to remove clothes from photographs of people. However misleading the advertising, these claims likely have their basis in the same class of machine-learning frameworks powering deepfake video creation tools: Generative Adversarial Networks (GANs).
A GAN framework trained on photographs can generate new images with many realistic characteristics, making them look authentic to the human eye.
In their original, as-intended form, GANs have typically been used to generate unique, realistic profile photos of people who do not exist. Although not malicious in and of itself, the technology has also been used to automate the creation of fake social media profiles.
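To make the “adversarial” idea concrete, below is a minimal, illustrative sketch of the training loop behind GANs, assuming PyTorch. The toy data, network sizes and hyperparameters are placeholders for illustration only; they are not drawn from the article or from any specific nudifying tool.

```python
# Minimal GAN sketch (PyTorch assumed). A generator learns to produce
# samples the discriminator cannot tell apart from "real" data; here the
# real data is a toy Gaussian standing in for a photo dataset.
import torch
import torch.nn as nn

LATENT_DIM, DATA_DIM = 16, 64  # illustrative sizes, not from the article

generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 128), nn.ReLU(),
    nn.Linear(128, DATA_DIM), nn.Tanh(),
)
discriminator = nn.Sequential(
    nn.Linear(DATA_DIM, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1),  # raw logit: real vs. generated
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(200):
    real = torch.randn(32, DATA_DIM)      # stand-in for real photos
    noise = torch.randn(32, LATENT_DIM)
    fake = generator(noise)

    # Discriminator step: label real samples 1, generated samples 0.
    d_loss = bce(discriminator(real), torch.ones(32, 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(32, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: try to make the discriminator call the fakes "real".
    g_loss = bce(discriminator(fake), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

The two networks are trained against each other: as the discriminator gets better at spotting fakes, the generator is pushed to produce images with ever more realistic characteristics, which is what makes GAN output look authentic to the human eye.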
The state of California has a law banning the use of human image synthesis technologies to make fake pornography without the consent of the people depicted.
Filip has 15 years of experience in technology journalism. In recent years, he has turned his focus to cybersecurity in his role as Information Security Analyst at Bitdefender.