The UK government has announced plans to introduce a new law that would make it a criminal offence to create sexually explicit "deepfake" images. Deepfakes are typically created using artificial intelligence to produce a manipulated photo or video of an individual's face or body. In recent years, deepfakes have been used to superimpose the likenesses of celebrities onto pornographic films and images; Taylor Swift fell victim to this earlier in the year.
The new law will make it illegal for anyone to create a sexually explicit deepfake image of an adult without their consent, and those who create such material will face a criminal record and an unlimited fine. Sharing such deepfakes is already illegal under last year's Online Safety Act. The new legislation, however, will apply even if the creator never intended to share the image and made it solely to cause "alarm, humiliation or distress to the victim."
The UK government plans to introduce the new offence as part of the Criminal Justice Bill, which is currently making its way through Parliament. Safeguarding Minister Laura Farris said the change sends a clear message that creating such material is "immoral, misogynistic, and criminal." Farris added that deepfake sexual images are unacceptable and despicable, and emphasised that the government would not tolerate them.
Following the recent deepfake images of Taylor Swift, US politicians also voiced concerns about the issue, urging that legislation catch up with advances in AI technology. There are currently no federal laws in the US prohibiting the creation or sharing of deepfakes, although some states have been making strides to tackle the issue. Various US representatives are advocating legislative change, and some have proposed acts that would make sharing deepfake pornography without consent illegal.
Read the full article on NME.