The UK government has been criticised for slow progress towards criminalising the creation of sexually explicit deepfake images. Baroness Owen, a Conservative peer, has proposed a law to make it an offence to create or solicit intimate images of people without their consent. Deepfakes are images or videos digitally altered using artificial intelligence to replace one person's face with another's. Baroness Owen has criticised ministers for "delaying action", saying this was "a betrayal of those who need our protection the most".
The proposed bill would introduce new offences, and those found guilty would face a fine and up to six months in jail. However, it is unlikely to become law without government support. Baroness Owen said the creation of sexually explicit deepfakes was growing rapidly, with so-called "nudification" apps easily available online. She cited research suggesting that one such app had processed 600,000 images in its first three weeks, while the largest site "dedicated to deepfake abuse" received 13.4 million hits per month.
Labour's general election manifesto promised to ban the creation of sexually explicit deepfakes. Justice minister Lord Ponsonby has said the government agreed more needed to be done to protect women from this form of abuse, and that it would deliver its manifesto commitment by bringing forward its own legislation next year. However, Baroness Owen said she was "devastated" the government was not backing her bill, adding: "I know that survivors will feel let down".
Sharing or threatening to share sexually explicit deepfake images is already illegal in England and Wales under the Online Safety Act, which passed last year. The last Conservative government also promised to make creating such images a criminal offence, but its proposals ran out of time to become law when the general election was called in May. Those proposals were criticised for making creation a crime only where the perpetrator intended to cause "alarm, humiliation or distress to the victim", rather than simply where the victim had not consented to their image being used.
Read the full article from the BBC.