The creation of deepfake pornography featuring Taylor Swift prompts renewed demands for legislation in the US.


The rapid and widespread circulation of deepfake pornographic images of Taylor Swift has renewed calls, including from US politicians, to criminalise the practice, in which artificial intelligence is used to create fake but realistic sexual images.

This week, images of the US pop star were shared across social media platforms and viewed by millions of people. They were originally posted on Telegram, and one image of Swift on X was viewed 47 million times before it was taken down.

In a statement, X said it was removing all identified images and taking action against the accounts responsible for posting them.

Yvette D Clarke, a Democratic representative for New York, wrote on X that what had happened to Swift was nothing new: women have been targeted by deepfakes without their consent for years, and advances in artificial intelligence have made creating them easier and cheaper. She said the problem should concern both political parties, and even Swift's fans, and called for a united effort to find a solution.

Some US states have their own laws targeting deepfakes, but there is an increasing push for federal legislation.

In May 2023, the Democratic representative Joseph Morelle introduced the Preventing Deepfakes of Intimate Images Act, which would prohibit the distribution of deepfake pornography without consent. Morelle said such manipulated images and videos can cause lasting emotional, financial and reputational harm, and noted that women are disproportionately affected.

He posted a tweet denouncing the Swift photos as "exploitative in a sexual manner". His proposed legislation has not yet been enacted.

Tom Kean Jr, a Republican representative, said AI technology is advancing faster than the safeguards needed to contain it, as cases such as Swift's, and those of other young people across the country, make clear. He said protective measures were needed to address the trend. Kean Jr has joined Morelle in sponsoring his bill and has also proposed his own AI Labeling Act, which would require all AI-generated content, including seemingly innocuous customer-service chatbots, to be clearly identified as such.

Swift has not commented publicly on the images. Her US publicist had not responded to a request for comment at the time of publication.

Convincing deepfake video and audio have been used to imitate some high-profile men, particularly politicians such as Donald Trump and Joe Biden, and artists such as Drake and the Weeknd. In October 2023, Tom Hanks warned his Instagram followers not to be taken in by a fake dentistry advert featuring his likeness.

However, the technology is used overwhelmingly against women, and often in a sexually objectifying way. According to a 2019 report by DeepTrace Labs, which is cited in the proposed US bill, 96% of deepfake videos are non-consensual pornography.

The situation has worsened significantly in the past year. Fake pornography created by editing an unwilling person's face into an existing pornographic image has long been a problem, but advances in artificial intelligence have opened a new front: AI can now generate entirely new, highly realistic images from simple text prompts.

Prominent women are especially vulnerable. In 2018, Scarlett Johansson addressed the issue of widespread fabricated pornography that uses her image: "Unfortunately, I have faced this situation numerous times. The reality is that attempting to shield oneself from the internet and its immorality is essentially futile, for the most part."

In December 2022, the UK government passed an amendment to the Online Safety Bill that made the creation of nonconsensual deepfake pornography illegal. This amendment also prohibits the use of any explicit images taken without an individual’s consent, including photos known as “downblouse” images.

Dominic Raab, the former deputy prime minister, stated that it is vital to take action in order to safeguard women and girls from individuals who exploit and share intimate photos to harass or degrade them. The proposed changes aim to empower law enforcement and legal authorities to hold these offenders accountable and protect women and girls from this despicable form of abuse.

Source: theguardian.com
