
US Senators are introducing a new bill after those Taylor Swift images

Is this a step in the right direction to prevent women from becoming victims of deepfakes?



A group of US senators has introduced a new bill that would hold people who create and distribute non-consensual, sexually explicit “deepfake” images and videos accountable for their actions.
 
Deepfakes are realistic-looking videos or images created using artificial intelligence (AI). They may superimpose a person’s likeness onto real footage of someone else, or they may be entirely synthetic content in which someone appears to do or say something they never did. In either case, deepfakes are fabrications.
 
This month, Taylor Swift was targeted when deepfakes of her created using AI circulated on X/Twitter. The images spread so widely that Swifties flooded the ‘Taylor Swift’ search term with genuine photos of her to push the fakes down the feeds, until X stepped in and blocked searches for ‘Taylor Swift’ on the platform altogether.
 
Named the ‘Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024’, or the ‘DEFIANCE Act of 2024’, the bill seeks to create a remedy for victims identifiable in a “digital forgery”, which is defined as a visual depiction created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means to falsely appear to be authentic.
 
The bill argues that existing laws have not kept pace with the technology, and makes the remedy enforceable against individuals who produced or possessed the forgery with intent to distribute it, or who produced, distributed, or received the forgery knowing, or recklessly disregarding, that the victim did not consent to the conduct.

As well as outlining the bill’s measures, the press release issued alongside the bill highlighted how distressing it is to be the victim of a deepfake, and how, although the images are fake, the harm they cause is very real. It also made the point that the majority of deepfake material is sexually explicit and is produced without the consent of the person depicted. In fact, a 2019 study found that 96 percent of deepfake videos were nonconsensual pornography.
 
This bill is an important sign of the law catching up with the times, seeking to protect women, their identities, and their reputations from false narratives that can easily be shared and spread. Everyone should be held accountable for their actions, whether physical or digital.