The vast majority of Americans say they support introducing new laws to criminalize the sharing of explicit fake images of people generated by artificial intelligence (AI), a new poll has revealed.
A massive public outcry ensued in recent weeks after AI-generated sexually explicit images of pop superstar Taylor Swift began to circulate and go viral on social media.
Now, a recent poll has shown that approximately 75% of Americans favor the imposition of criminal charges against individuals who create and share so-called “deepfake” and nonconsensual pornography, the Daily Mail reported.
The poll, which coincides with multiple pieces of recently introduced bipartisan legislation to address the issue, found solid majorities in virtually every demographic category in favor of criminalizing such false and explicit imagery.
The Daily Mail joined forces this month with the pollsters at TIPP and asked 1,400 survey respondents whether they agreed or disagreed that “People who share deepfake porn online, like the explicit images of Taylor Swift, should face criminal charges.”
Overall, 75% agreed with the criminalization of deepfake, AI-generated pornography, while 14% disagreed and 11% were unsure either way.
Support was highest among older Americans but still strong among the younger crowd — 84% for those aged 65+ compared to around 66% for those aged 18-24.
It was just a bit higher among Democrats than Republicans, with 81% favoring criminalization among the former and 71% among the latter.
Interestingly enough, the Daily Mail reported that AI-generated deepfake porn involving celebrities is not a particularly new problem.
Such false and nonconsensual imagery has circulated online for years.
However, the issue certainly received a substantial boost in recent weeks after the fake images of Swift, a liberal celebrity, went viral on social media.
Those particular images of Swift, which featured her in various sexually explicit positions while dressed in Kansas City Chiefs garb, were traced back to anonymous forums on 4chan.
On 4chan, similar fake images of dozens of other celebrities have been posted for years.
It was likely the January migration of those Swift images from relatively obscure forums to major social media platforms, combined with significant improvements in the realism of AI-generated imagery, that caught the public’s attention and spurred the incredible outcry.
Lawmakers in Congress, ever attuned to public opinion, moved quickly to seize on the moment and introduce legislation, including a bipartisan bill in the Senate Judiciary Committee dubbed the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or DEFIANCE Act.
The legislation would create a federal avenue for victims to sue entities or individuals responsible for creating and spreading a “digital forgery.”
“Sexually explicit ‘deepfake’ content is often used to exploit and harass women — particularly public figures, politicians, and celebrities,” Senate Judiciary Committee Chairman Dick Durbin (D-IL), a sponsor of the bill, said in a statement.
“This month, fake, sexually explicit images of Taylor Swift that were generated by artificial intelligence swept across social media platforms.”
“Although the imagery may be fake, the harm to the victims from the distribution of sexually explicit ‘deepfakes’ is very real,” he added.
“Victims have lost their jobs, and they may suffer ongoing depression or anxiety.
“By introducing this legislation, we’re giving power back to the victims, cracking down on the distribution of ‘deepfake’ images, and holding those responsible for the images accountable.”
Joining him as a bill sponsor was Sen. Josh Hawley (R-MO), who said, “Nobody — neither celebrities nor ordinary Americans — should ever have to find themselves featured in AI pornography.”
“Innocent people have a right to defend their reputations and hold perpetrators accountable in court.
“This bill will make that a reality.”
Another bipartisan bill, the Preventing Deepfakes of Intimate Images Act, which would make the creation and sharing of fake nonconsensual sexual imagery a federal criminal offense, was introduced last month by Reps. Joe Morelle (D-NY) and Tom Kean (R-NJ).
Morelle and Kean chose not to jump on the Swift bandwagon, however.
Instead, they highlighted the real case of teenage high school girls in New Jersey who’d been victimized by fake AI-generated explicit imagery that was circulated among their classmates.