Fake nude photos of Taylor Swift are circulating online, and while Swifties know the images must be fabricated, they cannot help but feel outrage.

Taylor Swift's devoted following is asking why there isn't more regulation around the nonconsensual fabrication of X-rated photographs, as fake pornographic images of her created with artificial intelligence make the rounds on social media.

The so-called "deepfakes" depict Swift in a variety of sexualized poses at a Chiefs game, alluding to her well-publicized relationship with tight end Travis Kelce.

Though "Taylor Swift AI" was trending on X as of Thursday morning, with more than 58,000 posts on the issue, it was not immediately clear who created the photographs or published them to the platform.

Swifties banded together, flooding the feed with flattering posts about the 34-year-old singer in an attempt to bury the pictures.

"Why is this not regarded as a sexual assault?" one X user asked. "We are discussing a woman's face and body being used for something she most likely wouldn't consent to or feel comfortable with, so how come there are no laws or regulations prohibiting this?"


"I was astounded to see the Taylor Swift AI images with my own eyes. Those AI images are repulsive," remarked another user.

Some die-hard Swift fans called the images' creator "disgusting," saying that incidents such as these "ruin the [AI] technology."

Another person added, "Whoever released them deserves punishment."

Tree Paine, Swift's spokeswoman, did not immediately respond to The Post's request for comment. It remains to be seen whether Swift, with her power and money, can bring this practice down through legal action.

In October, President Joe Biden issued an executive order aimed at further regulating artificial intelligence. Among other things, the order forbids "generative AI from producing child sexual abuse material or producing non-consensual intimate imagery of real individuals," and it offers more control over the technology's application in the creation of biological materials.

The federal government must also provide guidelines "to watermark or otherwise label output from generative AI," according to the directive.


Nonconsensual deepfake pornography has already been made illegal in Texas, Minnesota, New York, Hawaii, and Georgia, though those laws have not stopped the circulation of AI-generated nude images at high schools in New Jersey and Florida, where explicit deepfake images of female students were shared by male classmates.

A bill that would make the nonconsensual distribution of digitally altered pornographic photos a federal crime, punishable by jail time, a fine, or both, was introduced this week by Reps. Joseph Morelle (D-NY) and Tom Kean (R-NJ).

The "Preventing Deepfakes of Intimate Images Act" was referred to the House Committee on the Judiciary, which has not yet decided whether to advance the legislation.

