A source revealed that Taylor Swift might be considering pursuing legal action as sexual AI photos of her make the rounds on social media.

Fans have since flooded the timeline with #ProtectTaylorSwift tweets and have reported the creators and reposters of the explicit photos.

Taylor Swift 'Furious' Over Sexual AI Photos Of Her Online

According to a Daily Mail source, Taylor Swift is considering pursuing legal action over the proliferation of sexual AI photos of her on social media.

"Whether or not legal action will be taken is being decided but there is one thing that is clear: these fake AI generated images are abusive, offensive, exploitative, and done without Taylor's consent and/or knowledge," the source emphasized. "The Twitter account that posted them does not exist anymore. It is shocking that the social media platform even let them be up to begin with."

The insider also reiterated that these Taylor Swift sexual AI photos "must be removed from everywhere they exist and should not be promoted."

"Taylor's circle of family and friends are furious, as are her fans obviously. They have the right to be, and every woman should be."

Beyond the stern warning, the source also called for legislation to be passed to prevent this type of obscenity and slander online.

Independent researcher Genevieve Oh told The Daily Mail that over 143,000 new deepfake videos were posted in 2024, surpassing the total number uploaded in all previous years combined.

As the photos went viral on social media, including Facebook, a Meta spokesperson confirmed that the content has since been taken down and that appropriate action has been taken against the accounts that reposted or shared it.

READ MORE: 'Protect Taylor Swift' Trending: Taylor Swift Sexual AI Photos Went Viral After Buffalo Bills Game

Can the Creators of Taylor Swift's Sexual AI Photos Be Sued?

Experts who spoke to Newsweek said that "more" must be done to address the issue, which they say could affect "anyone, but overwhelmingly is targeted against women."

The AI celebrity porn website Celeb Jihad generated the sexual Taylor Swift AI photos, Newsweek reports. The now-suspended Twitter/X user FloridaPigMan posted the images, which have since been taken down from the platform.

Unfortunately, these perpetrators cannot yet be pursued under federal law, as there are no legal safeguards for the emerging technology. However, The Daily Mail notes that deepfake pornography is illegal in states including Texas, Minnesota, New York, Virginia, Hawaii, and Georgia, while victims in Illinois and California can sue the creators.

At the federal level, New York Representative Joseph Morelle introduced the "Preventing Deepfakes of Intimate Images Act" in May 2023, which is intended to "protect the right to privacy online amid a rise of artificial intelligence and digitally-manipulated content."

READ ALSO: Taylor Swift 'Sexually Assaulted,' Swifties Say With Outrage Over Circulating Viral Images
