WATCH: Microsoft CEO Says AI Needs ‘Guardrails’ After Pornographic Deepfake Images of Taylor Swift Go Viral

Microsoft CEO Satya Nadella said that artificial intelligence needs “guardrails” after pornographic deepfake images of Taylor Swift went viral this week.

Even the White House weighed in on the images, which originated from Celeb Jihad, a website that hosts leaked and faked images of nude celebrities.

In an interview with NBC News’ Lester Holt, Nadella said that “certain norms” must be adhered to regarding AI.

“I’d say two things. One is again, I go back to I think … what is our responsibility? Which is all of the guardrails that we need to place around the technology so that there’s more safe content that’s being produced and there’s a lot to be done there and not being done there,” Nadella said.

“But it is about global, societal … convergence on certain norms … especially when you have law and law enforcement and tech platforms that can come together,” Nadella continued. “I think we can govern a lot more than we think we give ourselves credit for.”

According to a report from The Verge, “Microsoft might have a connection to the faked Swift pictures. A 404 Media report indicates they came from a Telegram-based nonconsensual porn-making community that recommends using the Microsoft Designer image generator. Designer theoretically refuses to produce images of famous people, but AI generators are easy to bamboozle, and 404 found you could break its rules with small tweaks to prompts. While that doesn’t prove Designer was used for the Swift pictures, it’s the kind of technical shortcoming Microsoft can tackle.”

On Friday, White House Press Secretary Karine Jean-Pierre told reporters, “We are alarmed by the reports of the circulation of the … false images.”

“While social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation and non-consensual, intimate imagery of real people,” Jean-Pierre continued.

The press secretary added, “Sadly, though, too often we know that lax enforcement disproportionately impacts women and they also impact girls, sadly, who are the overwhelming targets of online harassment and also abuse.”

According to The Daily Mail, Swift is “furious” and considering legal action over the images.

An unnamed source “close to Swift” told the Daily Mail on Thursday, “Whether or not legal action will be taken is being decided, but there is one thing that is clear: these fake AI-generated images are abusive, offensive, exploitative, and done without Taylor’s consent and/or knowledge.”

“The Twitter account that posted them does not exist anymore. It is shocking that the social media platform even let them be up to begin with,” the source continued. “These images must be removed from everywhere they exist and should not be promoted by anyone.”

The source added that Swift has the right to be furious, “and every woman should be. The door needs to be shut on this. Legislation needs to be passed to prevent this and laws must be enacted.”

Swift has not publicly commented on the images.

The Daily Mail noted, “Nonconsensual deepfake pornography is illegal in Texas, Minnesota, New York, Virginia, Hawaii, and Georgia. In Illinois and California, victims can sue the creators of the pornography in court for defamation.”

The American Civil Liberties Union, the Electronic Frontier Foundation, and the Media Coalition have argued that laws limiting deepfakes may violate the First Amendment.
