San Francisco sues 16 websites that create AI-generated nudes
San Francisco City Atty. David Chiu announced Thursday that his office is suing the operators of 16 A.I.-powered “undressing” websites that help users create and distribute deepfake nude photos of women and girls.
The lawsuit, which city officials said was the first of its kind, accuses the websites’ operators of violating state and federal laws that ban deepfake pornography, revenge pornography and child pornography, as well as California’s unfair competition law. The names of the sites were redacted in the copy of the suit made public Thursday.
Chiu’s office has yet to identify the owners of many of the websites, but officials say they hope to find their names and hold them accountable.
Chiu said the lawsuit has two goals: shutting down these websites and sounding the alarm about this form of “sexual abuse.”
On these websites, users upload photos of fully clothed real people, then artificial intelligence alters the image to simulate what the person would look like undressed. The sites create “pornographic” images without the consent of the persons in the photo, Chiu said during a Thursday morning press conference.
According to the lawsuit, one of the websites promotes the nonconsensual nature of the images, stating, “Imagine wasting time taking her out on dates, when you can just use [redacted website name] to get her nudes.”
The availability of open source A.I. models means that anyone can access and adapt A.I.-powered engines for their own purposes. One result: sites and apps that can generate deepfake nudes from scratch or “nudify” existing images in realistic ways, often for a fee.
Deepfake apps grabbed headlines in January when fake nude images of Taylor Swift circulated online, but many other, far less famous people were victimized before and after the pop star. “The proliferation of these images have exploited a shocking number of women and girls across the globe,” from celebrities to middle school students, Chiu said.
Through its investigation, the city attorney’s office found that the websites in question were visited more than 200 million times in just the first six months of 2024.
Once an image is online, it’s very difficult for victims to determine what websites were used to “nudify” their images because these images “don’t have any unique or identifying marks that link you back to websites,” said Yvonne R. Meré, San Francisco’s chief deputy city attorney.
It’s also very difficult for victims to remove the images from the internet.
Earlier this year, five Beverly Hills eighth-graders were expelled for creating and sharing deepfake nude images of 16 eighth-grade girls, superimposing the girls’ faces onto A.I.-generated bodies.
Chiu’s office said it has seen similar incidents at other schools in California, Washington and New Jersey.
“These images are used to bully, humiliate and threaten women and girls,” Chiu said. “The impact on victims has been devastating on their reputations, their mental health, loss of autonomy and, in some instances, causing individuals to become suicidal.”