Another image on the site showed a group of young teens who appeared to be in middle school: a boy taking a selfie in what looked like a school gymnasium with two girls, who smiled and posed for the picture. The boy’s features were obscured by a Snapchat lens that enlarged his eyes so much that they covered his face.
Captions on the apparently uploaded images indicated that they depict friends, classmates, and romantic partners. “My gf,” reads the caption on one image of a young woman taking a selfie in a mirror.
Many of the photos showed influencers who are popular on TikTok, Instagram, and other social media platforms. Other photos appeared to be Instagram screenshots of people sharing images from their everyday lives. One image showed a young woman smiling with a dessert topped with a celebratory candle.
Several images appeared to show people who were complete strangers to the person who took the photo. One image, taken from behind, depicted a woman or girl who was not posing for a photo but simply standing near what appeared to be a tourist attraction.
Some of the images in the feeds reviewed by WIRED were cropped to remove the faces of women and girls, showing only their chests or crotches.
Huge Audience
Over an eight-day period of monitoring the site, WIRED saw five new images of women appear on the Home feed, and three on the Explore page. Stats listed on the site showed that most of these images accumulated hundreds of “views.” It’s unclear if all images submitted to the site make it to the Home or Explore feed, or how views are tabulated. Every post on the Home feed has at least a few dozen views.
Photos of celebrities and people with large Instagram followings top the list of “Most Viewed” images on the site. The most-viewed people of all time on the site are actor Jenna Ortega, with more than 66,000 views; singer-songwriter Taylor Swift, with more than 27,000 views; and an influencer and DJ from Malaysia, with more than 26,000 views.
Swift and Ortega have been targeted with deepfake nudes before. The circulation of fake nude images of Swift on X in January triggered renewed discussion about the impacts of deepfakes and the need for greater legal protections for victims. This month, NBC reported that Meta had hosted ads for a deepnude app for seven months. The app boasted about its ability to “undress” people, using a picture of Ortega taken when she was 16 years old.
In the US, no federal law targets the distribution of fake, nonconsensual nude images, though a handful of states have enacted their own laws. But AI-generated nude images of minors fall into the same category as other child sexual abuse material, or CSAM, says Jennifer Newman, executive director of the NCMEC’s Exploited Children’s Division.
“If it is indistinguishable from an image of a live victim, of a real child, then that is child sexual abuse material to us,” Newman says. “And we will treat it as such as we’re processing our reports, as we’re getting these reports out to law enforcement.”