AI advances give online predators a new way to sexually exploit victims.

The use of AI "nudification" apps to bully girls is a growing trend among American teenagers.

May 2, 2024

According to a recent report by Vox, a disturbing trend is on the rise among high school students in the US: using AI "nudification" apps to generate and share fake nude photos of classmates without their consent. This form of "image-based sexual abuse" is causing serious harm to young women across the country.

While revenge porn has been a problem for years, the emergence of deepfake technology has made it easier than ever for anyone to create and share these harmful images. As Britt Paris, an assistant professor at Rutgers who has studied deepfakes, explains, these apps allow users to insert the face of anyone - friends, classmates, colleagues - into a nude image without their knowledge or consent.

Unfortunately, this dangerous trend has already harmed many students. At Issaquah High School in Washington and Westfield High School in New Jersey, male students used nudification apps to create fake naked photos of their female classmates. These fabricated images were then shared around their schools, causing emotional distress to the victims.

In response to this concerning trend, several states have passed laws to criminalize the creation and sharing of fake nudes. However, some experts believe that these laws are not enough to combat the issue. Amy Hasinoff, a professor at the University of Colorado Denver, argues that more needs to be done to regulate the apps themselves, as they are the root cause of the problem.

In addition to legislation, there is also a push to regulate the app stores that offer these nudification apps. Both Apple and Google have removed several of these apps from their stores, but many remain available for use. This has raised concerns about the effectiveness of these efforts to protect young women from the harm caused by deepfake images.

The impact of these fake images on the victims cannot be overstated. Fifteen-year-old Francesca Mani, a student at Westfield High School, shared her experience of being targeted by a deepfake image and the emotional toll it took on her. She also pointed out that even though the images are fake, victims are still subjected to shaming and stigmatization, which can have lasting consequences on their lives.

Beyond the state laws already mentioned, a federal bill in the works would allow victims and their families to take legal action against perpetrators. However, with these apps still readily available, many are skeptical about how effective these measures will be.

Until there is greater accountability for the harm these apps cause, this disturbing trend is likely to continue. As Yeshi Milner, founder of Data for Black Lives, points out, these fake images not only damage victims' reputations and future opportunities, but also make them vulnerable to violence. It's clear that more needs to be done to protect young women from the dangers of AI nudification technology.
