Nonconsensual deepfake porn is a big deal


The Effects of Social Constraints on Online Harassment and Cyberbullying: A Primer on a Double Standard for Women

The effects of online harm against women are chilling, and research suggests they are amplified where women face greater social restrictions. In a pioneering study, Katy Pearce and Jessica Vitak found women in Azerbaijan opting out of being online because the potential real-world repercussions of online harassment were simply too high in an honor-based culture with high degrees of surveillance. In other words, women faced an impossible double standard: unable to control their image on social media, yet punished severely for it.

Deepfakes have been used to put women’s faces, without their consent, into often aggressive pornographic videos. It’s a depraved AI spin on the humiliating practice of revenge porn, with deepfake videos appearing so real that it can be hard for victims to prove the footage isn’t really of them.

Safety-by-design measures can help people control their images. For example, Twitter recently allowed people to control how they are tagged in photos. Dating app Bumble rolled out the aptly named Private Detector, an AI-enabled tool allowing users control over which—if any—unsolicited nude images they want to see. Legislation, such as the UK’s proposed Online Safety Bill, can push social media companies to address these risks. The bill is not perfect, but it takes a systems-based approach to regulation and pushes platforms to build better systems to take care of users.

Even this regulatory approach is not guaranteed to keep women from logging off in great numbers in 2023. If they do, not only will they miss the benefits of being online; our online communities will suffer too.

How I Feel about My AI-Generated Self-Portrait: The Viking Version of Me

I’m a futuristic Viking in glinting armor and a silver headpiece that spikes around my head like the wings of an avenging angel. My long hair, more lustrous than it is in real life, billows against a fiery background. I gaze at the camera with a sort of haughty confidence that I have never felt before. Some of my features—my nose, my brows—are slightly elongated, while others are truncated. With no small sense of delight, I note that I sort of look like the person in the picture.

Minutes earlier, in a burst of curiosity and boredom (the ingredients for most forms of social media engagement), I’d run my face through a popular AI effect on TikTok called “AI Portrait.” What did I want to see? Certainly, a more idealized version of myself. That is always the promise of AI portraiture: we come to it to be looked at favorably, the way we linger over a particularly good photo of ourselves. And yet, even though I understood that the portrait was not about reality, I felt a deep sense of disorientation.

There are only a few selfies in my camera roll. I don’t say this as a humblebrag but as a sign of my lack of interest in seeing my own face, of a stubborn inability to see myself. I have been known to smile at my reflection without realizing the face was my own. I’m always surprised by the face that blinks back: a stranger’s face. Looking through my photos, I feel a nagging contradiction, a desire to insist that it isn’t me. It’s as if I don’t quite know what I look like.

Maybe this isn’t an uncommon phenomenon. Many of us have a distorted sense of our own faces, imagining our features to be either more classically attractive or more troll-like than others might judge them to be. Our self-image will always be a bit skewed because the image we hold in our heads is composed of all the iterations that came before—our gawky preteen years, the inadvisably daring haircuts, the polished bridal visage—each layered thinly atop the previous one, until what’s left is not so much a cohesive reference for the present as a weird amalgamation of time and identity.

What we see in our minds is an uncanny valley version of ourselves. In this way, perhaps, our self-image is not so different from what we might get from an AI effect.

The Fake News: On Sexually Explicit Deepfake Videos of Female Celebrities and Twitch Streamers

“Adversaries and strategic competitors,” U.S. intelligence officials warned in 2019, might use this technology “to create convincing—but false—image, audio, and video files to augment influence campaigns directed against the United States and our allies and partners.”

The scenarios are not difficult to imagine: a faked video showing a politician in a compromising position; faked audio of a world leader discussing sensitive information.

The threat no longer seems distant. The recent success of ChatGPT, an A.I. chatbot able to answer questions and write prose, shows how powerful this technology has become.

The long-simmering issue exploded into public view last week when it emerged Atrioc, a high-profile male video game streamer on the hugely popular platform Twitch, had accessed deepfake videos of some of his female Twitch streaming colleagues. He later apologized.

“It’s very, very surreal to watch yourself do something you’ve never done,” Twitch streamer “Sweet Anita” told CNN after realizing last week her face had been inserted into pornographic videos without her consent.

The shock, she said, was like watching a video of yourself being murdered or jumping off a cliff.

Indeed, the very term “deepfake” is derived from the username of an anonymous Reddit contributor who began posting manipulated videos of female celebrities in pornographic scenes in 2017.

Hany Farid, a professor at the University of California, Berkeley, and a digital forensics expert, told CNN he was baffled by how badly people treat one another on the internet, in ways they never would face to face.

“I think we have to start sort of trying to understand, why is it that this technology, this medium, allows and brings out seemingly the worst in human nature? I think we will have to start thinking about how we can be better human beings if we have these technologies ingrained in our lives.”

“It’s all rape culture,” said journalist Samantha Cole, who has reported on deepfakes since 2017. “I don’t know what the actual solution is other than getting to that fundamental problem of disrespect and non-consent and being okay with violating women’s consent.”

Source: https://www.cnn.com/2023/02/16/tech/nonconsensual-deepfake-porn/index.html

Why Is It So Hard? Artificial Intelligence, Erotica, and Silicon Valley’s “Move Fast” Legacy

But there’s skepticism. “We haven’t even solved the problems of the technology sector from 10, 20 years ago,” Farid said, pointing out that the development of artificial intelligence “is moving much, much faster than the original technology revolution.”

“Move fast and break things” was Facebook founder Mark Zuckerberg’s motto back in the company’s early days. As the power and danger of his platform grew, he changed it to “Move fast with stable infrastructure.”

Whether it was willful negligence or ignorance, Silicon Valley was not prepared for the onslaught of hate and disinformation that has festered on its platforms. Tools it had created to bring people together have also been used to divide.

There is a good amount of discussion about ethics in the field of artificial intelligence, but there is concern that, once again, things could get out of hand.

The people who are developing technology have to ask themselves if they are doing the right thing.

“If the harms outweigh the benefits,” Farid asked, “should you carpet bomb the Internet with your technology and put it out there and then sit back and say, ‘well, let’s see what happens next?’”

In the spring of 2001, when I was just 18 years old, I launched a multiyear career as an online porn model and cam girl, giving paying customers access to my naked body in the form of photo sets and weekly cam shows broadcast in the members’ sections of my paysites. The work I produced was not high in quality. The bulk of what I put out into the world was softcore stills, and my cam shows refreshed an image only once every 15 seconds, giving viewers what amounted to a slow-moving digital flipbook. In three and a half years, I shot only two videos, one of them completely silent because of a malfunctioning microphone.

People still paid to see me naked. They joined websites I modeled for. They paid me directly for private shows that would play out on a custom link available to them, and them alone. Nudity, it seemed, was enough to overcome any shortcomings in production value, even when the images were bad or blurry.

But the argument is rarely that only some people will enjoy—let alone pay for—AI porn. The claim is that artificial intelligence will come to dominate the erotica industry, its output winning favor over images produced by human sex workers—as if the only thing anyone looks for in porn is a collection of objects shaped like a human being. My experience in sex work suggests otherwise, and the models themselves don’t seem too concerned about an artificial intelligence script coming for their income stream.