Nonconsensual deepfake porn puts the field under scrutiny


Atrioc Apologizes for Viewing Deepfake Pornographic Videos of Fellow Streamers

Deepfakes have been used to put women’s faces, without their consent, into often aggressive pornographic videos. It’s a depraved AI spin on the humiliating practice of revenge porn, with deepfake videos appearing so real that it can be hard for female victims to prove the footage isn’t really them.

The scenarios are not difficult to imagine: a faked video showing a politician in a compromising position, or faked audio of a world leader discussing sensitive information.

The threat doesn’t seem too distant. The recent success of an A.I. chatbot that can answer questions and write prose is a sign of how powerful this type of technology can be.

The issue exploded into public attention last week when it emerged that Atrioc, a high-profile male video game streamer on the hugely popular platform Twitch, had accessed deepfake videos of his female streaming colleagues. He apologized.

It is very strange, said one woman whose face was used in pornographic videos without her consent, to watch yourself doing something you have never done.

“It’s kind of like if you watched anything shocking happening to yourself,” she said. “Like if you watched a video of yourself being murdered, or a video of yourself jumping off a cliff.”

Ethical A.I. and the Legacy of Mark Zuckerberg’s “Move Fast and Break Things”

The term “deepfake” is derived from the username of a Reddit contributor who in 2017 began posting manipulated videos that placed female celebrities’ faces into porn scenes.

But concerns over nonconsensual pornographic images aren’t exclusive to that community, and the problem threatens to become more commonplace as artificial intelligence develops at breakneck speed and creating deepfake videos becomes ever easier.

“I am baffled by how awful people are to each other on the Internet in a way that I don’t think they would be face to face,” Hany Farid, a professor at the University of California, Berkeley, and a digital forensics expert, told CNN.

“I think we need to start questioning why it is that this technology allows and brings out the worst in people,” he said. If these technologies are going to be ingrained in our lives, he believes, we need to think about how we can be better with them.

“It’s all rape culture and I don’t know what the solution is other than getting to that fundamental problem of disrespect and non-consent and being okay with violating women’s consent.”

But there is skepticism that the lesson has been learned. “We haven’t solved the problems of the technology sector from 10, 20 years ago,” Farid said, “and the development of artificial intelligence is moving much, much faster than the original technology revolution.”

“Move fast and break things” was Facebook founder Mark Zuckerberg’s motto in the company’s early days. He changed the slogan to “Move fast with stable infrastructure” as the power of his platform came into focus.

Whether through negligence or ignorance, Silicon Valley was not prepared for the onslaught of hate that has spread on its platforms. The same tools it built to bring people together have been weaponized to divide them.

There has been much discussion about ethical artificial intelligence, but there is concern that, once again, things could get out of hand.

“The people who are developing these technologies – the academics, the people in the research labs at Google and Facebook – you have to start asking yourself, ‘Why are you developing this technology?’” Farid suggested.

“If the harms outweigh the benefits, should you carpet bomb the Internet with your technology and put it out there and then sit back and say, ‘well, let’s see what happens next?’”

For the women targeted, these conversations miss the point. They brush aside legitimate harm in favor of bad-faith arguments, and centering “real” versus “fake” diminishes the lasting impact the images have on the streamers and their careers. Blaire says that the women are hurting. Everyone involved in this is hurting.

Blaire, whose last name WIRED has not released for privacy reasons, began to hear from other women after she learned of the deepfakes. It was a long time before she saw the screenshots herself; when she did, it felt like a slap in the face. “Even though it’s not my body, it might as well be. It was the same feeling,” she said: seeing a body that isn’t yours being represented as yours. She had struggled with body dysmorphia for years, and the photos caused her to relive it; she threw up her lunch for the first time in a long while. Even when you are used to being sexualized online, she said, it feels very dirty to see people sexualizing you against your will, without your consent.