Humans Can Outcompete Artificial Intelligence When It Comes to OnlyFans

The Online Safety of Women: The Impact of Digital Sexual Exploitation and Harassment on Women’s Social Media Lives and Work

Unless we act, 2023 will be the year that women leave the internet. Women already face enormous risks online. A Pew Research Center survey in the US found that one-third of young women report having been sexually harassed online, and that women report being more upset by these experiences and see them as a bigger problem than men do. A UNESCO study of journalists found that 73 percent of the women surveyed had experienced online violence, and 20 percent said they had been physically attacked or abused offline in connection with online abuse. Women journalists reported self-censoring, withdrawing from online interactions, and avoiding engagement with their audiences. At one point, the journalist Maria Ressa received more than 90 hate messages per hour after she wrote about the online abuse she faced. After writing about campaign finance improprieties around a presidential candidate, a Brazilian journalist received threats of physical confrontation as well as hundreds of thousands of harassing messages, and had to cancel all her public events for a month. What both women shared was that they dared to question power while being visible on social media.

From the very beginning, the creators of deepfakes used the technology to make pornographic material of women without their consent, according to the Vice reporter who first covered it.

There are answers: Better safety-by-design measures can help people control their images and their messages. For example, Twitter recently gave people more control over how they are tagged in photos. Dating app Bumble rolled out the aptly named Private Detector, an AI-enabled tool that lets users control which, if any, unsolicited nude images they want to see. Legislation such as the UK’s Online Safety Bill can push social media companies to address these risks. The bill is far from perfect, but it does require platforms to assess these risks and develop solutions, such as better human content moderation and better systems for supporting users.

Even this regulatory approach is not guaranteed to keep women from logging off in great numbers in 2023. If they do, our online communities will suffer, because they will lose the benefit of women’s participation.

I Ran My Face Through an Artificial Intelligence Portrait Generator: Why Did I Want to See My Face, and Why Do I Sort of Look Like Michelle Yeoh?

I’m a futuristic Viking in glinting armor and a silver headpiece that spikes around my head like the wings of an avenging angel. My hair is more lustrous than it is in real life, and it billows against a fiery background. I’m staring at the camera with bravado, exuding the sort of haughty confidence I’m pretty sure I’ve never felt before. My brows are truncated and some of my features are slightly shortened. And I note—with no small sense of delight—that I sort of look like Michelle Yeoh?

I ran my face through an artificial intelligence portrait generator, the kind that went viral on TikTok, designed to turn selfies into stylized pictures. What did I hope to see? A more idealized version of myself, I suppose. That is the promise of AI portraiture: we come to it for flattery, not for accuracy. I expected to preen over my image, privately, the way many of us do when we encounter a particularly good representation of ourselves. And yet, even knowing that the portrait had nothing to do with reality, I felt a jolt of surprise.

I have very few selfies in my camera roll. I state this not as a humblebrag, but as evidence of my lack of interest in seeing my own face. I can’t quite see myself at all. I smile politely at my reflection in the mirrors of fancy restaurants, not quite registering it as an extension of myself. When I pass a store window, I’m always surprised by the face that blinks back. Someone has a face, I think, before realizing it is mine. And when glancing through photos of myself, I feel a sense of rising contradiction, a desire to proclaim, “That’s not me.” It’s as if I don’t quite know what I look like.

Maybe this isn’t an uncommon phenomenon. Our self-image is distorted: we imagine our features to be more attractive, or more troll-like, than other people might judge them. Our self-perception is also skewed by all the previous versions of ourselves: the shirtless preteen years, the daring haircuts, the polished bridal visage.

What we see in our minds is an uncanny-valley version of ourselves. Maybe our self-image isn’t so different from what we might get from an artificial neural network.

The Digital Consequences of Deepfake Pornography: The Streamer Atrioc, His Twitch Colleagues, and the Forensics Expert Hany Farid

US intelligence officials warned in the last year that strategic competitors could use this technology to create false image, audio, and video files to amplify influence campaigns against the US and its allies.

It’s not hard to imagine the potential harm: a fake video showing a politician in a compromising position, or fake audio of a world leader saying something they never said.

The threat doesn’t seem too distant. The recent viral success of ChatGPT, an A.I. chatbot that can answer questions and write prose, is a reminder of how powerful this kind of technology can be.

Last week, the issue exploded into public view when it emerged that Atrioc, a high-profile male video game streamer, had accessed deepfake videos of some of his female Twitch streaming colleagues. He later apologized.

“It’s very, very surreal to watch yourself do something you’ve never done,” Twitch streamer “Sweet Anita” told CNN after realizing last week her face had been inserted into pornographic videos without her consent.

“It feels like if you watched anything shocking happen to yourself. Like, if you watched a video of yourself being murdered, or a video of yourself jumping off a cliff,” she said.

Indeed, the very term “deepfake” is derived from the username of an anonymous Reddit contributor who began posting manipulated videos of female celebrities in pornographic scenes in 2017.

Hany Farid, a professor at the University of California, Berkeley, and a digital forensics expert, told CNN that he is dismayed by how the internet seems to bring out the worst in people.

“I think we have to start sort of trying to understand, why is it that this technology, this medium, allows and brings out seemingly the worst in human nature? And if we’re going to have these technologies ingrained in our lives the way they seem to be, I think we’re going to have to start to think about how we can be better human beings with these types of devices,” he said.

Cole, the Vice reporter, said that because nonconsensual deepfakes are rooted in the same disregard for women’s consent that drives rape culture, she doesn’t know what the solution is.

Source: https://www.cnn.com/2023/02/16/tech/nonconsensual-deepfake-porn/index.html

Silicon Valley’s AI Arms Race: Move Fast and Break Things, Again?

But there’s skepticism. The development of artificial intelligence is moving much, much faster than the last technology revolution did, and the sector still hasn’t solved the problems it created 10 or 20 years ago.

Mark Zuckerberg famously told his company to move fast and break things. Once the damage done by his platform came into focus, he changed the motto to “Move fast with stable infrastructure.”

Silicon Valley was not prepared for the onslaught of hate and misinformation that has arisen on its platforms. The same tools it had built to bring people together have also been weaponized to divide.

And while there has been a good deal of discussion about “ethical AI,” as Google and Microsoft look set for an AI arms race, there’s concern things could be moving too rapidly.

The people who are developing these technologies have to start asking themselves, “Why are we developing this technology?”

“If the harms outweigh the benefits, should you carpet bomb the Internet with your technology and put it out there and then sit back and say, ‘well, let’s see what happens next?’”

When I was just 18 years old, I started my career as a cam girl and an online porn model, giving paying customers access to my nude body in the form of photo sets and weekly cam shows broadcast in the members section of my paysites. By today’s standards, the work I did was laughably low-fi. Most of what I put into the world was softcore stills. Even my cam shows only offered viewers the chance to watch an image refresh every 15 seconds or so, basically providing access to a slow-moving digital flipbook. I only made two videos over the course of three and a half years, and one of them was silent because of a malfunctioning microphone.

People still paid to see me naked. They joined the websites I modeled for. They paid me for private shows that played out for them, and them alone. Nudity, it seemed, was enough to overcome any shortcomings in production value: The images could be bad or blurry or low-res, but as long as there were tits available to view, I had a marketable product.

This isn’t to say that no one will ever enjoy AI porn. People who purchase RealDolls are apparently unbothered by the love dolls’ refusal to ever leave the uncanny valley. Men who enthusiastically post images of AI models, undeterred by their wonky teeth and other bizarre AI tells, probably aren’t faking their enthusiasm for these artificial women. AI erotica could well find a useful niche among people who don’t want to pay for porn, or who feel more comfortable masturbating to an image of someone who doesn’t actually exist.
