Privacy fears about AI trend for users

Shaun M Jooste


Artificial intelligence is starting to show us what it’s made of. I’m sure that by next year, we’ll have AI developing video games for us with as little human intervention as possible. That means more profit for companies and developers, and fewer expenses paying people to do that work. Before that fear becomes reality, we need to deal with privacy and copyright issues.

Today, I want to spend a peaceful Sunday morning chatting with you about the latest privacy fears around AI. Despite these concerns, celebrities are using the latest trends to make magical avatars with their faces. Do we really care about our privacy if we’re paying companies that then get to use our images for free?

Giving AI rights to use your face whenever it wants

Let’s use the latest Lensa AI magical avatar trend as an example. If you want to look adorable in a fantasy theme, you’ll pay about $8 for a set of 50 avatars. Not too long ago, it was about $4, but clearly the company behind the app, Prisma Labs, has realized how much money it can make from this.

The primary issue at the moment is how Prisma Labs uses your facial features. The company claims it deletes your photos after the AI has been trained to recognize your face. But does deleting the images really matter when the model can now draw your likeness without any further input from you?

That’s like handing someone the recipe for a secret ingredient and being told they’ll destroy it once they know how to make it. It’s kind of pointless. Then there are the company’s terms and conditions. While you retain all rights to the AI-generated art, Prisma Labs gets royalty-free rights to any photos you send it. Its AI model can keep making avatars and portraits of you, and there’s nothing you can do about it.


Paying for AI art but not getting compensated for its use

Here’s another catch I don’t think many people have picked up on. You pay the company $8 to use the AI technology, and yet you’re agreeing that it doesn’t pay you whenever it produces other artwork with your face on it. Think about it this way: there could one day be an animated movie or a video game featuring a character that looks like you, and you won’t get paid a cent.

In one way, companies like Prisma Labs have found a way to maximize profits and reduce expenses. They can make a fortune off this technology without paying anyone royalties. But what about all the copyrighted digital creations whose artists will now lose out on the compensation they would usually earn from royalties?

Without their permission, AI is learning to replicate these pieces of art while altering them slightly so they aren’t exact copies. All it takes is putting your face on the Mona Lisa, and the AI won’t be sued for it. Why? Because the company takes no responsibility for what its AI produces, and no one is holding it accountable.


Issues with sexuality and color

Let me move on to another privacy fear among AI users that has arisen recently. For some reason, the Lensa avatar technology is sexualizing many images, especially those of women. Where subjects may have had slender, petite forms before, they now have massive bosoms.

Some celebrities are happy about it, others are crying out against it, and feminist concerns are rising because of it. It goes further than that, though. Sure, Prisma Labs has indicated it has taken steps to reduce how much nude art is randomly created, but that doesn’t stop the AI from putting your face on someone else’s nude body.

As history shows, this type of technology can be used against you by someone who really dislikes you or is insanely jealous of you. Forget Photoshopping your head onto these bodies; AI can now make it look far more realistic.

There’s also the issue of color and race. Cases have been arising where people of color end up with whiter skin in their avatars. In response, some people are commenting that we need to feed the AI more images of people of color so that it can “learn” that white isn’t the only human state on Earth.

Despite the outcry, people still turn to AI trends

Humans are an odd breed. Even though there’s a clear outcry against AI and these rising machine-learning trends, we’re still feeding the technology our data. We’re too eager to see what we’ll receive when we pay $8 for 50 avatar images.

Let’s be honest, though. Not all of us are celebrities. As a matter of fact, they’re in the minority. Too many of us are all too happy to grab a moment of fame by having our faces placed on sexy or artistic avatar bodies for a bit of attention. That’s why companies like Prisma Labs will continue to make a fortune with apps like Lensa. They’re banking on your desire to look absolutely fantastic.


Privacy is in your hands

At the end of the day, your privacy is in your hands. If you’re willing to pay $8 for a machine to learn how to draw your face and reproduce it at any time, that’s on you. If you one day learn that Sony is making millions from a video game whose main character has your face, feel free to cry into your pillow over the missed royalties.

The only one ultimately responsible for keeping your private photos private is you. If you’re going to share your family photos on social media, expect AI to make use of them without your permission. For now, there’s nothing we can do to stop it… yet.

Shaun M Jooste


I live in Cape Town, South Africa, and am a father of two. I’ve been gaming almost all my life, with plenty of experience writing reviews and articles on the latest titles. After 15 years in local government performing facilities-management functions, I became CEO of my own company, Celenic Earth Publications, which publishes authors’ books, including my own. I’m a published author of horror and fantasy novels, and I also dabble in game and movie scriptwriting.
