This AI tool generates your creepy lookalikes to trick facial recognition

If you’re worried about facial recognition firms or stalkers mining your online photos, a new tool called Anonymizer could help you escape their clutches.

The app was created by Generated Media, a startup that provides AI-generated pictures to customers ranging from video game developers creating new characters to journalists protecting the identities of sources. The company says it built Anonymizer as “a useful way to showcase the utility of synthetic media.”

The system was trained on tens of thousands of photos taken in the Generated Media studio. The pictures are fed to generative adversarial networks (GANs), which create new images by pitting two neural networks against each other: a generator that creates new samples and a discriminator that examines whether they look real.

The process creates a feedback loop that eventually produces lifelike profile photos.
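The generator-versus-discriminator loop described above can be sketched in miniature. The following toy example is an illustration of the GAN idea, not Generated Media's actual pipeline: instead of photos, it trains a one-parameter affine "generator" against a logistic-regression "discriminator" on 1D numbers drawn from a Gaussian. All names and hyperparameters here are made up for the demo.

```python
import numpy as np

# Toy 1D GAN: the "generator" is an affine map g(z) = wg*z + bg, and the
# "discriminator" is logistic regression d(x) = sigmoid(wd*x + bd).
# Real data comes from N(4.0, 0.5); the generator learns to mimic it.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

wd, bd = 0.0, 0.0   # discriminator parameters
wg, bg = 1.0, 0.0   # generator parameters
lr, batch = 0.05, 32

for step in range(2000):
    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    real = rng.normal(4.0, 0.5, batch)
    fake = wg * rng.normal(0.0, 1.0, batch) + bg
    d_real = sigmoid(wd * real + bd)
    d_fake = sigmoid(wd * fake + bd)
    wd += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    bd += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator step (non-saturating loss): push D(fake) toward 1,
    # i.e. make the fakes look real to the discriminator.
    z = rng.normal(0.0, 1.0, batch)
    d_fake = sigmoid(wd * (wg * z + bg) + bd)
    wg += lr * np.mean((1 - d_fake) * wd * z)
    bg += lr * np.mean((1 - d_fake) * wd)

# Draw samples from the trained generator.
samples = wg * rng.normal(0.0, 1.0, 1000) + bg
print(f"generated sample mean: {samples.mean():.2f} (real mean is 4.0)")
```

Real systems like the one the article describes use deep convolutional networks on images rather than two scalar parameters, but the adversarial feedback loop is the same shape.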

Credit: 2020 Generated Media, Inc.
The images are tagged, categorized, and added to the training dataset.

You have to buy a license to use Anonymizer for commercial purposes, but the tool is free for personal usage — as long as you don’t violate the terms and conditions.

Just upload a clear photo of your face looking straight ahead, and the system will spit out a grid of 20 doppelgängers. You could then pick one that resembles you and use it in place of the social media profiles scanned by the likes of Clearview AI.

Unlike many of the facial recognition systems it could trick, Anonymizer seemed to work fairly well on a diverse range of faces during our brief testing. But Generated Media admits it needs to do better:

Our goal is to represent every person regardless of age, sex, ethnicity, or physical characteristics. The reality of generating consistent content with AI is that training data needs to be available for our systems to learn from. This requires sourcing a large number of models and takes time. After running a studio for the last two years, we have learned it can be difficult to find diverse models with unique features who are also willing to shoot stock photography. This is not a challenge we are backing down from.

However, many of the clones bear little resemblance to the face they replace. In some cases (including mine), the system seems to suspect that the uploader is a child. I’m gonna take it as a compliment.

Credit: Generated Media
After the faces are created, further machine learning processes identify and remove flaws.

Nonetheless, Anonymizer could be a useful way of dodging facial recognition systems. There are still risks of it being deployed for nefarious purposes, even though Generated Media prohibits its use for any illegal activity, such as defamation, impersonation, or fraud.

The tool could also accelerate our descent into a counterfeit world. But if you can’t beat ’em, I guess you might as well join ’em in the simulated reality.

HT – Thomas Smith

Source: thenextweb.com

