
How to prevent AI from recognizing your face in selfies

Fawkes has already been downloaded nearly half a million times from the project website. One user has also built an online version. There is no mobile app yet, but there is nothing stopping somebody from making one, says Wenger.

Fawkes can keep a new facial recognition system from recognizing you – the next Clearview, say. But it won’t sabotage existing systems that have already been trained on your unprotected images. Defenses are improving all the time, however. Wenger thinks that a tool developed by Valeriia Cherepanova and colleagues at the University of Maryland, one of the teams presenting at ICLR this week, might address this issue.

Called LowKey, the tool expands on Fawkes by applying perturbations to images based on a stronger kind of adversarial attack, one that also fools pretrained commercial models. Like Fawkes, LowKey is also available online.
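
The common idea behind Fawkes and LowKey is an adversarial perturbation: a small, carefully optimized change to the pixels that pushes a face-recognition model toward the wrong features. Neither project’s code is reproduced here, but the general shape of such a perturbation can be sketched with a generic PGD-style loop; the stand-in embedding network, step size, and pixel budget below are all placeholder assumptions, not the tools’ actual settings.

```python
import torch
import torch.nn as nn

def cloak(image: torch.Tensor, embed: nn.Module,
          epsilon: float = 0.03, steps: int = 40, lr: float = 0.005) -> torch.Tensor:
    """Return a copy of `image` with a small, bounded perturbation that pushes
    its embedding away from the original face embedding (PGD-style sketch)."""
    target = embed(image).detach()
    delta = torch.zeros_like(image, requires_grad=True)
    for _ in range(steps):
        loss = -torch.norm(embed(image + delta) - target)  # push the embedding away
        loss.backward()
        with torch.no_grad():
            delta -= lr * delta.grad.sign()
            delta.clamp_(-epsilon, epsilon)                   # keep the change subtle
            delta.copy_((image + delta).clamp(0, 1) - image)  # keep pixel values valid
        delta.grad.zero_()
    return (image + delta).detach()

# Toy usage with placeholder components; a real tool would optimize against a
# strong pretrained face-embedding network with perceptual constraints.
embed = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 128))
photo = torch.rand(1, 3, 64, 64)
protected = cloak(photo, embed)
```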

Ma and colleagues have added an even bigger twist. Their approach, which turns images into what they call unlearnable examples, makes an AI ignore your selfies entirely. “I think it’s great,” says Wenger. “Fawkes trains a model to learn something wrong about you, and this tool trains a model to learn nothing about you.”

Photos of me posted online (above) are turned into unlearnable examples (below) that a facial recognition system will ignore. (Image courtesy of Daniel Ma, Sarah Monazam Erfani and colleagues)

Unlike Fawkes and its successors, unlearnable examples are not based on adversarial attacks. Instead of introducing changes to an image that force an AI to make a mistake, Ma’s team adds tiny changes that trick the AI into ignoring the image during training. When shown the image later, its assessment of what’s in it will be no better than a random guess.

Unlearnable examples may prove more effective than adversarial attacks, since they cannot be trained against. The more adversarial examples an AI sees, the better it gets at recognizing them. But because Ma and colleagues stop an AI from training on the images in the first place, they claim this won’t happen with unlearnable examples.
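
One way to get this “learn nothing” effect is error-minimizing noise: rather than making the model err, the noise drives the training loss toward zero, so the model finds nothing left to learn from the photo. The loop below is a simplified illustration of that idea, not the team’s released code; the tiny classifier, random data, and pixel budget are placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def error_minimizing_noise(images, labels, model, epsilon=0.03,
                           outer_steps=20, noise_steps=5, noise_lr=0.01):
    """Alternately train the model a little, then update the noise so the
    training loss shrinks, leaving no useful signal in the noisy photos."""
    delta = torch.zeros_like(images, requires_grad=True)
    model_opt = torch.optim.SGD(model.parameters(), lr=0.1)
    for _ in range(outer_steps):
        # (1) One short training step on the current noisy images.
        model_opt.zero_grad()
        F.cross_entropy(model(images + delta.detach()), labels).backward()
        model_opt.step()
        # (2) Update the noise to MINIMIZE the loss (the opposite of an
        #     adversarial attack), so the images look "already learned".
        for _ in range(noise_steps):
            loss = F.cross_entropy(model(images + delta), labels)
            loss.backward()
            with torch.no_grad():
                delta -= noise_lr * delta.grad.sign()
                delta.clamp_(-epsilon, epsilon)  # keep the change imperceptible
            delta.grad.zero_()
    return (images + delta).clamp(0, 1).detach()

# Toy usage with placeholder data and a tiny classifier:
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
images, labels = torch.rand(8, 3, 32, 32), torch.randint(0, 10, (8,))
unlearnable = error_minimizing_noise(images, labels, model)
```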

Wenger is resigned to an ongoing battle, however. Her team recently noticed that Microsoft Azure’s facial recognition service was no longer fooled by some of their images. “It suddenly somehow became robust to cloaked images we had generated,” she says. “We don’t know what happened.”

Microsoft may have changed its algorithm, or the AI may simply have seen so many images from people using Fawkes that it learned to recognize them. Either way, Wenger’s team released an update to their tool last week that works against Azure too. “It’s another cat-and-mouse race,” she says.

For Wenger, this is the story of the internet. “Companies like Clearview are capitalizing on what they see as freely available data and using it to do whatever they want,” she says.

Regulation may help in the long run, but that won’t stop companies from exploiting loopholes. “There will always be a disconnect between what is legally acceptable and what people actually want,” she says. “Tools like Fawkes fill that gap.”

“Let’s give people the power they didn’t have before,” she says.
