The world of artificial intelligence is one of today’s fastest-growing areas of technology, and its newest development shows no sign of slowing down. According to The Verge, a technique called a generative adversarial network (GAN) makes it possible for AI to create fake images of human faces that look eerily real.
The Nvidia researchers who developed this technology explained in a report that their architecture “leads to an automatically learned, unsupervised separation of high-level attributes (e.g., pose and identity when trained on human faces) and stochastic variation in the generated images (e.g., freckles, hair), and it enables intuitive, scale-specific control of the synthesis.”
A GAN, writes Open Culture, uses an AI algorithm “that pits multiple neural networks against each other in a kind of machine-learning match.” This adversarial training allows the AI to create images of faces that are uncannily similar to the real thing. Engineers can even manipulate and customize what the faces will look like by applying style transfer to face generation.
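To make that “machine-learning match” concrete, here is a minimal sketch of the adversarial setup in Python, assuming PyTorch; it illustrates the general GAN idea only, not Nvidia’s actual StyleGAN architecture, and the network sizes, image dimensions, and stand-in data are all placeholders.

```python
# A minimal, illustrative GAN training loop (assumes PyTorch is installed).
# This is a sketch of the adversarial setup, not Nvidia's StyleGAN.
import torch
import torch.nn as nn

LATENT_DIM = 64    # size of the random "noise" vector the generator starts from
IMG_DIM = 28 * 28  # flattened image size (e.g., a 28x28 grayscale thumbnail)

# The generator maps random noise to a synthetic image.
generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 256), nn.ReLU(),
    nn.Linear(256, IMG_DIM), nn.Tanh(),
)

# The discriminator scores an image: real (1) or fake (0).
discriminator = nn.Sequential(
    nn.Linear(IMG_DIM, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
)

loss_fn = nn.BCEWithLogitsLoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def train_step(real_images: torch.Tensor) -> None:
    batch = real_images.size(0)
    noise = torch.randn(batch, LATENT_DIM)
    fake_images = generator(noise)

    # Discriminator turn: learn to tell real images from generated ones.
    d_opt.zero_grad()
    d_loss = (loss_fn(discriminator(real_images), torch.ones(batch, 1)) +
              loss_fn(discriminator(fake_images.detach()), torch.zeros(batch, 1)))
    d_loss.backward()
    d_opt.step()

    # Generator turn: learn to fool the discriminator into scoring fakes as real.
    g_opt.zero_grad()
    g_loss = loss_fn(discriminator(fake_images), torch.ones(batch, 1))
    g_loss.backward()
    g_opt.step()

# Example: one step on a batch of random stand-in "real" images in [-1, 1].
train_step(torch.rand(16, IMG_DIM) * 2 - 1)
```

The essence of the technique is the two opposing objectives: the discriminator is rewarded for catching fakes, while the generator is rewarded for slipping them past it, so each network’s improvement forces the other to improve.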
Many have criticized the rapid development of AI and its impact on our safety, privacy, and the spread of misinformation and propaganda. These faux photos are no different: if the technology becomes mainstream, it poses serious risks in fields like law enforcement and journalism. Nvidia did not address such concerns in its paper.
Though these “photos” are rather realistic, quite a few telltale signs can help you determine that they’re fake, such as asymmetrical ear placement or hair that looks painted on.
Many experts in image authentication, however, are already ahead of the game with things like camera apps that stamp pictures with geocodes to verify when and where they were taken. “Clearly, there is going to be a running battle between AI fakery and image authentication for decades to come,” wrote The Verge. “And at the moment, AI is charging decisively into the lead.”
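For a sense of how such a geocode stamp might work, here is a rough Python sketch that writes GPS coordinates and a timestamp into a photo’s EXIF metadata. It assumes the third-party piexif library, and the stamp_photo helper and file name are hypothetical; a real authentication app would also cryptographically sign the metadata so it could not simply be rewritten, which this sketch omits.

```python
# Sketch of a "geocode stamp": embed GPS coordinates and a timestamp in a
# JPEG's EXIF metadata. Assumes the third-party piexif library
# (pip install piexif). Real authentication tools also sign this data.
import piexif

def to_dms_rationals(decimal_degrees: float):
    """Convert decimal degrees to EXIF's (degrees, minutes, seconds) rationals."""
    d = int(abs(decimal_degrees))
    m = int((abs(decimal_degrees) - d) * 60)
    s = round(((abs(decimal_degrees) - d) * 60 - m) * 60 * 100)
    return ((d, 1), (m, 1), (s, 100))

def stamp_photo(path: str, lat: float, lon: float, timestamp: str) -> None:
    """Hypothetical helper: write location and time into the photo at `path`."""
    exif_dict = piexif.load(path)
    exif_dict["GPS"] = {
        piexif.GPSIFD.GPSLatitudeRef: b"N" if lat >= 0 else b"S",
        piexif.GPSIFD.GPSLatitude: to_dms_rationals(lat),
        piexif.GPSIFD.GPSLongitudeRef: b"E" if lon >= 0 else b"W",
        piexif.GPSIFD.GPSLongitude: to_dms_rationals(lon),
    }
    exif_dict["0th"][piexif.ImageIFD.DateTime] = timestamp.encode()
    piexif.insert(piexif.dump(exif_dict), path)  # rewrites the file in place

# Example: stamp a photo as taken in New York on a given date.
stamp_photo("photo.jpg", 40.7128, -74.0060, "2018:12:17 09:30:00")
```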