Scientists have developed an artificial intelligence tool that can synthesise fake human fingerprints and potentially fool biometric authentication systems.
Fingerprint authentication systems are a widely trusted, ubiquitous form of biometric authentication, deployed on billions of smartphones and other devices worldwide.
However, researchers from New York University (NYU) in the US revealed a surprising level of vulnerability in these systems.
Using a neural network, they evolved a fake fingerprint that could fool biometric authentication for one in five people.
Much the same way that a master key can unlock every door in a building, these “DeepMasterPrints” use artificial intelligence to match a large number of prints stored in fingerprint databases, and could thus theoretically unlock a large number of devices.
The work builds on earlier research led by Nasir Memon, a professor at NYU. Memon, who coined the term “MasterPrint,” described how fingerprint-based systems use partial fingerprints, rather than full ones, to confirm identity.
Devices typically allow users to enroll several different finger images, and a match for any saved partial print is enough to confirm identity.
Partial fingerprints are less likely to be unique than full prints, and Memon’s work demonstrated that enough similarities exist between partial prints to create MasterPrints capable of matching many stored partials in a database.
The NYU researchers took this concept further, training a machine-learning algorithm to generate synthetic fingerprints that serve as MasterPrints.
“Fingerprint-based authentication is still a strong way to protect a device or a system, but at this point, most systems don’t verify whether a fingerprint or other biometric is coming from a real person or a replica,” said doctoral student Philip Bontrager.
“These experiments demonstrate the need for multi-factor authentication and should be a wake-up call for device manufacturers about the potential for artificial fingerprint attacks,” he said.