After the Aadhaar Virtual ID, yet another security feature has been announced by the UIDAI: Face Authentication. It is designed to address the problem of failed biometric authentication, as an alternative for people who have difficulty authenticating due to factors such as worn-out fingerprints, or biometric data that changes with old age, harsh working conditions, accidents and the like. It is good to see the UIDAI attempt to resolve the issues in its system; unfortunately, it has sought to address them with something even more problematic.

UIDAI’s face authentication system

The details of the face authentication system were released via a UIDAI Circular. It has been introduced as an addition to the existing modes of authentication, which include demographic data, biometric data in the form of a fingerprint or iris scan, and the OTP. The UIDAI is to use the face photographs already collected to extract the facial data required for authentication. The Circular also implicitly acknowledges the ease with which face data can be spoofed: face authentication is mandated for use only in combination with another factor of authentication, either the fingerprint, the iris scan or the OTP.
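The mandated combination amounts to a simple rule: a face match on its own is never sufficient. The sketch below is illustrative only; the Circular publishes no API, so the function and parameter names are hypothetical.

```python
# Hypothetical sketch of the Circular's rule that face authentication
# must be combined with a second factor; all names are illustrative.

SECOND_FACTORS = {"fingerprint", "iris", "otp"}

def authenticate(face_matched, second_factor):
    """Accept only when the face matches AND a valid second factor
    (fingerprint, iris scan or OTP) is also present."""
    return face_matched and second_factor in SECOND_FACTORS

print(authenticate(True, None))    # False: face match alone is rejected
print(authenticate(True, "otp"))   # True: face match plus OTP
print(authenticate(False, "otp"))  # False: the face must still match
```

The circularity the article goes on to discuss is visible in this rule itself: the check can never succeed unless a second, working factor already exists.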

The technicalities and other details of how the system will work, however, have not been given in the Circular. Even before going into the privacy and security issues that arise with yet another form of biometric authentication, the basic details in the Circular alone point to many implementation problems.

The fundamental issue with the proposed system

A fundamental issue that arises is that the face authentication system is meant as an alternative mode of authentication, to be used when existing modes fail. For example, a person whose fingerprint recognition is failing on the AuA’s authentication device can turn to face authentication. The issue, then, is that face authentication can only be used in combination with another form of authentication: the fingerprint, iris scan or OTP. If another form of authentication is functional, why does the need for face authentication arise in the first place?

This fundamental issue questions the efficacy of face authentication as an alternative form of authentication. Removing the second factor of authentication from the use of face authentication will have disastrous implications for privacy. On the other hand, if the second factor of authentication is functional, then face authentication in itself is unnecessary.

Moreover, if, in the example above, the person’s fingerprint is failing and they have no other functional mode of authentication, is face authentication to be accepted along with the faulty fingerprint?

If such is the case, this creates a major security issue, since face data is extremely easy to spoof, easier even than other forms of biometric data.

Cheating facial recognition systems is easy

Looking at the technology itself, facial recognition and mapping are already in widespread use by private companies: Apple’s Face ID is one form, Facebook’s photo-tagging another. Even these sophisticated systems are known to fail, whether by tagging a similar-looking person, or by being cheated with photos, videos, or even twins, siblings and other lookalikes.

Implementation issues with the use of a single photograph

The face authentication scheme proposed by the UIDAI is likely to be even more basic. In the absence of details in the Circular, it can be assumed that a basic face map will be created from the existing photograph alone. This is unlike Face ID, which takes an in-depth, 3D map of the face, or even Facebook, which builds a face map from the many photographs a person uploads. When even these more sophisticated technologies have been cheated and found faulty, a map derived from a single photo, like the one on the Aadhaar card, is simple enough to spoof.
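To see why a single-photo template is weak, consider how such matching typically works: an embedding vector is extracted from the stored photograph, and a probe image is accepted if its embedding is similar enough. The sketch below assumes cosine similarity and a fixed threshold; none of this is specified in the Circular, so it is a generic illustration rather than the UIDAI's actual method.

```python
import numpy as np

# Hypothetical sketch: a "face map" built from a single photo is just
# one fixed template vector, so matching reduces to one threshold test.

def cosine_similarity(a, b):
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def matches(template, probe, threshold=0.8):
    # With a single stored template there is a single point of
    # comparison: any probe that clears the threshold is accepted,
    # including one derived from a photograph of a photograph.
    return cosine_similarity(template, probe) >= threshold

template = [0.2, 0.9, 0.1, 0.4]      # embedding from the enrolment photo
probe    = [0.21, 0.88, 0.12, 0.41]  # embedding from a captured image
print(matches(template, probe))  # True: close embeddings pass
```

A 3D map such as Face ID's effectively adds many more dimensions to the template, which is why a flat printed photo fails against it but can succeed against a single-photo scheme.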

A related implementation issue is that face authentication is unlikely to work unless the image is captured from the right angle, matching the angle of the photograph in the UIDAI’s possession.

Creating a simpler password

The Circular suggests the use of ordinary cameras, like those on laptops and mobile phones, to capture and send photographs, in addition to the new software proposed for the authentication devices. Not requiring sophisticated cameras makes this mode of authentication accessible to a larger number of people, but it simultaneously creates the simplest and most unsophisticated form of password: today, it is easier to find a person’s picture online than their phone number. The combination with a second factor of authentication should mitigate that problem, but as described earlier, the second factor is itself not without problems.

The liveness feature

The UIDAI suggests a liveness detection feature, one that detects movement, to overcome these issues and ensure that the person is actually present. This, however, can also be circumvented easily, for instance with a GIF of a person blinking. Researchers have even shown that a 3D model of a face, built from publicly available photographs, can defeat such a feature.
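A movement-based liveness check can be caricatured as looking for frame-to-frame change in the eye region, which is exactly why a looping blink GIF defeats it: the GIF produces the same pixel changes a live blink would. A toy sketch, assuming grayscale frames as NumPy arrays and a known eye region (all of it hypothetical, as the Circular gives no detail):

```python
import numpy as np

# Toy motion-based "liveness" check: declare the subject alive if the
# eye region ever changes between consecutive frames. A blink GIF
# played at the camera produces the same changes as a real blink.

def eye_region_change(frames, region, threshold=10.0):
    ys, xs = region
    diffs = [
        np.abs(frames[i + 1][ys, xs] - frames[i][ys, xs]).mean()
        for i in range(len(frames) - 1)
    ]
    return bool(max(diffs) >= threshold)  # "alive" if the eyes moved

open_eye = np.full((8, 8), 200.0)  # bright pixels: eye open
shut_eye = np.full((8, 8), 50.0)   # dark pixels: eye shut
region = (slice(2, 4), slice(2, 6))

print(eye_region_change([open_eye, shut_eye, open_eye], region))  # True
print(eye_region_change([open_eye, open_eye], region))            # False
```

Nothing in this check distinguishes a live face from a screen replaying one, which is the weakness the article describes.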

What about changing facial features?

Yet another implementation issue is that faces are even more changeable than other forms of biometric data. People’s faces change tremendously with age, and it is hardly likely that people look the same now as they did when they first handed over their photographs to the UIDAI almost ten years ago. To be workable, people will have to update their photographs with the UIDAI regularly, or risk the failure of this mode of authentication as well.

Issues of consent

Along with the many technical and implementation-related problems, some legal issues also arise. Under privacy laws generally, consent to the collection and use of data is essential. In this case, people have, technically, consented to providing their photographs, now classed as ‘biometric information’ under the Aadhaar Act. That consent does not extend to permitting the extraction of ‘core biometric information’, in the form of face data, from those photographs; this would need separate consent. Unfortunately, in the absence of any privacy law in India, or of any provision in the Aadhaar Act protecting against this, there is nothing really to prevent such an extraction of data.

Increasing surveillance capabilities

A last but crucial concern with this announcement is the growing surveillance capability of the Aadhaar ecosystem. The UIDAI has stated time and again that Aadhaar is not surveillance technology. Yet its potential to be used as such is immense, and this announcement now adds face data to the mix.

The Aadhaar Act, no doubt, under Section 29(b), prohibits the use of core biometric data, i.e., the fingerprint and iris scans, for any purpose other than authentication. However, apart from outright illegal uses, it is just a question of amending the law and adding the requisite clause, such as ‘use for national security purposes’.

Looking at surveillance systems in China, the future potential of the use of this data is a huge concern.

Turning to new biometric authentication is not the answer

It makes little sense to turn from one failed form of biometric authentication to another, still more problematic one. One cannot help wondering what will come next: 3D face mapping, or DNA? The fundamental issue here is with the use of biometric data as a form of authentication at all. Biometric data is changeable, inaccurate and, once compromised, irreplaceable.

No matter how securely the CIDR is maintained, keeping this crucial data of an entire nation in a single database is a huge security risk. Adding face data to it only makes it that much more attractive to cybercriminals. An alternative form of authentication is certainly required, but turning to new forms of biometric authentication is not the answer.

The author is a lawyer and author specialising in technology laws. She is also a certified information privacy professional.