Automatic Illumination-Invariant Face Recognition using Active Near-Infra-Red Imaging

1. Ambient Illumination Removal by Active Near-IR Imaging

Problem to address:

Varying direction and energy distribution of the ambient illumination, together with the 3D structure of the human face, can lead to severe differences in the shading and shadows on the face. Such variation in face appearance can be much larger than the variation caused by personal identity.

Previous approach:

  1. Passive approaches:
    1. Model face appearance as a function of illumination: 3D linear subspaces, illumination cones, spherical harmonic images, quotient images, etc.
    2. Photometric normalization: Multiscale Retinex, isotropic/anisotropic smoothing, homomorphic filtering, histogram equalization, etc.
  2. Active approach:
    Apply additional devices to actively capture face image modalities that are insensitive to or independent of illumination change; such modalities include face shape, thermal infrared face images, and near-infrared hyper-spectral images.

Proposed approach:

A novel approach is proposed to obtain ambient-illumination-invariant face images using active Near-IR imaging. Active Near-IR illumination, projected by a Light Emitting Diode (LED) light source attached to the camera, provides a constant illumination. The difference between two face images, captured with the LED light on and off respectively, is an image of the face under the LED illumination alone, and is therefore independent of the ambient illumination.

The use of difference imaging distinguishes the proposed approach from other approaches based on infrared imaging. Simply applying an infrared filter, as those approaches usually do to obtain infrared face images, still admits the infrared component of the ambient illumination, so the captured infrared image is NOT independent of the ambient illumination. The differencing technique in the proposed approach solves this problem.
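
The differencing step itself is simple frame arithmetic. Below is a minimal sketch of the idea, assuming two pixel-aligned grayscale frames captured in quick succession with the LED on and then off; the function and parameter names are illustrative and this is not the authors' implementation.

```python
import numpy as np

def led_only_image(frame_led_on: np.ndarray, frame_led_off: np.ndarray) -> np.ndarray:
    """Recover the face image lit only by the LED source.

    Assumes both frames are grayscale, pixel-aligned and captured close
    enough in time that neither the ambient illumination nor the face
    has changed between them.
    """
    # Work in float so the subtraction cannot wrap around in uint8.
    on = frame_led_on.astype(np.float32)
    off = frame_led_off.astype(np.float32)

    # Ambient light contributes (approximately) equally to both frames,
    # so it cancels; what remains is the LED-illuminated component.
    diff = np.clip(on - off, 0.0, 255.0)
    return diff.astype(np.uint8)
```

In practice the two frames must be taken on consecutive captures, so that head motion and ambient flicker between them are negligible.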



Figure 1. Face capture system.


A face database of 40 subjects is captured indoors using the system shown in Figure 1: 2 sessions separated by an interval of weeks, 4 ambient illumination conditions, and 6 shots per condition, giving 40×2×4×6 = 1920 ambient faces in total and the same number of LED faces. A ring of 4 fluorescent lamps provides ambient illumination from different directions.


Figure 2. Ambient faces, combined illumination faces and LED faces.


Face recognition experiments are carried out on the ambient faces and the LED faces respectively, using different face representations and classifiers. Three test protocols are defined: Cross Session, Cross Illumination, and Combined (across both session and illumination).
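
The three protocols amount to constraints on which (session, illumination) combinations may appear in the training (gallery) and test (probe) sets. The sketch below is a hypothetical illustration of such splits, assuming each image carries subject, session, illumination and shot labels; it is not the exact protocol definition used in the experiments.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FaceImage:
    subject: int        # 1..40
    session: int        # 1 or 2
    illumination: int   # 1..4 ambient illumination conditions
    shot: int           # 1..6

def split(images, protocol, train_session=1, train_illum=1):
    """Partition images into (train, test) lists under one of the three protocols."""
    train, test = [], []
    for img in images:
        in_train_session = img.session == train_session
        in_train_illum = img.illumination == train_illum
        if protocol == "cross_session":
            # Illumination held fixed; train and test come from different sessions.
            if in_train_illum:
                (train if in_train_session else test).append(img)
        elif protocol == "cross_illumination":
            # Session held fixed; train and test come from different illumination conditions.
            if in_train_session:
                (train if in_train_illum else test).append(img)
        elif protocol == "combined":
            # Test images differ from training images in both session and illumination.
            if in_train_session and in_train_illum:
                train.append(img)
            elif not in_train_session and not in_train_illum:
                test.append(img)
        else:
            raise ValueError(f"unknown protocol: {protocol}")
    return train, test
```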

Table 1. Average error rates in the combined tests.

The results in Table 1 show that:

  1. Regardless of the choice of face representations and classifiers, recognition on LED faces achieved much lower error rates than on ambient faces.
  2. The best results on LED faces are achieved in the intensity domain.
  3. The Affinity™ face authentication SDK (from OmniPerception) gave the best performance among all recognition approaches on both ambient faces and LED faces.


2. Automatic Face Localization for Near-IR Illuminated Faces

For automatic face localization in Near-IR face images, a multistage approach is proposed that combines a feature-based face localization method with a global appearance-based method using FloatBoost. The LEDs attached to the camera produce a bright-pupil effect which is very beneficial for automatic face localization. The circular shape of the bright pupil is a scale- and rotation-invariant feature, which the first method exploits to quickly detect pupil candidates. Support Vector Machines (SVMs) trained on local eye-region appearance and global face appearance are then employed to validate the candidates. Sometimes the bright pupils are missing from the face image, or fail to be detected, causing the feature-based method to fail; in this situation the second localization method, based on FloatBoost, is employed as a remedy.
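
A rough sketch of the feature-based first stage is given below: it finds bright, circular pupil candidates with a circular Hough transform and leaves the SVM appearance validation as a stub. All thresholds and radii are illustrative assumptions, and the sketch is a reconstruction of the idea rather than the authors' implementation.

```python
import cv2
import numpy as np

def detect_pupil_candidates(near_ir_gray: np.ndarray):
    """Find bright, roughly circular blobs that could be LED-lit pupils.

    The bright-pupil effect makes the pupils appear as small bright discs,
    so thresholding followed by a circular Hough transform is a natural
    first filter.  All thresholds and radii below are illustrative guesses.
    """
    # Keep only the brightest pixels; the bright pupils stand out strongly.
    _, bright = cv2.threshold(near_ir_gray, 200, 255, cv2.THRESH_BINARY)
    bright = cv2.GaussianBlur(bright, (5, 5), 0)

    # Circles are scale- and rotation-invariant, so a Hough circle search
    # over a small radius range yields candidate pupil centres.
    circles = cv2.HoughCircles(
        bright, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
        param1=100, param2=10, minRadius=2, maxRadius=15)
    if circles is None:
        return []
    return [(int(x), int(y), int(r)) for x, y, r in circles[0]]

def validate_candidates(near_ir_gray, candidates, eye_svm, face_svm):
    """Stub for the appearance checks: an SVM on the local eye region and an
    SVM on the global face patch implied by each candidate pupil pair."""
    ...
```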



Figure 3. Diagram of the proposed multistage localization method.


Figure 4. Near-IR face images


Figure 5. Localization results for the images in Figure 4.

Localization experiments are performed on the LED faces. As shown in Figure 6, taking d_eye = 0.05 as the threshold for a successful localization, the success rate achieved by the bright-pupil detector is 96.5%, which is 6% higher than that of the FloatBoost detector. A further improvement of 1% is achieved by the multistage approach.
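
The d_eye value here is assumed to be the commonly used normalized eye-localization error: the larger of the two eye-centre errors divided by the true inter-eye distance, so d_eye = 0.05 corresponds to an error of 5% of the eye separation. A minimal sketch of the measure (illustrative, not the evaluation code used in the papers):

```python
import numpy as np

def d_eye(pred_left, pred_right, true_left, true_right) -> float:
    """Normalized eye-localization error.

    Each argument is an (x, y) eye-centre coordinate.  The error is the
    larger of the two Euclidean eye-centre errors, divided by the true
    inter-eye distance; a localization is counted as successful when the
    value falls below a chosen threshold (0.05 in the experiments above).
    """
    pred_left, pred_right = np.asarray(pred_left), np.asarray(pred_right)
    true_left, true_right = np.asarray(true_left), np.asarray(true_right)
    err_left = np.linalg.norm(pred_left - true_left)
    err_right = np.linalg.norm(pred_right - true_right)
    inter_eye = np.linalg.norm(true_left - true_right)
    return max(err_left, err_right) / inter_eye
```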



Figure 6. Localization performance curves of the three methods.

Face recognition experiments are conducted on the LED faces registered using the automatic localization results from the multistage approach.



Table 2. Average recognition test error rates on automatically registered faces.


Table 3. Average recognition test error rates on manually/automatically registered faces.

As shown above in Tables 2 and 3:

1) Very low error rates are achieved for all the tests on the automatically localized faces. Nearly all the error rates are below 0.8%, whether trained on manually registered data or automatically registered data, which confirms once again that the proposed multistage approach provides accurate face localization.

2) A practical application scenario is best represented by the Combined test on Auto/Auto faces, i.e. with both training and test images registered automatically, which surprisingly yields an error rate of less than 0.8%. This error rate is even better than the result obtained using manually registered training images, demonstrating the excellent performance of the proposed automatic face recognition system in a practical environment with varying illumination.

Publications

  1. X. Zou, J. Kittler and K. Messer. "Face recognition using active Near-IR illumination". In Proceedings of the British Machine Vision Conference, pp. 209-219, 2005.
  2. X. Zou, J. Kittler and K. Messer. "Ambient illumination variation removal by active Near-IR imaging". In Proceedings of the IAPR International Conference on Biometrics, pp. 19-25, 2006.
  3. X. Zou, J. Kittler and K. Messer. "Accurate face localisation for faces under active Near-IR illumination". In Proceedings of the 7th IEEE International Conference on Automatic Face and Gesture Recognition, pp. 369-374, 2006.
  4. X. Zou, J. Kittler and K. Messer. "Illumination Invariant Face Recognition: A Survey". In Proceedings of the IEEE International Conference on Biometrics: Theory, Applications, and Systems (BTAS 2007), 2007.

For more information, please contact Prof. Josef Kittler.