Fusion iris and periocular recognitions in non-cooperative environment


Bibliographic Details
Main Authors: Mat Raffei, Anis Farihan; Sutikno, Tole; Asmuni, Hishammuddin; Hassan, Rohayanti; Othman, Razib M.; Kasim, Shahreen; Riyadi, Munawar A.
Format: Article
Language: English
Published: Institute of Advanced Engineering and Science, 2019
Subjects:
Online Access:http://umpir.ump.edu.my/id/eprint/26456/
http://umpir.ump.edu.my/id/eprint/26456/2/1147-3809-1-PB.pdf
Description
Summary: The performance of iris recognition in a non-cooperative environment can be negatively affected when the resolution of the iris images is low, which leads to failures in locating the eye centre and the limbic and pupillary boundaries during iris segmentation. A combination with periocular features is therefore suggested to increase the reliability of the recognition system. However, periocular texture features are easily affected by background complications, while periocular colour features remain limited by spatial information and quantisation effects. These problems arise from varying distances between the sensor and the subject during the iris acquisition stage, as well as from differences in image size and orientation. The proposed periocular feature extraction method combines a rotation-invariant uniform local binary pattern, which selects the texture features, with a colour-moment method, which selects the colour features. In addition, the hue-saturation-value colour space is used to avoid loss of discriminative information in the eye image. The proposed method, which combines texture and colour features, achieves the highest periocular recognition accuracy: more than 71.5% on the UBIRIS.v2 dataset and 85.7% on the UBIPr dataset. For the fused iris-periocular recognition, the proposed method again achieves the highest accuracy, with more than 85.9% on the UBIRIS.v2 dataset and 89.7% on the UBIPr dataset.
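The two feature extractors named in the summary, a rotation-invariant uniform local binary pattern for texture and colour moments for colour, can be sketched as below. This is a minimal numpy illustration, not the authors' implementation: it assumes an 8-neighbour, radius-1 LBP neighbourhood and the first three moments (mean, standard deviation, skewness) per HSV channel, since the record does not give the paper's exact parameters.

```python
import numpy as np

def riu2_lbp_histogram(gray, P=8):
    """Rotation-invariant uniform LBP (riu2) histogram over an
    8-neighbour, radius-1 neighbourhood (an assumed configuration)."""
    g = gray.astype(np.float32)
    c = g[1:-1, 1:-1]                       # centre pixels
    # 8 neighbours of each centre pixel, in circular order
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    bits = np.stack(
        [(g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx] >= c)
         for dy, dx in offsets], axis=-1).astype(np.uint8)
    # uniformity = number of 0/1 transitions around the circular pattern
    trans = np.sum(bits != np.roll(bits, 1, axis=-1), axis=-1)
    ones = bits.sum(axis=-1)
    # riu2 code: number of set bits if uniform (<= 2 transitions), else P+1
    codes = np.where(trans <= 2, ones, P + 1).astype(np.int64)
    hist = np.bincount(codes.ravel(), minlength=P + 2).astype(np.float64)
    return hist / hist.sum()                # normalised P+2-bin histogram

def color_moments(hsv):
    """First three colour moments (mean, std, skewness) per HSV channel."""
    feats = []
    for ch in range(3):
        x = hsv[..., ch].astype(np.float64).ravel()
        mu, sigma = x.mean(), x.std()
        skew = np.cbrt(((x - mu) ** 3).mean())  # cube root keeps the sign
        feats.extend([mu, sigma, skew])
    return np.asarray(feats)                    # 9-element colour vector
```

A periocular descriptor would then be the concatenation of the LBP histogram (from the grey-scale eye region) and the colour moments (from the HSV-converted region); how the paper matches or fuses these vectors with the iris score is not specified in this record.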