ANALYSIS AND EVALUATION OF BIOMETRIC AUTHENTICATION MEANS BASED ON THE IMAGE OF THE FACE AND IRIS OF THE STAFF OF CRITICAL INFRASTRUCTURE FACILITIES

Authors

Korchenko, O., Tereikovskyi, O.
DOI:

https://doi.org/10.28925/2663-4023.2023.21.136148

Keywords:

information security; security of critical infrastructure objects; biometric authentication; identity recognition; emotion recognition; face image; iris of the eye; emotion; performance criteria; personnel of critical infrastructure objects

Abstract

The article is devoted to the analysis and evaluation of biometric authentication systems for personnel of critical infrastructure facilities. It is shown that tools based on face and iris images have broad prospects, owing to proven solutions in the field of face-image analysis and to the availability and wide distribution of video-recording equipment that can capture the iris together with the face at satisfactory quality. It is determined that one way to improve the effectiveness of such tools is to increase the accuracy of face recognition and its robustness to obstacles that cover part of the face. A further direction for improving the effectiveness of biometrics is recognition of the current psycho-emotional state of critical infrastructure personnel. The need to evaluate the effectiveness of identity- and emotion-recognition tools based on face and iris images is established. Based on an analysis of the literature, two groups of efficiency criteria were formed: basic and additional. The basic group includes criteria characterizing the effectiveness of the recognition process itself, while the additional group includes criteria reflecting the technical implementation features and service capabilities of recognition tools. Modern identity- and emotion-recognition tools based on face and iris images were evaluated, and their non-compliance with a number of these criteria was determined. It is proposed to direct further research toward satisfying the criteria concerned with recognizing a person and their emotions from a partial face image, facial expressions, and a partial iris image, as well as toward the technical implementation of expert solutions. The feasibility of accomplishing this task with modern neural network technologies is shown.
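The basic group of criteria for recognition effectiveness conventionally includes error rates such as the false accept rate (FAR) and false reject rate (FRR). As a minimal illustrative sketch (not taken from the article, with made-up score values), such rates can be computed from matcher similarity scores at a given acceptance threshold:

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """Compute FAR and FRR at a given similarity threshold.

    Scores are similarity values in [0, 1]; a comparison is
    accepted when its score is >= threshold.
    """
    # Impostor comparisons wrongly accepted -> false accepts.
    false_accepts = sum(s >= threshold for s in impostor_scores)
    # Genuine comparisons wrongly rejected -> false rejects.
    false_rejects = sum(s < threshold for s in genuine_scores)
    far = false_accepts / len(impostor_scores)
    frr = false_rejects / len(genuine_scores)
    return far, frr

# Hypothetical scores from a face/iris matcher.
genuine = [0.91, 0.85, 0.78, 0.66, 0.95]
impostor = [0.30, 0.42, 0.55, 0.71, 0.25]
print(far_frr(genuine, impostor, threshold=0.6))  # -> (0.2, 0.0)
```

Raising the threshold lowers FAR at the cost of a higher FRR; the operating point where the two curves cross (the equal error rate) is a common single-number summary when comparing recognition tools.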


References

Vysotska, O. O., Davydenko, A. M., Khrystevych, V. (2022). Vydilennia oblychchia liudyny u videopototsi dlia kontroliu za dotrymanniam spivrobitnykamy stanu bezpeky v protsesi roboty ta navchannia. Zakhyst informatsii, 24(2), 94–107. https://doi.org/10.18372/2410-7840.24.16934

Mykhailenko, V. M., Tereikovskaia, L. A. (2019). Ohliad zasobiv rozpiznavannia emotsiinoho stanu liudyny za heometriieiu oblychchia. Upravlinnia rozvytkom skladnykh system, 37, 178–184. https://doi.org/10.6084/m9.figshare.9783236

Nazarkevych, M., Voznyi, Ya., Nazarkevych, H. (2021). Rozroblennia metodu mashynnoho navchannia pry biometrychnomu zakhysti iz novymy metodamy filtratsii. Kiberbezpeka: osvita, nauka, tekhnika, 3(11), 16–30. https://doi.org/10.28925/2663-4023.2021.11.1630

Tereikovska, L. O. (2023). Metodolohiia avtomatyzovanoho rozpiznavannia emotsiinoho stanu slukhachiv systemy dystantsiinoho navchannia: dys. dokt. tekhn. nauk.

Ma, X., Fu, M., Zhang, X., Song, X., Becker, B., Wu, R., Xu, X., Gao, Z., Kendrick, K., & Zhao, W. (2022). Own Race Eye-Gaze Bias for All Emotional Faces but Accuracy Bias Only for Sad Expressions. Frontiers in Neuroscience, 16. https://doi.org/10.3389/fnins.2022.852484

Noyes, E., Davis, J. P., Petrov, N., Gray, K. L. H., & Ritchie, K. L. (2021). The effect of face masks and sunglasses on identity and expression recognition with super-recognizers and typical observers. Royal Society Open Science, 8(3). https://doi.org/10.1098/rsos.201169

Ranjith, G., Pallavi, K., & Mahendra, V. (2022). Human Face, Eye and Iris Detection in Real-Time Using Image Processing. In Algorithms for Intelligent Systems (pp. 383–389). Springer Nature Singapore. https://doi.org/10.1007/978-981-19-1669-4_34

Rinck, M., Primbs, M. A., Verpaalen, I. A. M., & Bijlstra, G. (2022). Face masks impair facial emotion recognition and induce specific emotion confusions. Cognitive Research: Principles and Implications, 7(1). https://doi.org/10.1186/s41235-022-00430-5.

Royer, J., Blais, C., Charbonneau, I., Déry, K., Tardif, J., Duchaine, B., Gosselin, F., & Fiset, D. (2018). Greater reliance on the eye region predicts better face recognition ability. Cognition, 181, 12–20. https://doi.org/10.1016/j.cognition.2018.08.004.

Tereikovskyi, I., Korchenko, O., Bushuyev, S., Tereikovskyi, O., Ziubina, R., Veselska, O. (2023). A Neural Network Model for Object Mask Detection in Medical Images. International Journal of Electronics and Telecommunications, 69(1), 41–46. https://doi.org/10.24425/ijet.2023.144329

Toliupa, S., Tereikovska, L., Tereikovskyi, I., Doszhanova, A., Alimseitova, Z. (2020). Procedure for Adapting a Neural Network to Eye Iris Recognition. IEEE International Conference on Problems of Infocommunications, Science and Technology, 167–171. https://doi.org/10.1109/PICST51311.2020.9468020

Vinette, C., Gosselin, F., Schyns, P. (2004). Spatio-temporal dynamics of face recognition in a flash: it’s in the eyes. Cognitive Science, 28(2), 289–301. https://doi.org/10.1016/j.cogsci.2004.01.002

ViswanathReddy, D. A., Aswini Reddy, A., & Bindyashree, C. A. (2021). Facial Emotions over Static Facial Images Using Deep Learning Techniques with Hysterical Interpretation. Journal of Physics: Conference Series, 2089(1), 012014. https://doi.org/10.1088/1742-6596/2089/1/012014.


Published

2023-09-28

How to Cite

Korchenko, O., & Tereikovskyi, O. (2023). ANALYSIS AND EVALUATION OF BIOMETRIC AUTHENTICATION MEANS BASED ON THE IMAGE OF THE FACE AND IRIS OF THE STAFF OF CRITICAL INFRASTRUCTURE FACILITIES. Electronic Professional Scientific Journal «Cybersecurity: Education, Science, Technique», 1(21), 136–148. https://doi.org/10.28925/2663-4023.2023.21.136148