Extracting and analysing micro-expressions in a neural network lie detection task
https://doi.org/10.37661/1816-0301-2025-22-3-35-44
Abstract
O b j e c t i v e s. The objectives of the study are to collect data, to develop an algorithm for the automatic extraction of microexpressions from video recordings, and to formulate rules for combinations of action units from which basic human emotions are determined.
M e t h o d s. Human facial microexpressions are brief, involuntary reactions that may appear when a person attempts to hide their true emotions. Microexpressions play a key role in lie detection and are an important indicator that truthful information is being concealed. In this article, facial expressions were analysed using action units of the Facial Action Coding System (FACS), extracted with the py-feat library.
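As an illustration only, the following minimal sketch (not taken from the article) shows how per-frame action unit intensities can be obtained with py-feat's Detector; the file name interview.mp4 is a placeholder, and the exact set of AU columns depends on the AU model the Detector loads.

    # Minimal sketch: per-frame Action Unit intensities with py-feat.
    # "interview.mp4" is a placeholder file name, not from the article.
    from feat import Detector

    detector = Detector()                         # default face/landmark/AU models
    fex = detector.detect_video("interview.mp4")  # Fex dataframe, one row per frame

    # AU columns are named "AU01", "AU02", ... in py-feat's output.
    au_columns = [c for c in fex.columns if c.startswith("AU")]
    print(fex[au_columns].head())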
R e s u l t s. A dataset of video recordings of a specific group of people was collected. Rules based on combinations of action units and their intensities were developed to determine basic emotions, and an algorithm for detecting and extracting microexpressions from video recordings was formulated. Testing the algorithm showed a negative correlation between the emotion of joy and the manifestation of lying.
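The article's own rules and thresholds are not reproduced here; as a hedged sketch, the fragment below tags joy as the co-occurrence of the cheek raiser (AU06) and the lip corner puller (AU12), a standard FACS pairing, and treats an activation lasting no more than about half a second as a candidate microexpression. The 0.5 intensity threshold, the 0.5 s duration cap and the column names are illustrative assumptions.

    # Hedged sketch of rule-based emotion tagging and microexpression
    # extraction; the thresholds and the AU pairing are assumptions,
    # not the rules reported in the article.
    import pandas as pd

    def joy_mask(fex: pd.DataFrame, threshold: float = 0.5) -> pd.Series:
        # Frames where AU06 (cheek raiser) and AU12 (lip corner puller) co-occur.
        return (fex["AU06"] > threshold) & (fex["AU12"] > threshold)

    def micro_episodes(mask: pd.Series, fps: float, max_sec: float = 0.5):
        # Return (start_frame, end_frame) runs of active frames no longer than max_sec.
        episodes, start = [], None
        for i, active in enumerate(list(mask) + [False]):  # sentinel closes the last run
            if active and start is None:
                start = i
            elif not active and start is not None:
                if (i - start) / fps <= max_sec:
                    episodes.append((start, i - 1))
                start = None
        return episodes

Applied to the Fex dataframe from the previous sketch, micro_episodes(joy_mask(fex), fps=30.0) would list the short joy episodes that qualify as candidate microexpressions.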
C o n c l u s i o n. The results obtained make it possible to expand the information base for neural network lie detection from video sequences of facial images by detecting and analysing the microexpressions they contain.
About the Authors
K. A. Kotova
Belarus
Ksenia A. Kotova - Undergraduate Student of the Faculty of Radiophysics and Computer Technologies, Belarusian State University.
Nezavisimosti av., 4, Minsk, 220030
V. S. Sadov
Belarus
Vasiliy S. Sadov - Ph. D. (Eng.), Assoc. Prof., Prof. of the Department of Intelligent Systems, Faculty of Radiophysics and Computer Technologies, Belarusian State University.
Nezavisimosti av., 4, Minsk, 220030
References
1. Ekman P. Telling Lies: Clues to Deceit in the Marketplace, Politics, and Marriage. New York, W. W. Norton & Company, 1985, 368 p.
2. Ekman P. Unmasking the Face: A Guide to Recognizing Emotions From Facial Expressions. Los Altos, CA, Malor Books, 2003, 212 p.
3. Ekman P. Emotions Revealed: Understanding Faces and Feelings. London, Phoenix, 2004, 336 p.
4. Ekman P., Rosenberg E. L. What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System (FACS). New York, Oxford University Press, 2005, 662 p.
5. Facial Action Coding System (FACS). United Kingdom, 2025. Available at: https://www.eiagroup.com/resources/facial-expressions/facs-explained/ (accessed 01.08.2025).
6. Py-Feat: Python Facial Expression Analysis Toolbox. Germany, 2022–2025. Available at: https://py-feat.org/pages/intro.html (accessed 08.08.2025).
7. Cheong J. H., Xie T., Byrne S., Chang L. J. Py-Feat: Python Facial Expression Analysis Toolbox. Dartmouth College, 2021. Available at: https://www.researchgate.net/publication/350749851_Py-Feat_Python_Facial_Expression_Analysis_Toolbox (accessed 01.08.2025).
8. Kotova K. A. Formation of a dataset for training the random forest method in tasks of lie detection based on facial expressions. Intellektual'nye, sensornye i mehatronnye sistemy-2025 : sbornik nauchnyh trudov (po materialam studencheskih nauchno-tehnicheskih konferencij) [Intelligent, Sensory and Mechatronic Systems-2025: Collection of Scientific Papers (Based on Materials from Student Scientific and Technical Conferences)]. Ed. board: A. V. Staselovich, E. A. Bogdanova ; compiled by S. A. Rybchak, E. A. Bogdanova, P. S. Kolesnikov. Minsk, Belorusskij nacional'nyj tehnicheskij universitet, 2025, pp. 70–74 (In Russ.).
9. Davison A. K., Merghani W., Yap M. H. Objective classes for micro-facial expression recognition. Journal of Imaging, 2018, vol. 4, no. 10, art. 119. Available at: https://www.mdpi.com/2313-433X/4/10/119 (accessed 01.08.2025).
For citations:
Kotova K.A., Sadov V.S. Extracting and analysing micro-expressions in a neural network lie detection task. Informatics. 2025;22(3):35-44. (In Russ.) https://doi.org/10.37661/1816-0301-2025-22-3-35-44