RESEARCH AND SOFTWARE IMPLEMENTATION OF HAND GESTURE RECOGNITION METHODS

2025, pp. 156-165
Received: August 08, 2025
Revised: August 18, 2025
Accepted: September 15, 2025
1-4 Lviv Polytechnic National University, Department of Computer Design Systems, Lviv, Ukraine

The article presents the development of an interactive system for recognizing and classifying human hand gestures based on machine learning technologies. A new approach to gesture representation is proposed that combines spatial and temporal characteristics of hand key-point positions, providing high accuracy, noise resistance, and adaptability of the system to various conditions of use. A distinctive feature of the development is an interactive training method that allows users without specialized technical knowledge to quickly add new gestures, even with a limited amount of training data. A series of experiments comparing different machine learning strategies was conducted, which made it possible to identify the optimal models and confirm the effectiveness of the proposed approach. The system can be easily integrated into modern human-computer interaction interfaces and applied in virtual and augmented reality, medical technologies, educational systems, and contactless control services. The results obtained open up opportunities for further improvement of the system, in particular with respect to performance, scalability, and personalization to user needs.
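The following is a minimal sketch of such a spatio-temporal representation, not the authors' implementation. It assumes 21 hand key points per frame (as produced, for example, by MediaPipe Hands), builds a feature vector from wrist-relative spatial coordinates plus frame-to-frame displacements over a short window, and feeds it to a scikit-learn nearest-neighbour classifier, chosen here only because it can be retrained instantly from a handful of examples, which mirrors the interactive-training idea.

# Sketch of a spatio-temporal gesture representation (assumptions noted above).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

N_LANDMARKS = 21      # assumed key points per hand (MediaPipe Hands layout)
WINDOW = 16           # assumed number of frames per gesture sample

def spatial_features(frame_landmarks: np.ndarray) -> np.ndarray:
    """Wrist-relative, scale-normalized (x, y, z) coordinates of one frame."""
    rel = frame_landmarks - frame_landmarks[0]          # translate to the wrist
    scale = np.linalg.norm(rel, axis=1).max() or 1.0    # normalize for hand size
    return (rel / scale).ravel()

def temporal_features(window: np.ndarray) -> np.ndarray:
    """Frame-to-frame landmark displacements across the window."""
    return np.diff(window, axis=0).ravel()

def gesture_vector(window: np.ndarray) -> np.ndarray:
    """Concatenate per-frame spatial features with temporal deltas."""
    spatial = np.concatenate([spatial_features(f) for f in window])
    return np.concatenate([spatial, temporal_features(window)])

# Hypothetical usage with synthetic arrays standing in for recorded gestures.
rng = np.random.default_rng(0)
samples = rng.normal(size=(10, WINDOW, N_LANDMARKS, 3))   # 10 recordings
labels = ["wave"] * 5 + ["fist"] * 5
X = np.stack([gesture_vector(s) for s in samples])

# 1-NN: a new gesture class can be added from a single recorded example.
clf = KNeighborsClassifier(n_neighbors=1).fit(X, labels)
print(clf.predict(gesture_vector(samples[0])[None, :]))   # -> ['wave']

In a real pipeline the synthetic arrays would be replaced by landmark sequences captured from a camera, and the nearest-neighbour classifier could be swapped for any of the models compared in the article.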
