Face-Based Engagement Detection Methods: A Review

Received: March 28, 2025
Revised: November 07, 2025
Accepted: December 05, 2025

Qarbal I., Sael N., Ouahabi S. Face-Based Engagement Detection Methods: A Review.  Mathematical Modeling and Computing. Vol. 13, No. 1, pp. 41–51 (2026)

1,2,3 Faculty of Sciences, Hassan II University of Casablanca, Casablanca, Morocco

The detection of student engagement in online learning environments has become increasingly important with the widespread adoption of e-learning platforms. This paper reviews current approaches for monitoring student engagement based on facial expressions, gaze tracking, fatigue and drowsiness detection, and multimodal systems. By analyzing facial expressions, systems can detect emotional states such as happiness, frustration, and boredom, offering real-time feedback to instructors. Gaze tracking provides insight into students' focus, although challenges such as hardware costs and lighting conditions affect its accuracy. Fatigue and drowsiness detection, through analysis of blinking and yawning, helps identify cognitive overload, while multimodal systems that combine facial, behavioral, and physiological data offer a more comprehensive picture of engagement. This review highlights the potential of these methods while addressing the need for more robust, scalable, and privacy-conscious systems for real-time engagement monitoring in diverse e-learning contexts.
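
To make the blink-based fatigue cue mentioned above concrete, the sketch below computes the eye aspect ratio (EAR) from six eye landmarks and flags prolonged eye closure. It is a minimal illustration, not a method taken from any of the reviewed papers: landmark extraction (e.g., with dlib or MediaPipe) is assumed to happen upstream, and the 0.2 threshold and 15-frame window are illustrative values rather than reported parameters.

import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    # eye: (6, 2) array of landmarks ordered p1..p6 (outer corner, two
    # upper-lid points, inner corner, two lower-lid points).
    # EAR = (|p2 - p6| + |p3 - p5|) / (2 * |p1 - p4|); it drops toward 0
    # when the eye closes and stays roughly constant while it is open.
    vertical = np.linalg.norm(eye[1] - eye[5]) + np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)

def flag_drowsiness(ear_per_frame, closed_threshold=0.2, min_closed_frames=15):
    # Flags drowsiness when the EAR stays below the threshold for a run of
    # consecutive frames, a common proxy for prolonged eye closure.
    run = 0
    for ear in ear_per_frame:
        run = run + 1 if ear < closed_threshold else 0
        if run >= min_closed_frames:
            return True
    return False

if __name__ == "__main__":
    # Synthetic check: an open eye followed by a sustained closure.
    open_eye = np.array([[0, 0], [2, 1.2], [4, 1.2], [6, 0], [4, -1.2], [2, -1.2]], float)
    closed_eye = np.array([[0, 0], [2, 0.2], [4, 0.2], [6, 0], [4, -0.2], [2, -0.2]], float)
    ears = [eye_aspect_ratio(open_eye)] * 30 + [eye_aspect_ratio(closed_eye)] * 20
    print(flag_drowsiness(ears))  # True: 20 consecutive low-EAR frames

In a real pipeline the per-frame EAR would be averaged over both eyes and smoothed before thresholding; yawning analysis follows the same pattern with a mouth aspect ratio.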

  1. Rothkrantz L.  Dropout Rates of Regular Courses and MOOCs.  Computers Supported Education.  25–46 (2017).
  2. Mishra L., Gupta T., Shree A.  Online teaching-learning in higher education during lockdown period of COVID-19 pandemic.  International Journal of Educational Research Open.  1, 100012 (2020).
  3. Villaroya S. M., Gamboa-Montero J. J., Bernardino A., Maroto-Gomez M., Castillo J. C., Salichs M. A.  Real-time Engagement Detection from Facial Features.  2022 IEEE International Conference on Development and Learning (ICDL).  231–237 (2022).
  4. Dewan M. A. A., Murshed M., Lin F.  Engagement detection in online learning: a review.  Smart Learn. Environ.  6 (1), 1 (2019).
  5. Pal S., Mukhopadhyay S., Suryadevara N.  Development and Progress in Sensors and Technologies for Human Emotion Recognition.  Sensors.  21 (16), 5554 (2021).
  6. Bhardwaj P., Gupta P. K., Panwar H., Siddiqui M. K., Morales-Menendez R., Bhaik A.  Application of Deep Learning on Student Engagement in e-learning environments.  Computers & Electrical Engineering.  93, 107277 (2021).
  7. Meriem B., Benlahmar H., Naji M. A., Sanaa E., Wijdane K.  Determine the Level of Concentration of Students in Real-Time from their Facial Expressions.  International Journal of Advanced Computer Science and Applications (IJACSA).  13 (1), 159–166 (2022).
  8. Gupta S., Kumar P., Tekchandani R.  EDFA: Ensemble deep CNN for assessing student's cognitive state in adaptive online learning environments.  International Journal of Cognitive Computing in Engineering.  4, 373–387 (2023).
  9. Lasri I., Solh A. R., Belkacemi M. E.  Facial Emotion Recognition of Students using Convolutional Neural Network.  2019 Third International Conference on Intelligent Computing in Data Sciences (ICDS).  1–6 (2019).
  10. Hsia C.-H., Chiang B., Ke L.-Y., Ciou Z.-Y., Lai C.-F.  Student Engagement Analysis Using Facial Expression in Online Course.  2022 IET International Conference on Engineering Technologies and Applications (IET-ICETA).  1–2 (2022).
  11. Zhao H., Kim B.-G., Slowik A., Pan D.  Temporal–spatial correlation and graph attention-guided network for micro-expression recognition in English learning livestreams.  Discover Computing.  27 (1), 47 (2024).
  12. Ayvaz U., Gürüler H., Devrim M. O.  Use of Facial Emotion Recognition in e-Learning Systems.  Information Technologies and Learning Tool.  60 (4), 95–104 (2017).
  13. Albarrak K. M., Sorour S. E.  Web-Enhanced Vision Transformers and Deep Learning for Accurate Event-Centric Management Categorization in Education Institu-tions.  Systems.  12 (11), 475 (2024).
  14. Abbas R., Ni B., Ma R., Li T., Lu Y., Li X.  Context-based emotion recognition: A survey.  Neurocomputing.  618, 129073 (2025).
  15. Jamil N., Belkacem A. N., Lakas A.  On enhancing students' cognitive abilities in online learning using brain activity and eye movements.  Education and Information Technologies.  28 (4), 4363–4397 (2023).
  16. Deng R., Gao Y.  A review of eye tracking research on video-based learning.  Education and Information Technologies.  28 (6), 7671–7702 (2023).
  17. Burch M., Haymoz R., Lindau S. The Benefits and Drawbacks of Eye Tracking for Improving Educational Systems.  ETRA '22: 2022 Symposium on Eye Tracking Research and Applications.  53, 1–5 (2022).
  18. Lin Y., Zhang P., Song G.  Study on Learning Status Information in Education Based on Line-of-Sight Tracking in Artificial Intelligence Environment.  2021 (8), (2021).
  19. Haataja E., Salonen V., Laine A., Toivanen M., Hannula M. S.  The Relation Between Teacher–Student Eye Contact and Teachers' Interpersonal Behavior During Group Work: a Multiple–Person Gaze–Tracking Case Study in Secondary Mathematics Education.  Educational Psychology Review.  33 (1), 51–67 (2021).
  20. Li X., Lin J., Tian Z., Lin Y.  An Explainable Student Fatigue Monitoring Mod-ule with Joint Facial Representation.  Sensors.  23 (7), 3602 (2023).
  21. Dedhia R., Dand M., Akhave A., Sonawane B., Scholar U.  Drowsiness Detection in E-Learning.  Journal of Emerging Technologies and Innovative Research.  6 (4), (2019).
  22. Lahoti U., Joshi R., Vyas N., Deshpande K., Jain S.  Drowsiness Detection System for Online Courses.  International Journal of Advanced Trends in Computer Science and Engineering.  9 (2), 1930–1934 (2020).
  23. Abdulkader R., Tayseer Mohammad Ayasrah F., Nallagattla V. R. G., Kant Hiran K., Dadheech P., Balasubramaniam V., Sengan S.  Optimizing student engagement in edge-based online learning with advanced analytics.  Array.  19, 100301 (2023).
  24. Safarov F., Akhmedov F., Abdusalomov A. B., Nasimov R., Cho Y. I.  Real-Time Deep Learning-Based Drowsiness Detection: Leveraging Computer-Vision and Eye-Blink Analyses for Enhanced Road Safety.  Sensors.  23 (14), 6459 (2023).
  25. Li D., Cui Z., Cao F., Cui G., Shen J., Zhang Y.  Learning State Assessment in Online Education Based on Multiple Facial Features Detection.  Computational Intelligence and Neuroscience.  2022, 3986470 (2022).
  26. Uçar M. U., Özdemir E.  Recognizing Students and Detecting Student Engagement with Real-Time Image Processing.  Electronics.  11 (9), 1500 (2022).
  27. Gupta S., Kumar P., Tekchandani R.  A multimodal facial cues based engagement detection system in e-learning context using deep learning approach.  Multimedia Tools and Applications.  82 (18), 28589–28615 (2023).
  28. Kawamura R., Shirai S., Takemura N., Alizadeh M., Cukurova M., Takemura H.  Detecting Drowsy Learners at the Wheel of e-Learning Platforms With Multimodal Learning Analytics.  IEEE Access.  9, 115165–115174 (2021).
  29. Watanabe K., Sathyanarayana T., Dengel A., Ishimaru S.  EnGauge: Engagement Gauge of Meeting Participants Estimated by Facial Expression and Deep Neural Network.  IEEE Access.  11, 52886–52898 (2023).
  30. Sassi A., Chérif S., Jaafar W.  Intelligent Framework for Monitoring Student Emotions During Online Learning.  Engineering Applications of Neural Networks.  207–219 (2024).
  31. Abdellaoui B., Remaida A., Sabri Z., Abdellaoui M., El Hafidy A., El Bouzekri El Idrissi Y., Moumen A.  Analyzing emotions in online classes: Unveiling insights through topic modeling, statistical analysis, and random walk techniques.  International Journal of Cognitive Computing in Engineering.  5, 221–236 (2024).