Development of a deep learning-based system in Python 3.9 with YOLOv5: A case study on real-time fish counting based on classification

Mathematical Modeling and Computing. Vol. 12, No. 2, pp. 682–692 (2025)
https://doi.org/10.23939/mmc2025.02.682
Received: April 04, 2025
Revised: June 24, 2025
Accepted: June 26, 2025

Multajam R., Ayob A. F. M., Jamaludin Sh., Mada Sanjaya W. S., Sambas A., Rusyn V., Senyk Y. A.  Development of a deep learning-based system in Python 3.9 with YOLOv5: A case study on real-time fish counting based on classification.  Mathematical Modeling and Computing. Vol. 12, No. 2, pp. 682–692 (2025)

1 Universiti Malaysia Terengganu
2 Universiti Malaysia Terengganu
3 Universiti Malaysia Terengganu
4 Universitas Islam Negeri Sunan Gunung Djati
5 Universiti Sultan Zainal Abidin; Universitas Muhammadiyah Tasikmalaya
6 Yuriy Fedkovych Chernivtsi National University
7 National Forestry University of Ukraine

This study developed a real-time classification and counting system for six fish species using the YOLOv5 deep learning model.  The model achieved an F1-score of 0.87, and its precision-confidence curve reached a value of 1.00 across all classes at a confidence threshold of 0.920, demonstrating reliable object detection and classification.  In real-time tests the system operated quickly and accurately under varied environmental conditions, with an average inference speed of 30 FPS.  Several challenges remain, however, such as sensitivity to low-light conditions.  Overall, the system shows significant potential for aquaculture applications, particularly automated, real-time fish monitoring.  Because the trained model can be exported to the ONNX format, it can also be integrated into IoT-based devices and cross-platform applications, providing a solid foundation for further advances in computer vision-based fish monitoring technology.
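As a rough illustration of the pipeline described above, the sketch below loads a custom-trained YOLOv5 model through PyTorch Hub, runs it on a live camera stream, and counts detections per class in each frame.  The weights file best.pt, the camera index, the confidence threshold, and the display logic are assumptions for illustration only, not the authors' actual implementation.

```python
import cv2
import torch

# Minimal sketch of a real-time counting loop, assuming a custom-trained
# YOLOv5 model saved as "best.pt" (hypothetical path) and a camera at index 0.
model = torch.hub.load("ultralytics/yolov5", "custom", path="best.pt")
model.conf = 0.5  # detection confidence threshold (assumed value)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # YOLOv5 hub models expect RGB images; OpenCV delivers BGR frames.
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    results = model(rgb)

    # Count detections per class (species) in the current frame.
    detections = results.pandas().xyxy[0]
    counts = detections["name"].value_counts().to_dict()

    # Draw the boxes, overlay per-class counts, and display the frame.
    annotated = cv2.cvtColor(results.render()[0], cv2.COLOR_RGB2BGR)
    for i, (name, n) in enumerate(counts.items()):
        cv2.putText(annotated, f"{name}: {n}", (10, 30 + 30 * i),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("Fish counting", annotated)

    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

For the ONNX deployment route mentioned above, the yolov5 repository provides an export script (roughly `python export.py --weights best.pt --include onnx`); the exported model can then be run with ONNX Runtime on IoT or cross-platform targets.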
