Performance evaluation of a novel Conjugate Gradient Method for training feed forward neural network

Mathematical Modeling and Computing, Vol. 10, No. 2, pp. 326–337 (2023)
https://doi.org/10.23939/mmc2023.02.326
Received: June 22, 2022; Revised: January 28, 2023; Accepted: February 05, 2023

1 Department of Mathematical Science, Faculty of Computing and Mathematics, Kano University of Science and Technology
2 School of Quantitative Sciences, Universiti Utara Malaysia; Institute of Strategic Industrial Decision Modelling (ISIDM), SQS, Universiti Utara Malaysia
3 Department of Mathematical Science, Faculty of Computing and Mathematics, Kano University of Science and Technology
4 School of Dental Sciences, Universiti Sains Malaysia
5 Faculty of Informatics and Computing, Universiti Sultan Zainal Abidin

In this paper, we construct a new conjugate gradient method for solving unconstrained optimization problems.  The proposed method satisfies the sufficient descent property irrespective of the line search, and its global convergence is established under some suitable conditions.  Further, the new method was used to train a feed forward neural network on several data sets.  The results obtained show that the proposed algorithm significantly reduces computational time by speeding up the directional minimization, giving a faster convergence rate.
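The general iteration underlying methods of this kind can be sketched as follows. This is a minimal, generic nonlinear conjugate gradient loop using the classical Polak–Ribière–Polyak (PRP) coefficient with a backtracking Armijo line search; it is an illustrative sketch only, not the paper's proposed coefficient, and the function `cg_minimize` and its safeguards are our own assumptions for this example.

```python
import numpy as np

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear CG with a PRP coefficient and Armijo backtracking.
    Illustrative sketch only -- not the specific method proposed in the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search enforcing the Armijo condition
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while alpha > 1e-12 and f(x + alpha * d) > fx + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP coefficient; clipping at zero restarts with steepest descent
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:  # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x
```

Applied to a simple convex quadratic, e.g. f(x) = (x1 − 3)² + 10(x2 + 1)², the iteration converges to the minimizer (3, −1); in neural network training, f would be the network's loss as a function of its weights.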
