A new modified conjugate gradient method under the strong Wolfe line search for solving unconstrained optimization problems

Mathematical Modeling and Computing, Vol. 9, No. 1, pp. 111–118 (2022)
https://doi.org/10.23939/mmc2022.01.111
Received: July 07, 2021; Revised: December 19, 2021; Accepted: December 20, 2021

Department of Mathematics, Universiti Putra Malaysia

The conjugate gradient (CG) method is well known for its efficiency in solving unconstrained optimization problems, owing to its convergence properties and low computational cost.  The method continues to be developed to compete with existing methods in terms of efficiency.  In this paper, a modification of the CG method is proposed under the strong Wolfe line search.  A new CG coefficient is presented, built by reusing parts of existing CG methods so as to retain their advantages.  The proposed method is shown to satisfy the sufficient descent condition and to be globally convergent under the inexact line search.  Numerical testing provides strong evidence that the proposed method outperforms the other methods on unconstrained optimization problems under the inexact line search, specifically the strong Wolfe–Powell line search.
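To make the setting concrete, the following is a minimal sketch of a nonlinear CG iteration under the strong Wolfe line search, where the step size alpha must satisfy f(x_k + alpha d_k) <= f(x_k) + c1 * alpha * g_k^T d_k and |g(x_k + alpha d_k)^T d_k| <= c2 * |g_k^T d_k| with 0 < c1 < c2 < 1.  The paper's new CG coefficient appears only in the full text and is not reproduced here; the classical Fletcher–Reeves coefficient beta_k = ||g_{k+1}||^2 / ||g_k||^2 stands in for it, and the function name cg_strong_wolfe and the parameter choices below are illustrative assumptions, not the authors' implementation.

import numpy as np
from scipy.optimize import line_search

def cg_strong_wolfe(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear CG under the strong Wolfe line search (sketch)."""
    x = x0.astype(float)
    g = grad(x)
    d = -g                                 # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # SciPy's line_search enforces the strong Wolfe conditions.
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)
        if alpha is None:                  # line search failed: restart along -g
            d, alpha = -g, 1e-4
        x = x + alpha * d
        g_new = grad(x)
        # Fletcher-Reeves stand-in for the paper's new coefficient.
        beta = (g_new @ g_new) / (g @ g)
        d = -g_new + beta * d
        g = g_new
    return x

# Example: minimize the Rosenbrock function from the standard start (-1.2, 1).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(cg_strong_wolfe(f, grad, np.array([-1.2, 1.0])))

The choice c2 = 0.1, rather than the quasi-Newton default 0.9, follows the usual recommendation for CG methods; for Fletcher–Reeves in particular, c2 < 1/2 guarantees descent directions under the strong Wolfe conditions.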
