A new hybrid conjugate gradient method as a convex combination of methods
DOI: https://doi.org/10.31926/but.mif.2025.5.67.2.14

Keywords: hybrid conjugate gradient method, sufficient descent direction, global convergence, numerical comparisons

Abstract
The conjugate gradient (CG) method is a widely employed algorithm for solving large-scale unconstrained optimization problems due to its fast convergence and low memory requirements. In this paper, we propose a new hybrid nonlinear conjugate gradient method in which the conjugate gradient coefficient β_k is a convex combination of β_k^NPRP and β_k^DY. The parameter θ_k is computed so that the conjugacy condition is satisfied. Under the strong Wolfe line search, the descent property and global convergence of the new hybrid method are proved. Numerical results show that the method is robust and efficient.
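For orientation, a generic hybrid CG iteration of this kind can be sketched as below. This is only an illustrative template: the exact formula for β_k^NPRP and the rule for θ_k are those defined in the paper and are not reproduced here, and the weighting convention (which term carries θ_k) is an assumption; only the standard DY coefficient and the strong Wolfe conditions are written out.

\begin{align*}
  x_{k+1} &= x_k + \alpha_k d_k, \\
  d_{k+1} &= -g_{k+1} + \beta_k^{\mathrm{hyb}} d_k, \qquad d_0 = -g_0, \\
  \beta_k^{\mathrm{hyb}} &= (1-\theta_k)\,\beta_k^{NPRP} + \theta_k\,\beta_k^{DY}, \qquad \theta_k \in [0,1], \\
  \beta_k^{DY} &= \frac{\|g_{k+1}\|^2}{d_k^{\top} y_k}, \qquad y_k = g_{k+1} - g_k,
\end{align*}
where the step size $\alpha_k$ satisfies the strong Wolfe conditions
\begin{align*}
  f(x_k + \alpha_k d_k) &\le f(x_k) + \delta\,\alpha_k\, g_k^{\top} d_k, \\
  \lvert g(x_k + \alpha_k d_k)^{\top} d_k \rvert &\le \sigma\,\lvert g_k^{\top} d_k \rvert, \qquad 0 < \delta < \sigma < 1.
\end{align*}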