A descent family of hybrid conjugate gradient methods with global convergence property for nonconvex functions

Document Type: Research Article

Author

Department of Applied Mathematics, Tarbiat Modares University, P.O.Box 14115-175, Tehran, Iran

Abstract

In this paper, we present a new hybrid conjugate gradient method for unconstrained optimization that possesses the sufficient descent property independently of any line search. In our method, a convex combination of the Hestenes--Stiefel (HS) and Fletcher--Reeves (FR) parameters is used as the conjugate gradient parameter, and the hybridization parameter is determined by minimizing the distance between the hybrid conjugate gradient direction and the direction of the three-term HS method proposed by M. Li (\emph{A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method,} Optim. Lett. \textbf{12} (8) (2018) 1911--1927). Under standard assumptions, global convergence on general functions is established. Numerical results on a set of test problems from the CUTEst library illustrate the efficiency and robustness of the proposed method in practice.
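For context, the HS and FR parameters and the convex-combination structure referred to above take the standard form sketched below; the notation ($g_k$, $d_k$, $y_k$, $\theta_k$) follows common conjugate gradient usage, and the specific closed form of $\theta_k$ obtained from the distance-minimization problem is given in the paper, not here.
\[
\beta_k^{HS} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k}, \qquad
\beta_k^{FR} = \frac{\|g_{k+1}\|^{2}}{\|g_k\|^{2}}, \qquad
y_k = g_{k+1} - g_k,
\]
\[
\beta_k = (1-\theta_k)\,\beta_k^{HS} + \theta_k\,\beta_k^{FR}, \quad \theta_k \in [0,1], \qquad
d_{k+1} = -g_{k+1} + \beta_k\, d_k,
\]
where $g_k = \nabla f(x_k)$ is the gradient and $d_k$ the search direction at iteration $k$.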

Keywords