A modified three-term conjugate gradient method for large-scale optimization
Main Article Content
Abstract
We propose a three-term conjugate gradient method in this paper. The basic idea is to exploit the good properties of the BFGS update. Since quasi-Newton methods are numerically efficient, the proposed method is based on the BFGS method. The descent condition and global convergence are proven under the Wolfe conditions. The new algorithm is effective for solving large-scale unconstrained optimization problems.
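The abstract does not state the paper's exact three-term update formula, so as a hedged illustration only, the sketch below uses a well-known three-term direction of the same family (the Zhang–Zhou–Li descent-modified PRP direction, which satisfies the sufficient descent condition d_k^T g_k = -||g_k||^2 by construction), paired with a simple Armijo backtracking line search as a stand-in for the Wolfe conditions used in the paper. The function names and parameters here are illustrative, not the authors' algorithm:

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    """Sketch of a three-term conjugate gradient iteration.

    Direction (Zhang-Zhou-Li modified PRP, assumed here for illustration):
        d_k = -g_k + beta_k * d_{k-1} - theta_k * y_{k-1}
    which guarantees the sufficient descent property d_k^T g_k = -||g_k||^2.
    The Armijo backtracking below is a simplification of the Wolfe
    line search assumed in the paper's convergence analysis.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search: f(x + a*d) <= f(x) + c1*a*g^T d
        alpha, c1, rho = 1.0, 1e-4, 0.5
        fx, gTd = f(x), g @ d
        while f(x + alpha * d) > fx + c1 * alpha * gTd and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                       # gradient difference y_{k-1}
        denom = g @ g                       # ||g_{k-1}||^2 (>= tol^2 here)
        beta = (g_new @ y) / denom          # PRP conjugacy parameter
        theta = (g_new @ d) / denom         # third-term weight
        d = -g_new + beta * d - theta * y   # three-term direction
        x, g = x_new, g_new
    return x, g
```

For example, on the strictly convex quadratic f(x) = 0.5*(x1^2 + 10*x2^2) the iteration drives the gradient norm below the tolerance from the start `[3, -4]`.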
Article Details

This work is licensed under a Creative Commons Attribution 4.0 International License.
Tikrit Journal of Pure Science is licensed under the Creative Commons Attribution 4.0 International License, which allows users to copy, create extracts, abstracts, and new works from the article, alter and revise the article, and make commercial use of the article (including reuse and/or resale of the article by commercial entities), provided the user gives appropriate credit (with a link to the formal publication through the relevant DOI), provides a link to the license, indicates if changes were made, and does not represent the licensor as endorsing the use made of the work. The authors hold the copyright for their published work on the Tikrit J. Pure Sci. website, while Tikrit J. Pure Sci. is responsible for appropriate citation of their work, which is released under CC-BY-4.0, enabling the unrestricted use, distribution, and reproduction of an article in any medium, provided that the original work is properly cited.