A New Hybrid of DY and CGSD Conjugate Gradient Methods for Solving Unconstrained Optimization Problems

Main Article Content

Zeyad Mohammed Abdullah
Iman Khalid Jamalaldeen

Abstract

In this article, we present a new hybrid conjugate gradient method for solving large-scale unconstrained optimization problems. The method is a convex combination of the Dai-Yuan (DY) conjugate gradient method and Andrei's sufficient-descent conjugate gradient method (CGSD). It satisfies the well-known Dai-Liao (D-L) conjugacy condition and, at the same time, its search direction coincides with the Newton direction under a suitable condition. The proposed method yields a descent search direction at every iteration. Under the strong Wolfe-Powell (SWP) line search conditions, the global convergence of the proposed method is established. Finally, the numerical results are good and show that our method is robust and effective.
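To make the construction concrete, here is a minimal Python sketch of a hybrid conjugate gradient iteration of the kind the abstract describes: the conjugacy parameter is a convex combination of two formulas, and the step length comes from a strong Wolfe line search. Only the Dai-Yuan beta below is the formula from [1]; the second parameter (a Hestenes-Stiefel beta standing in for Andrei's CGSD parameter, which the abstract does not reproduce) and the fixed mixing weight theta are hypothetical placeholders, not the authors' actual rule.

import numpy as np
from scipy.optimize import line_search

def beta_dy(g_new, d, y):
    # Dai-Yuan parameter: beta_DY = ||g_{k+1}||^2 / (d_k^T y_k),
    # where y_k = g_{k+1} - g_k (Dai & Yuan, 1999).
    return (g_new @ g_new) / (d @ y)

def beta_hs(g_new, d, y):
    # Hestenes-Stiefel parameter; used here only as a stand-in for the
    # CGSD parameter of Andrei (2008).
    return (g_new @ y) / (d @ y)

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-6, max_iter=1000):
    # theta is a fixed mixing weight chosen for illustration; in the paper
    # the weight is derived adaptively rather than held constant.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            return x, k
        # SciPy's line_search enforces the (strong) Wolfe conditions.
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                    # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # Convex combination of the two conjugacy parameters.
        beta = (1.0 - theta) * beta_dy(g_new, d, y) + theta * beta_hs(g_new, d, y)
        d_new = -g_new + beta * d
        if g_new @ d_new >= 0:               # safeguard: keep a descent direction
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x, max_iter

# Example: minimize the Rosenbrock function.
from scipy.optimize import rosen, rosen_der
x_star, iters = hybrid_cg(rosen, rosen_der, np.array([-1.2, 1.0]))
print(x_star, iters)

In the paper, the mixing weight is chosen so that the resulting direction satisfies the Dai-Liao conjugacy condition d_{k+1}^T y_k = -t g_{k+1}^T s_k (t >= 0); the sketch replaces that derivation with a constant.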

Article Details

How to Cite
Zeyad Mohammed Abdullah, & Iman Khalid Jamalaldeen. (2021). A New Hybrid of DY and CGSD Conjugate Gradient Methods for Solving Unconstrained Optimization Problems. Tikrit Journal of Pure Science, 26(5), 86–91. https://doi.org/10.25130/tjps.v26i5.183
Section
Articles

References

[1] Dai, Y. H., & Yuan, Y. (1999). A nonlinear conjugate gradient method with a strong global convergence property. SIAM Journal on Optimization, 10(1), 177-182.

[2] Wolfe, P. (1969). Convergence conditions for ascent methods. SIAM Review, 11(2), 226-235.

[3] Wolfe, P. (1971). Convergence conditions for ascent methods. II: Some corrections. SIAM Review, 13(2), 185-188.

[4] Hestenes, M. R., & Stiefel, E. (1952). Methods of conjugate gradients for solving linear systems. Journal of Research of the National Bureau of Standards, 49(6), 409-436.

[5] Fletcher, R., & Reeves, C. M. (1964). Function minimization by conjugate gradients. The Computer Journal, 7(2), 149-154.

[6] Polak, E., & Ribiere, G. (1969). Note sur la convergence de méthodes de directions conjuguées. ESAIM: Mathematical Modelling and Numerical Analysis-Modélisation Mathématique et Analyse Numérique, 3(R1), 35-43.

[7] Polyak, B. T. (1969). The conjugate gradient method in extremal problems. USSR Computational Mathematics and Mathematical Physics, 9(4), 94-112.

[8] Fletcher, R. (1987). Practical methods of optimization. New York: John Wiley & Sons.

[9] Liu, Y., & Storey, C. (1991). Efficient generalized conjugate gradient algorithms, part 1: Theory. Journal of Optimization Theory and Applications, 69(1), 129-137.

[10] Jardow, F. N., & Al-Naemi, G. M. (2020). A new hybrid conjugate gradient algorithm for unconstrained optimization with inexact line search. Indonesian Journal of Electrical Engineering and Computer Science, 20(2), 939-947.

[11] Dai, Y. H., & Yuan, Y. (2001). An efficient hybrid conjugate gradient method for unconstrained optimization. Annals of Operations Research, 103(1), 33-47.

[12] Dai, Y., & Yuan, Y. (2003). A class of globally convergent conjugate gradient methods. Science in China Series A: Mathematics, 46(2), 251-261.

[13] Andrei, N. (2008). A Dai–Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization. Applied Mathematics Letters, 21(2), 165-171.

[14] Dai, Y. H., & Liao, L. Z. (2001). New conjugacy conditions and related nonlinear conjugate gradient methods. Applied Mathematics and Optimization, 43(1), 87-101.

[15] Bongartz, I., Conn, A. R., Gould, N. I. M., & Toint, P. L. (1995). CUTE: Constrained and unconstrained testing environments. ACM Transactions on Mathematical Software, 21(1), 123-160.

[16] Andrei, N. Test functions for unconstrained optimization. http://www.ici.ro/camo/neculai/SCALCG/evalfg.for