In unconstrained optimization, we minimize an objective function that depends on real variables, with no restrictions at all on the values of these variables. The unconstrained optimization problem is stated as:
$$\min_{x \in \mathbb{R}^n} f(x), \qquad (1)$$
where $x \in \mathbb{R}^n$ is a real vector with $n$ components and $f : \mathbb{R}^n \to \mathbb{R}$ is a smooth function whose gradient $g(x) = \nabla f(x)$ is available. A nonlinear conjugate gradient method generates a sequence $\{x_k\}$, starting from an initial guess $x_0 \in \mathbb{R}^n$, using the recurrence
$$x_{k+1} = x_k + \alpha_k d_k, \qquad (2)$$
where $\alpha_k > 0$ is the positive step size obtained by carrying out a one-dimensional search, known as the line search. Among the inexact line searches, the so-called strong Wolfe conditions require that
$$f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^T d_k, \qquad (3)$$
$$|g(x_k + \alpha_k d_k)^T d_k| \le \sigma |g_k^T d_k|, \qquad (4)$$
where $0 < \delta < \sigma < 1$. The aim is to find an approximation of the step size for which the descent property is satisfied, without searching too long along the direction $d_k$ when $x_k$ is far from the solution. Thus, by the strong Wolfe line search conditions we inherit the advantages of an exact line search at an inexpensive, low computational cost.
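As an illustration, the following minimal Python sketch tests whether a candidate step size satisfies conditions (3) and (4); the quadratic test function and the values of $\delta$ and $\sigma$ are hypothetical placeholders, not the settings used in Section 5.

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, delta=1e-4, sigma=0.9):
    """Check the strong Wolfe conditions (3) and (4) for a step size alpha."""
    g_d = grad(x) @ d                                             # g_k^T d_k
    armijo = f(x + alpha * d) <= f(x) + delta * alpha * g_d       # condition (3)
    curvature = abs(grad(x + alpha * d) @ d) <= sigma * abs(g_d)  # condition (4)
    return armijo and curvature

# Hypothetical example: f(x) = 0.5 ||x||^2 along the steepest descent direction
f = lambda x: 0.5 * x @ x
grad = lambda x: x
x = np.array([1.0, -2.0])
print(satisfies_strong_wolfe(f, grad, x, -grad(x), alpha=0.5))  # True
```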
The search direction $d_k$ is generated by:
$$d_0 = -g_0, \qquad d_k = -g_k + \beta_k d_{k-1}, \quad k \ge 1, \qquad (5)$$
where $g_k$ and $\beta_k$ are the gradient of $f(x)$ and the conjugate gradient coefficient, respectively, at the point $x_k$. Different choices of the parameter $\beta_k$ correspond to different conjugate gradient methods. The most popular formulas for $\beta_k$ are those of the Hestenes-Stiefel (HS), Fletcher-Reeves (FR), Polak-Ribière-Polyak (PRP), Conjugate Descent (CD), Liu-Storey (LS), and Dai-Yuan (DY) methods, among others.
These methods are identical when $f$ is a strongly convex quadratic function and the line search is exact, since the gradients are mutually orthogonal and the parameters $\beta_k$ in these methods are equal. When applied to general nonlinear functions with inexact line searches, however, the behavior of these methods is markedly different. We summarize some well-known conjugate gradient methods in Table 1.
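To make the scheme (2) and (5) concrete, here is a minimal sketch of a generic nonlinear CG iteration with a selectable classical coefficient from Table 1. It relies on SciPy's `line_search`, which implements Wolfe-type conditions; the two-variable Rosenbrock function is a hypothetical test example, not one of the problems of Section 5.

```python
import numpy as np
from scipy.optimize import line_search

def beta(method, g_new, g_old, d_old):
    """Classical conjugate gradient coefficients from Table 1."""
    y = g_new - g_old
    if method == "FR":  return (g_new @ g_new) / (g_old @ g_old)
    if method == "PRP": return (g_new @ y) / (g_old @ g_old)
    if method == "HS":  return (g_new @ y) / (d_old @ y)
    if method == "DY":  return (g_new @ g_new) / (d_old @ y)
    if method == "CD":  return -(g_new @ g_new) / (d_old @ g_old)
    if method == "LS":  return -(g_new @ y) / (d_old @ g_old)
    raise ValueError(method)

def cg(f, grad, x0, method="PRP", tol=1e-6, max_iter=500):
    """Nonlinear CG: recurrence (2) with direction (5)."""
    x = x0
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = line_search(f, grad, x, d)[0]          # Wolfe line search step size
        if alpha is None:                              # line search failure
            break
        x = x + alpha * d                              # recurrence (2)
        g_new = grad(x)
        d = -g_new + beta(method, g_new, g, d) * d     # direction (5)
        g = g_new
    return x

# Hypothetical test: the Rosenbrock function in two variables
f = lambda x: 100 * (x[1] - x[0] ** 2) ** 2 + (1 - x[0]) ** 2
grad = lambda x: np.array([-400 * x[0] * (x[1] - x[0] ** 2) - 2 * (1 - x[0]),
                           200 * (x[1] - x[0] ** 2)])
print(cg(f, grad, np.array([-1.2, 1.0])))
```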
An important class of conjugate gradient methods is the class of hybrid conjugate gradient algorithms. These hybrid computational schemes often perform better than the classical conjugate gradient methods. They are defined by (2) and (5), where the parameter $\beta_k$ is computed as a projection or as a convex combination of different conjugate gradient methods.
We summarize some well-known hybrid conjugate gradient methods in Table 2.
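As an illustration of the projection type, the well-known hybrid choice of Dai and Yuan truncates the HS coefficient between 0 and the DY coefficient; a minimal sketch:

```python
import numpy as np

def beta_hybrid_dy(g_new, g_old, d_old):
    """Projection-type hybrid: beta = max(0, min(beta_HS, beta_DY))."""
    y = g_new - g_old
    beta_hs = (g_new @ y) / (d_old @ y)
    beta_dy = (g_new @ g_new) / (d_old @ y)
    return max(0.0, min(beta_hs, beta_dy))
```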
We propose a new hybrid CG method based on a combination of the MMWU and RMAR conjugate gradient methods for solving unconstrained optimization problems under suitable conditions. The corresponding conjugate gradient parameters are $\beta_k^{MMWU}$, given in (6), and $\beta_k^{RMAR}$, given in (7).
Table 1. Some well-known conjugate gradient coefficients (with $y_{k-1} = g_k - g_{k-1}$).
HS: $\beta_k^{HS} = g_k^T y_{k-1} / (d_{k-1}^T y_{k-1})$
FR: $\beta_k^{FR} = \|g_k\|^2 / \|g_{k-1}\|^2$
PRP: $\beta_k^{PRP} = g_k^T y_{k-1} / \|g_{k-1}\|^2$
CD: $\beta_k^{CD} = -\|g_k\|^2 / (d_{k-1}^T g_{k-1})$
LS: $\beta_k^{LS} = -g_k^T y_{k-1} / (d_{k-1}^T g_{k-1})$
DY: $\beta_k^{DY} = \|g_k\|^2 / (d_{k-1}^T y_{k-1})$
Table 2. Hybrid conjugate gradient methods.
We define the parameter $\beta_k$ in the proposed method by:
$$\beta_k^{HFG} = (1 - \theta_k)\,\beta_k^{MMWU} + \theta_k\,\beta_k^{RMAR}. \qquad (8)$$
Observe that if $\theta_k = 0$, then $\beta_k^{HFG} = \beta_k^{MMWU}$, and if $\theta_k = 1$, then $\beta_k^{HFG} = \beta_k^{RMAR}$.
By choosing an appropriate value of the parameter $\theta_k$ in the convex combination, the search direction of our algorithm not only equals the Newton direction but also satisfies the famous DL conjugacy condition proposed by Dai and Liao. Under the strong Wolfe line search conditions, we prove the global convergence of our algorithm. The numerical results also show the feasibility and effectiveness of our algorithm.
This paper is organized as follows. In Section 2 we introduce our new hybrid conjugate gradient method (HFG), derive the parameter $\theta_k$, and give a specific algorithm. In Section 3 we prove that the method generates directions satisfying the sufficient descent condition under the strong Wolfe line search conditions. The global convergence of the proposed method is established in Section 4. Some numerical results are reported in Section 5.
2. A New Hybrid Conjugate Gradient Method
In this section, we describe the newly proposed hybrid conjugate gradient method. In order to obtain a sufficient descent direction, we compute $\beta_k^{HFG}$ as follows: we combine $\beta_k^{MMWU}$ and $\beta_k^{RMAR}$ in a convex combination in order to obtain a good algorithm for unconstrained optimization.
The direction $d_k$ is generated by the rule
$$d_0 = -g_0, \qquad d_k = -g_k + \beta_k^{HFG} d_{k-1}, \quad k \ge 1, \qquad (9)$$
where $\beta_k^{HFG}$ is defined in (8). The iterates of our method are computed by means of the recurrence (2), where the step size $\alpha_k$ is determined according to the strong Wolfe conditions (3) and (4).
The scalar parameter $\theta_k$ satisfies $0 \le \theta_k \le 1$ and will be determined in a specific way described later. Observe that if $\theta_k = 0$, then $d_k = -g_k + \beta_k^{MMWU} d_{k-1}$, and if $\theta_k = 1$, then $d_k = -g_k + \beta_k^{RMAR} d_{k-1}$. On the other hand, if $0 < \theta_k < 1$, then $d_k$ is a convex combination of these two directions.
From (8) and (9) it is obvious that:
$$d_k = -g_k + (1 - \theta_k)\,\beta_k^{MMWU} d_{k-1} + \theta_k\,\beta_k^{RMAR} d_{k-1}. \qquad (10)$$
Our motivation is to select the parameter $\theta_k$ in such a manner that the direction given in (10) is equal to the Newton direction. Therefore
$$-\nabla^2 f(x_k)^{-1} g_k = -g_k + (1 - \theta_k)\,\beta_k^{MMWU} d_{k-1} + \theta_k\,\beta_k^{RMAR} d_{k-1}. \qquad (11)$$
Now, multiplying (11) by $s_{k-1}^T \nabla^2 f(x_k)$ from the left, we get
$$-s_{k-1}^T g_k = -s_{k-1}^T \nabla^2 f(x_k)\, g_k + (1 - \theta_k)\,\beta_k^{MMWU} s_{k-1}^T \nabla^2 f(x_k)\, d_{k-1} + \theta_k\,\beta_k^{RMAR} s_{k-1}^T \nabla^2 f(x_k)\, d_{k-1}.$$
Therefore, in order to have an algorithm suitable for solving large-scale problems, we assume that the pair $(s_{k-1}, y_{k-1})$, with $s_{k-1} = x_k - x_{k-1}$ and $y_{k-1} = g_k - g_{k-1}$, satisfies the secant equation
$$\nabla^2 f(x_k)\, s_{k-1} = y_{k-1}. \qquad (12)$$
From (12), we get
$$-s_{k-1}^T g_k = -y_{k-1}^T g_k + (1 - \theta_k)\,\beta_k^{MMWU} y_{k-1}^T d_{k-1} + \theta_k\,\beta_k^{RMAR} y_{k-1}^T d_{k-1}.$$
Rearranging the terms, we get
$$(y_{k-1} - s_{k-1})^T g_k = \left[\beta_k^{MMWU} + \theta_k\left(\beta_k^{RMAR} - \beta_k^{MMWU}\right)\right] y_{k-1}^T d_{k-1},$$
and after some algebra, we get
$$\theta_k = \frac{(y_{k-1} - s_{k-1})^T g_k - \beta_k^{MMWU}\, y_{k-1}^T d_{k-1}}{\left(\beta_k^{RMAR} - \beta_k^{MMWU}\right) y_{k-1}^T d_{k-1}}. \qquad (13)$$
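Given numerical values of $\beta_k^{MMWU}$ and $\beta_k^{RMAR}$ (their defining formulas (6) and (7) are not reproduced here), formula (13) is straightforward to evaluate. A minimal sketch, assuming the convex-combination form (8) and guarding against a vanishing denominator:

```python
import numpy as np

def theta_k(g, d_prev, s_prev, y_prev, beta_mmwu, beta_rmar, eps=1e-12):
    """Convex-combination parameter theta_k from (13), assuming
    beta^HFG = (1 - theta) * beta^MMWU + theta * beta^RMAR."""
    yd = y_prev @ d_prev
    denom = (beta_rmar - beta_mmwu) * yd
    if abs(denom) < eps:          # degenerate case: fall back to theta = 0
        return 0.0
    return ((y_prev - s_prev) @ g - beta_mmwu * yd) / denom
```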
Now, we specify the complete hybrid conjugate gradient method (HFG), which possesses some nice properties of both the conjugate gradient method and the Newton method.
Step 1: Select $x_0 \in \mathbb{R}^n$ and $\varepsilon > 0$, and set $k = 0$. Compute $f(x_0)$ and $g_0 = \nabla f(x_0)$, and set $d_0 = -g_0$.
Step 2: Test the stopping criterion, i.e., if $\|g_k\| \le \varepsilon$, then stop.
Step 3: Compute $\alpha_k$ by the strong Wolfe line search conditions (3) and (4).
Step 4: Compute $x_{k+1} = x_k + \alpha_k d_k$ and $g_{k+1} = \nabla f(x_{k+1})$. Compute $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$, and the coefficients $\beta_{k+1}^{MMWU}$ and $\beta_{k+1}^{RMAR}$.
Step 5: If $\theta_k$ computed by (13) satisfies $\theta_k < 0$, then set $\theta_k = 0$; if $\theta_k > 1$, then set $\theta_k = 1$; otherwise keep $\theta_k$ as computed by (13).
Step 6: Compute $\beta_{k+1}^{HFG}$ by (8).
Step 7: Generate $d_{k+1} = -g_{k+1} + \beta_{k+1}^{HFG} d_k$.
Step 8: If the restart criterion of Powell, $|g_{k+1}^T g_k| \ge 0.2\,\|g_{k+1}\|^2$, is satisfied, then set $d_{k+1} = -g_{k+1}$; otherwise define $d_{k+1}$ by (9).
Step 9: Set $k = k + 1$ and continue with Step 2.
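The steps above translate directly into code. The following minimal sketch of the HFG iteration uses SciPy's Wolfe `line_search` in place of the cubic-fitting Fortran routine of Section 5; `beta_mmwu` and `beta_rmar` are user-supplied callables standing in for formulas (6) and (7), which are not reproduced here.

```python
import numpy as np
from scipy.optimize import line_search

def hfg(f, grad, x0, beta_mmwu, beta_rmar, eps=1e-6, max_iter=1000):
    """Sketch of the HFG algorithm (Steps 1-9)."""
    x = x0
    g = grad(x)
    d = -g                                             # Step 1
    for _ in range(max_iter):
        if np.linalg.norm(g) <= eps:                   # Step 2: stopping test
            break
        alpha = line_search(f, grad, x, d)[0]          # Step 3: Wolfe step size
        if alpha is None:                              # line search failure
            break
        x_new = x + alpha * d                          # Step 4
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        bm = beta_mmwu(g_new, g, d, s, y)              # coefficient (6), assumed callable
        br = beta_rmar(g_new, g, d, s, y)              # coefficient (7), assumed callable
        yd = y @ d
        denom = (br - bm) * yd
        if abs(denom) < 1e-12:                         # degenerate denominator
            theta = 0.0
        else:
            theta = ((y - s) @ g_new - bm * yd) / denom  # Step 5: theta from (13),
            theta = min(max(theta, 0.0), 1.0)            # clamped to [0, 1]
        b = (1.0 - theta) * bm + theta * br            # Step 6: hybrid coefficient (8)
        d_new = -g_new + b * d                         # Step 7: direction (9)
        if abs(g_new @ g) >= 0.2 * (g_new @ g_new):    # Step 8: Powell restart
            d_new = -g_new
        x, g, d = x_new, g_new, d_new                  # Step 9
    return x
```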
3. The Sufficient Descent Condition
In this section, we apply the following theorem to show that the search direction $d_k$ obtained by the hybrid HFG method satisfies the sufficient descent condition, which plays a vital role in analyzing the global convergence.
For further considerations, we need the following assumptions:
Assumption 2.1. The level set $S = \{x \in \mathbb{R}^n : f(x) \le f(x_0)\}$ is bounded.
Assumption 2.2. In a neighborhood $N$ of $S$, the function $f$ is continuously differentiable and its gradient is Lipschitz continuous, i.e., there exists a constant $L > 0$ such that
$$\|g(x) - g(y)\| \le L\,\|x - y\|, \quad \forall x, y \in N. \qquad (14)$$
Under these assumptions on $f$, there exist positive constants $B$ and $\gamma$ such that $\|x\| \le B$ for all $x \in S$ and $\|g(x)\| \le \gamma$ for all $x \in N$.
Let the sequences $\{x_k\}$ and $\{d_k\}$ be generated by the hybrid HFG method. Then the search direction satisfies the sufficient descent condition:
$$g_k^T d_k \le -c\,\|g_k\|^2, \quad \forall k \ge 0,$$
where $c > 0$ is a constant depending on the line search parameter $\sigma$.
Proof. We shall show that $d_k$ satisfies the sufficient descent condition. For $k = 0$ the proof is trivial, since $d_0 = -g_0$ and so $g_0^T d_0 = -\|g_0\|^2$. Now, for $k \ge 1$, we have
$$d_k = -g_k + \left[(1 - \theta_k)\,\beta_k^{MMWU} + \theta_k\,\beta_k^{RMAR}\right] d_{k-1}.$$
We can rewrite the above direction in the following manner. After some rearrangement, we get
$$d_k = -g_k + (1 - \theta_k)\,\beta_k^{MMWU} d_{k-1} + \theta_k\,\beta_k^{RMAR} d_{k-1}. \qquad (15)$$
Multiplying (15) by $g_k^T$ from the left, we get
$$g_k^T d_k = -\|g_k\|^2 + (1 - \theta_k)\,\beta_k^{MMWU} g_k^T d_{k-1} + \theta_k\,\beta_k^{RMAR} g_k^T d_{k-1}. \qquad (16)$$
Firstly, if $\theta_k = 0$, then $\beta_k^{HFG} = \beta_k^{MMWU}$. We are going to prove that the sufficient descent condition holds for the MMWU method in the presence of the strong Wolfe line search conditions, because the original work proved that this method satisfies the sufficient descent condition only with an exact line search. From the strong Wolfe condition (4) and the descent property $g_{k-1}^T d_{k-1} < 0$, we have
$$g_k^T d_{k-1} \le -\sigma\, g_{k-1}^T d_{k-1}. \qquad (17)$$
Applying (17) in (16), we get
$$g_k^T d_k \le -c_1\,\|g_k\|^2,$$
where $c_1 > 0$ is a constant depending on the line search parameter $\sigma$.
So, it is proved that $d_k$ satisfies the sufficient descent condition.
Now let $\theta_k = 1$; then $\beta_k^{HFG} = \beta_k^{RMAR}$. We are going to prove that the sufficient descent condition holds for the RMAR method in the presence of the strong Wolfe line search conditions, because the original work proved that this method satisfies the sufficient descent condition only with an exact line search. In this case the direction is $d_k = -g_k + \beta_k^{RMAR} d_{k-1}$.
Multiplying the above equation from the left by $g_k^T$, we get
$$g_k^T d_k = -\|g_k\|^2 + \beta_k^{RMAR}\, g_k^T d_{k-1}. \qquad (18)$$
In the original work, it was proved that
$$0 \le \beta_k^{RMAR} \le \frac{\|g_k\|^2}{\|g_{k-1}\|^2}. \qquad (19)$$
Using (17) and (19), the direction (18) becomes
$$g_k^T d_k \le -c_2\,\|g_k\|^2,$$
where $c_2 > 0$ is a constant depending on $\sigma$.
So, it is proved that $d_k$ satisfies the sufficient descent condition.
Now, we prove that the direction satisfies the sufficient descent condition when $0 < \theta_k < 1$. Firstly, for the term involving $\beta_k^{MMWU}$: from the Lipschitz condition (14) and (17), with a mathematical calculation we obtain the bounds (20) and (21). Secondly, for the term involving $\beta_k^{RMAR}$: from (19), the Lipschitz condition (14), and (17), and since $0 < \theta_k < 1$, we obtain the bound (22).
From (18), (20), (21) and (22), we get
$$g_k^T d_k \le -c\,\|g_k\|^2,$$
with a suitable constant $c > 0$. So, it is proved that $d_k$ satisfies the sufficient descent condition.
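As a numerical sanity check, the sufficient descent condition can also be monitored along the iterates; a minimal sketch with a hypothetical constant $c$:

```python
import numpy as np

def sufficient_descent(g, d, c=1e-4):
    """Check g_k^T d_k <= -c ||g_k||^2 for one iterate (c is hypothetical)."""
    return g @ d <= -c * (g @ g)

g = np.array([3.0, -1.0])
print(sufficient_descent(g, -g))  # True: d = -g always satisfies the condition
```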
4. Convergence Analysis
Let Assumptions 2.1 and 2.2 hold. It is proved in the literature that, for any conjugate gradient method with the strong Wolfe line search conditions, the following Zoutendijk condition holds:
$$\sum_{k \ge 0} \frac{(g_k^T d_k)^2}{\|d_k\|^2} < \infty.$$
Lemma 3.1. Let Assumptions 2.1 and 2.2 hold. Consider the method (2) and (5), where $d_k$ is a descent direction and $\alpha_k$ is obtained from the strong Wolfe line search. If
$$\sum_{k \ge 1} \frac{1}{\|d_k\|^2} = \infty,$$
then $\liminf_{k \to \infty} \|g_k\| = 0$.
Suppose that Assumptions 2.1 and 2.2 hold. Consider the algorithm HFG, where $\alpha_k$ is obtained by the strong Wolfe line search and $d_k$ is a descent direction. Then
$$\liminf_{k \to \infty} \|g_k\| = 0.$$
Proof. Because the descent condition holds, we have $g_k^T d_k < 0$. So, using Lemma 3.1, it is sufficient to prove that $\|d_k\|$ is bounded above. From (10),
$$\|d_k\| \le \|g_k\| + (1 - \theta_k)\,|\beta_k^{MMWU}|\,\|d_{k-1}\| + \theta_k\,|\beta_k^{RMAR}|\,\|d_{k-1}\|.$$
In the original works, it was proved that $\beta_k^{MMWU}$ and $\beta_k^{RMAR}$ are bounded.
By (4), we have
$$|g_k^T d_{k-1}| \le \sigma\,|g_{k-1}^T d_{k-1}|.$$
With some mathematical calculation, we get that $\|d_k\|$ is bounded above by a constant $M > 0$ for all $k$. Hence $\sum_{k \ge 1} 1/\|d_k\|^2 = \infty$, and by Lemma 3.1 we conclude that $\liminf_{k \to \infty} \|g_k\| = 0$.
5. Numerical Experiments
In this section we selected some test functions (Table 3) from the CUTE library, along with other large-scale optimization problems presented in Andrei and in Bongartz et al.
All codes are written in double-precision FORTRAN and compiled with Visual F90 (default compiler settings) on an Intel Pentium 4 workstation. The value of the step size $\alpha_k$ is always computed by a cubic fitting procedure.
We selected 26 large-scale unconstrained optimization problems in the extended or generalized form. Each problem was tested three times for a gradually increasing number of variables: N = 1000, 5000, and 10,000. All algorithms implemented the strong Wolfe line search conditions (3) and (4) with the same values of $\delta$ and $\sigma$, and the same stopping criterion was used for all methods.
Table 3. Comparison in terms of NOI and NOF between the baseline methods and the proposed method.
Table 4. The percentage performance of the proposed method.
Figure 1. The comparison between the three methods.
In some cases, the computation stopped due to the failure of the line search to find a positive step size; such runs were counted as failures, denoted by (F).
For the purpose of our comparisons, we record the number of iterations (NOI), the number of function evaluations (NOF), and the dimension of each test problem (N).
Table 3 gives the comparison in terms of NOI and NOF between the two baseline methods and the proposed method.
Table 4 gives the percentage performance of the proposed method against the two baseline methods. We have seen that the proposed method saves (NOI 0.8%, NOF 7.6%) relative to one baseline method and (NOI 28.7%, NOF 40.0%) relative to the other.
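The percentages in Table 4 are relative reductions in the NOI and NOF totals; a minimal sketch with hypothetical totals (the actual totals appear in Table 3):

```python
def percent_saving(baseline_total, proposed_total):
    """Relative saving of the proposed method over a baseline, in percent."""
    return 100.0 * (baseline_total - proposed_total) / baseline_total

# Hypothetical totals chosen only to reproduce the quoted 0.8% and 7.6% figures
print(round(percent_saving(1000, 992), 1))  # NOI saving: 0.8
print(round(percent_saving(1000, 924), 1))  # NOF saving: 7.6
```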
Figure 1 gives the comparison between the three methods using the well-known Wood test function.
 Salleh, Z. and Alhawarat, A. (2016) An Efficient Modification of the Hestenes-Stiefel Nonlinear Conjugate Gradient Method with Restart Property. Journal of Inequalities and Applications, 2016, Article No. 110.
 Alhawarat, A., Mamat, M., Rivaie, M. and Salleh, Z. (2015) An Efficient Hybrid Conjugate Gradient Method with the Strong Wolfe-Powell Line Search. Mathematical Problems in Engineering, 2015, Article ID: 103517.
 Hestenes, M.R. and Stiefel, E. (1952) Methods of Conjugate Gradients for Solving Linear Systems. Journal of Research of the National Bureau of Standards, 49, 409-436. https://nvlpubs.nist.gov/nistpubs/jres/049/jresv49n6p409_A1b.pdf
 Fletcher, R. and Reeves, C. (1964) Function Minimization by Conjugate Gradients. The Computer Journal, 7, 149-154. https://doi.org/10.1093/comjnl/7.2.149
 Polak, E. and Ribie’re, G. (1969) Note sur la convergence de me’thodes de directions conjugue’s. Revue Francsis d’Infermatique et de recherché Operationnelle, 3, 35-43.
 Polyak, B.T. (1969) The Conjugate Gradient Method in Extreme Problems. USSR Computational Mathematics and Mathematical Physics, 9, 94-112.
 Fletcher, R. (1987) Practical Methods of Optimization. 2nd Edition, John Wiley & Sons, Inc., Hoboken. https://www.wiley.com/en-ba
 Liu, D. and Storey, C. (1991) Efficient Generalized Conjugate Gradient Algorithms. Part 1: Theory. Journal of Optimization Theory and Applications, 69, 129-137.
 Al-Naemi, Gh.M. and Hamed, E.T. (2013) New Conjugate Method with Wolfe Type Line Searches for Nonlinear Programming. Australian Journal of Basic and Applied Sciences, 7, 622-632. https://www.researchgate.net/publication/330686295
 Andrei, N. (2010) Acceleration Hybrid Conjugate Gradient Algorithm with Modified Secant Condition for Unconstrained Optimization. Numerical Algorithms, 54, 23-46. https://link.springer.com/article/10.1007/s11075-009-9321-0
 Andrei, N. (2008) Another Nonlinear Conjugate Gradient Algorithm for Unconstrained Optimization. Optimization Methods and Software, 24, 89-104.
 Yan, H., Chen, L. and Jiao, B. (2009) HS-LS-CD Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization. 2nd International Workshop on Computer Science and Engineering, Vol. 1, Qingdao, 28-30 October 2009, 264-268.
 Li, S. and Sun, Z.B. (2010) A New Hybrid Conjugate Gradient Method and Its Global Convergence for Unconstrained Optimization. International Journal of Pure and Applied Mathematics, 63, 84-93.
 Djordjevic, S.S. (2018) New Hybrid Conjugate Gradient Method as a Convex Combination of HS and FR Conjugate Gradient Methods. Journal of Applied Mathematics and Computation, 2, 366-378. https://doi.org/10.26855/jamc.2018.09.002
 Zheng, X.Y., Dong, X.L., Shi, J.R. and Yang, W. (2019) Further Comment on Another Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization by Andrei. Numerical Algorithms, 1-6. https://doi.org/10.1007/s11075-019-00771-1
 Abdullahi, I. and Ahmad, R. (2016) Global Convergence Analysis of a Nonlinear Conjugate Gradient Method for Unconstrained Optimization Problems. Indian Journal of Science and Technology, 9, 1-9.
 Livieris, I.E., Tampakas, V. and Pintelas, P. (2018) A Descent Hybrid Conjugate Gradient Method Based on the Memoryless BFGS Update. Numerical Algorithms, 79, 1169-1185. https://link.springer.com/article/10.1007/s11075-018-0479-1
 Mandara, A.V., Mamat, M., Waziri, M.Y., Mohamed, M.A. and Yakubu, U.A. (2018) A New Conjugate Gradient Coefficient with Exact Line Search for Unconstrained Optimization. Far East Journal of Mathematical Sciences, 105, 193-206.
 Liu, J.K. and Li, S.J. (2014) New Hybrid Conjugate Gradient Method for Unconstrained Optimization. Applied Mathematics and Computation, 245, 36-43.
 Yunus, R.B., Mamat, M. and Abashar, A. (2018) Comparative Study of Some New Conjugate Gradient Methods. UniSZA Research Conference (URC 2015), Kuala Terengganu, 14-16 April 2015, 616-621.
 Andrei, N. (2008) An Unconstrained Optimization Test Functions Collection. Advanced Modeling and Optimization, 10, 147-161.
 Bongartz, I., Conn, A.R., Gould, N. and Toint, P.L. (1995) CUTE: Constrained and Unconstrained Testing Environment. ACM Transactions on Mathematical Software, 21, 123-160. https://doi.org/10.1145/200979.201043