Abstract

This paper presents a modified Hestenes and Stiefel (HS) conjugate gradient algorithm under the Yuan-Wei-Lu (YWL) inexact line search technique for large-scale unconstrained optimization problems. The proposed algorithm has the following properties: the new search direction possesses not only a sufficient descent property but also a trust region feature; the presented algorithm is globally convergent for nonconvex functions; the numerical experiments show that the new algorithm is more effective than similar algorithms.

1. Introduction

Consider the unconstrained minimization problem
$$\min_{x\in\mathbb{R}^{n}} f(x), \tag{1}$$
where the function $f:\mathbb{R}^{n}\to\mathbb{R}$ is continuously differentiable and the number of variables $n$ is large. There exist many good algorithms for (1), such as the quasi-Newton methods [1] and the conjugate gradient methods [2-5]. The iterates of a conjugate gradient algorithm for (1) are generated by
$$x_{k+1}=x_{k}+\alpha_{k}d_{k},\quad k=0,1,2,\ldots, \tag{2}$$
where $x_k$ is the $k$th iterative point, $\alpha_k>0$ is the steplength, and $d_k$ is the so-called conjugate gradient search direction with
$$d_{k+1}=-g_{k+1}+\beta_{k}d_{k},\quad d_{0}=-g_{0}, \tag{3}$$
where $g_k=\nabla f(x_k)$ and $\beta_k$ is a scalar determined by the particular conjugate gradient formula. The HS method [3] is one of the most well-known conjugate gradient methods, with
$$\beta_{k}^{HS}=\frac{g_{k+1}^{T}y_{k}}{d_{k}^{T}y_{k}}, \tag{4}$$
where $y_k=g_{k+1}-g_k$. The HS method has good numerical results for (1); however, its convergence theory is unsatisfactory, especially for nonconvex functions. At present, there exist many other good conjugate gradient methods (see [6-8], etc.).

Yuan, Wei, and Lu [9] gave a modified weak Wolfe-Powell line search (we call it YWL) that determines the steplength $\alpha_k$ by
$$f(x_{k}+\alpha_{k}d_{k})\le f(x_{k})+\delta\alpha_{k}g_{k}^{T}d_{k}+\alpha_{k}\min\Big\{-\delta_{1}g_{k}^{T}d_{k},\ \frac{\delta\alpha_{k}\|d_{k}\|^{2}}{2}\Big\} \tag{5}$$
and
$$g(x_{k}+\alpha_{k}d_{k})^{T}d_{k}\ge\sigma g_{k}^{T}d_{k}+\min\big\{-\delta_{1}g_{k}^{T}d_{k},\ \delta\alpha_{k}\|d_{k}\|^{2}\big\}, \tag{6}$$
where $\delta\in(0,\frac{1}{2})$, $\delta_{1}\in(0,\delta)$, $\sigma\in(\delta,1)$, and $\|\cdot\|$ denotes the Euclidean norm. It is well known that there exist two open problems concerning global convergence for nonconvex functions under inexact line search techniques: one for the normal BFGS method and one for the PRP method; the first is regarded as one of the most difficult one thousand mathematical problems of the 20th century [10]. Yuan et al. [9] partly solved these two open problems under the YWL technique, and their numerical results show that the YWL technique is more competitive than the normal weak Wolfe-Powell technique. Further work can be found in their paper [11]. By (5) and (6), it is not difficult to see that the YWL conditions reduce to the weak Wolfe-Powell (WWP) conditions if $\delta_{1}=0$ holds, which implies that the YWL technique includes the WWP technique in some sense.

Motivated by the above observations, we make a further study and propose a new algorithm for (1). The main features of this paper are as follows (a sketch of the YWL acceptance test is given after this list):
(i) A modified HS conjugate gradient formula is given, which possesses not only a sufficient descent property but also a trust region feature.
(ii) The global convergence of the given HS conjugate gradient algorithm for nonconvex functions is established.
(iii) Numerical results show that the new HS conjugate gradient algorithm under the YWL line search technique outperforms the same algorithm under the normal weak Wolfe-Powell technique.
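To make the role of conditions (5) and (6) concrete, here is a minimal Python sketch of the YWL acceptance test, written directly from the reconstructed inequalities above (the paper's own experiments were written in MATLAB; the function name and NumPy setup are illustrative assumptions, not the authors' code):

```python
import numpy as np

def ywl_accept(f, grad, x, d, alpha, delta, delta1, sigma):
    """Return True if the trial steplength alpha satisfies the YWL
    conditions (5) and (6); delta in (0, 1/2), delta1 in (0, delta),
    sigma in (delta, 1)."""
    g = grad(x)
    gd = float(g @ d)          # g_k^T d_k (negative for a descent direction)
    dd = float(d @ d)          # ||d_k||^2
    x_new = x + alpha * d
    # (5): modified sufficient-decrease condition
    cond5 = f(x_new) <= f(x) + delta * alpha * gd \
            + alpha * min(-delta1 * gd, 0.5 * delta * alpha * dd)
    # (6): modified curvature condition
    cond6 = float(grad(x_new) @ d) >= sigma * gd \
            + min(-delta1 * gd, delta * alpha * dd)
    return cond5 and cond6
```

Setting delta1 = 0 makes both min-terms vanish, so the test collapses to the usual WWP check, which is the inclusion noted above.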

This paper is organized as follows. In Section 2, a modified HS conjugate gradient algorithm is introduced. The global convergence of the given algorithm for nonconvex functions is established in Section 3, and numerical results are reported in Section 4.

2. Motivation and Algorithm

The nonlinear conjugate gradient algorithm is simple, has low memory requirements, and is very effective for large-scale optimization problems, and the HS method is one of its most effective members. However, although the normal HS method has good numerical performance, it fails to converge for nonconvex functions under inexact line search techniques. In order to overcome this shortcoming, a modified HS formula is defined by
$$d_{k+1}=\begin{cases}-g_{k+1}, & \text{if } k=0,\\[4pt] -g_{k+1}+\dfrac{g_{k+1}^{T}y_{k}\,d_{k}-g_{k+1}^{T}d_{k}\,y_{k}}{\max\{\mu_{1}\|d_{k}\|\|y_{k}\|,\ \mu_{2}\|d_{k}\|^{2},\ \mu_{3}\|y_{k}\|^{2}\}}, & \text{if } k\ge 1,\end{cases} \tag{7}$$
where $y_{k}=g_{k+1}-g_{k}$ and $\mu_{1}$, $\mu_{2}$, and $\mu_{3}$ are positive constants. This formula is inspired by the ideas of the two papers [6, 8]. In recent years, many scholars have studied three-term conjugate gradient formulas because of their good properties [7]. In the next section, we prove that the new formula possesses not only a sufficient descent property but also a trust region feature; the sufficient descent property is good for convergence, and the trust region feature makes the convergence easy to prove. A computational sketch of the update (7) is given below; the steps of the proposed algorithm then follow.
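Under the reconstruction of (7) above, the direction update can be sketched as follows; the name direction_mtths and the zero-denominator safeguard are illustrative assumptions, not part of (7) itself:

```python
import numpy as np

def direction_mtths(g_new, g_old, d_old, mu1, mu2, mu3):
    """Modified three-term HS direction (7) for k >= 1."""
    y = g_new - g_old                                   # y_k = g_{k+1} - g_k
    denom = max(mu1 * np.linalg.norm(d_old) * np.linalg.norm(y),
                mu2 * np.linalg.norm(d_old) ** 2,
                mu3 * np.linalg.norm(y) ** 2)
    if denom == 0.0:                                    # safeguard: fall back to steepest descent
        return -g_new
    return -g_new + ((g_new @ y) * d_old - (g_new @ d_old) * y) / denom
```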

Algorithm 1 (the modified three-term HS conjugate gradient algorithm (M-TT-HS-A)).
Step 1. Given $x_{0}\in\mathbb{R}^{n}$, $\epsilon\in(0,1)$, $\delta\in(0,\frac{1}{2})$, $\delta_{1}\in(0,\delta)$, $\sigma\in(\delta,1)$, and positive constants $\mu_{1}$, $\mu_{2}$, $\mu_{3}$. Let $d_{0}=-g_{0}=-\nabla f(x_{0})$ and $k:=0$.
Step 2. If $\|g_{k}\|\le\epsilon$ holds, stop.
Step 3. Find $\alpha_{k}$ by the YWL line search satisfying (5) and (6).
Step 4. Let $x_{k+1}=x_{k}+\alpha_{k}d_{k}$.
Step 5. Compute the direction $d_{k+1}$ by (7).
Step 6. The algorithm stops if $\|g_{k+1}\|\le\epsilon$ holds.
Step 7. Let $k:=k+1$ and go to Step 3.
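The steps above translate into a short driver. The following is a minimal sketch that reuses ywl_accept and direction_mtths from the earlier sketches; the halving backtracking used to locate a steplength satisfying (5) and (6) is an illustrative simplification of a real YWL search, not the authors' implementation:

```python
import numpy as np

def m_tt_hs_a(f, grad, x0, mu1, mu2, mu3, delta, delta1, sigma,
              eps=1e-6, max_iter=800):
    """Minimal sketch of Algorithm 1 (M-TT-HS-A)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # Step 1: d_0 = -g_0
    for _ in range(max_iter):
        if np.linalg.norm(g) <= eps:         # Steps 2/6: stopping test
            break
        alpha = 1.0                          # Step 3: find alpha_k for (5)-(6)
        while alpha > 1e-12 and not ywl_accept(f, grad, x, d, alpha,
                                               delta, delta1, sigma):
            alpha *= 0.5                     # simple halving (illustrative)
        x = x + alpha * d                    # Step 4: next iterate
        g_new = grad(x)
        d = direction_mtths(g_new, g, d, mu1, mu2, mu3)  # Step 5: direction (7)
        g = g_new                            # Step 7: k := k + 1
    return x
```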

3. Sufficient Descent Property, Trust Region Feature, and Global Convergence

This section will prove some properties of Algorithm 1.

Lemma 2. Let the search direction $d_{k+1}$ be designed by (7). Then the following two relations hold:
$$g_{k+1}^{T}d_{k+1}=-\|g_{k+1}\|^{2} \tag{8}$$
and
$$\|d_{k+1}\|\le c_{*}\|g_{k+1}\|, \tag{9}$$
where $c_{*}>0$ is a constant.

Proof. If $k=0$, it is easy to have (8) and (9) since $d_{0}=-g_{0}$. If $k\ge 1$, by formula (7), we have
$$g_{k+1}^{T}d_{k+1}=-\|g_{k+1}\|^{2}+\frac{g_{k+1}^{T}y_{k}\,g_{k+1}^{T}d_{k}-g_{k+1}^{T}d_{k}\,g_{k+1}^{T}y_{k}}{\max\{\mu_{1}\|d_{k}\|\|y_{k}\|,\ \mu_{2}\|d_{k}\|^{2},\ \mu_{3}\|y_{k}\|^{2}\}}=-\|g_{k+1}\|^{2} \tag{10}$$
and
$$\|d_{k+1}\|\le\|g_{k+1}\|+\frac{\|g_{k+1}\|\|y_{k}\|\|d_{k}\|+\|g_{k+1}\|\|d_{k}\|\|y_{k}\|}{\mu_{1}\|d_{k}\|\|y_{k}\|}=\Big(1+\frac{2}{\mu_{1}}\Big)\|g_{k+1}\|.$$
Then (8) holds, as well as (9), by letting $c_{*}=1+\frac{2}{\mu_{1}}$. This completes the proof.
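As a quick sanity check of Lemma 2, relations (8) and (9) can also be verified numerically with the direction_mtths sketch above on random data; this is an illustration under the reconstruction of (7), not a substitute for the proof:

```python
rng = np.random.default_rng(0)
mu1 = mu2 = mu3 = 1.0                        # any positive constants work
for _ in range(1000):
    g_old, g_new, d_old = rng.standard_normal((3, 50))
    d_new = direction_mtths(g_new, g_old, d_old, mu1, mu2, mu3)
    # (8): exact sufficient descent, g_{k+1}^T d_{k+1} = -||g_{k+1}||^2
    assert abs(g_new @ d_new + g_new @ g_new) < 1e-8
    # (9): trust region bound with c_* = 1 + 2/mu1
    assert np.linalg.norm(d_new) <= (1 + 2 / mu1) * np.linalg.norm(g_new) + 1e-8
```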

Relation (8) shows that the new formula has a sufficient descent property, and inequality (9) shows that the new formula possesses a trust region feature. Both properties (8) and (9) are good theoretical characteristics, and they play an important role in the global convergence of a conjugate gradient algorithm. The following global convergence theory will make this clear.

The following general assumptions are needed.

Assumption A. (i) The defined level set $L_{0}=\{x\in\mathbb{R}^{n}: f(x)\le f(x_{0})\}$ is bounded.
(ii) The objective function $f$ is bounded below, twice continuously differentiable, and its gradient $g(x)=\nabla f(x)$ is Lipschitz continuous; namely, the following inequality is true:
$$\|g(x)-g(y)\|\le L\|x-y\|,\quad \forall x,y\in\mathbb{R}^{n}, \tag{11}$$
where $L>0$ is the Lipschitz constant.

By Lemma 2 and Assumption A, similar to [9], it is not difficult to show that the YWL line search technique is reasonable (i.e., a steplength satisfying (5) and (6) exists) and that Algorithm 1 is well defined; we omit the details. Now, we prove the global convergence of Algorithm 1 for nonconvex functions.

Theorem 3. Let Assumption A hold, and let the iterate sequence $\{x_{k}\}$ be generated by M-TT-HS-A. Then the relation
$$\lim_{k\to\infty}\|g_{k}\|=0 \tag{12}$$
is true.

Proof. By (5), (8), and (9), we obtain
$$f(x_{k+1})\le f(x_{k})+\delta\alpha_{k}g_{k}^{T}d_{k}+\alpha_{k}\min\Big\{-\delta_{1}g_{k}^{T}d_{k},\ \frac{\delta\alpha_{k}\|d_{k}\|^{2}}{2}\Big\}\le f(x_{k})-(\delta-\delta_{1})\alpha_{k}\|g_{k}\|^{2}. \tag{13}$$
Summing these inequalities from $k=0$ to $\infty$ and using Assumption A(ii) generate
$$(\delta-\delta_{1})\sum_{k=0}^{\infty}\alpha_{k}\|g_{k}\|^{2}\le\sum_{k=0}^{\infty}\big[f(x_{k})-f(x_{k+1})\big]<\infty. \tag{14}$$
Inequality (14) implies that
$$\lim_{k\to\infty}\alpha_{k}\|g_{k}\|^{2}=0 \tag{15}$$
is true. By (6) and (8) again, we get
$$g_{k+1}^{T}d_{k}\ge\sigma g_{k}^{T}d_{k}+\min\big\{-\delta_{1}g_{k}^{T}d_{k},\ \delta\alpha_{k}\|d_{k}\|^{2}\big\}\ge\sigma g_{k}^{T}d_{k}. \tag{16}$$
Thus, the inequality
$$(1-\sigma)\|g_{k}\|^{2}=-(1-\sigma)g_{k}^{T}d_{k}\le(g_{k+1}-g_{k})^{T}d_{k}\le L\alpha_{k}\|d_{k}\|^{2} \tag{17}$$
holds, where the first equality follows from (8) and the last inequality follows from (11). Then, by (9), we have
$$\alpha_{k}\ge\frac{(1-\sigma)\|g_{k}\|^{2}}{L\|d_{k}\|^{2}}\ge\frac{1-\sigma}{Lc_{*}^{2}}>0. \tag{18}$$
By (15) and (18), we have
$$\lim_{k\to\infty}\|g_{k}\|^{2}\le\frac{Lc_{*}^{2}}{1-\sigma}\lim_{k\to\infty}\alpha_{k}\|g_{k}\|^{2}=0. \tag{19}$$
Therefore, we get (12), and the proof is complete.

4. Numerical Results

This section gives numerical results of Algorithm 1 and two similar algorithms used for comparison; the comparison algorithms are listed as follows.

Algorithm 2 (the normal three-term formula [8] under the YWL technique).
Step 1. Given $x_{0}\in\mathbb{R}^{n}$ and $\epsilon\in(0,1)$. Let $d_{0}=-g_{0}=-\nabla f(x_{0})$ and $k:=0$.
Step 2. If $\|g_{k}\|\le\epsilon$ holds, stop.
Step 3. Find $\alpha_{k}$ by the YWL line search satisfying (5) and (6).
Step 4. Let $x_{k+1}=x_{k}+\alpha_{k}d_{k}$.
Step 5. Compute the direction $d_{k+1}$ by the normal three-term formula
$$d_{k+1}=-g_{k+1}+\frac{g_{k+1}^{T}y_{k}\,d_{k}-g_{k+1}^{T}d_{k}\,y_{k}}{d_{k}^{T}y_{k}},\qquad d_{0}=-g_{0}. \tag{20}$$
Step 6. The algorithm stops if $\|g_{k+1}\|\le\epsilon$ holds.
Step 7. Let $k:=k+1$ and go to Step 3.
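For reference, a sketch of the normal three-term direction (20) used by Algorithms 2 and 3; the safeguard for $d_{k}^{T}y_{k}=0$ is an added assumption, not part of (20):

```python
def direction_tths(g_new, g_old, d_old):
    """Normal three-term HS direction (20)."""
    y = g_new - g_old
    dy = float(d_old @ y)                    # d_k^T y_k
    if dy == 0.0:                            # safeguard, not part of (20)
        return -g_new
    return -g_new + ((g_new @ y) * d_old - (g_new @ d_old) * y) / dy
```

Note that (20) is exactly (7) with the denominator $\max\{\mu_{1}\|d_{k}\|\|y_{k}\|,\ \mu_{2}\|d_{k}\|^{2},\ \mu_{3}\|y_{k}\|^{2}\}$ replaced by $d_{k}^{T}y_{k}$, which is what loses the trust region bound (9).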

Algorithm 3 (the normal three-term formula [8] under the WWP technique).
Step 1. Given $x_{0}\in\mathbb{R}^{n}$ and $\epsilon\in(0,1)$. Let $d_{0}=-g_{0}=-\nabla f(x_{0})$ and $k:=0$.
Step 2. If $\|g_{k}\|\le\epsilon$ holds, stop.
Step 3. Find $\alpha_{k}$ by the WWP line search satisfying
$$f(x_{k}+\alpha_{k}d_{k})\le f(x_{k})+\delta\alpha_{k}g_{k}^{T}d_{k}$$
and
$$g(x_{k}+\alpha_{k}d_{k})^{T}d_{k}\ge\sigma g_{k}^{T}d_{k}.$$
Step 4. Let $x_{k+1}=x_{k}+\alpha_{k}d_{k}$.
Step 5. Compute the direction $d_{k+1}$ by (20).
Step 6. The algorithm stops if $\|g_{k+1}\|\le\epsilon$ holds.
Step 7. Let $k:=k+1$ and go to Step 3.
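The WWP test in Step 3 is the classical pair of weak Wolfe-Powell inequalities; a minimal sketch, mirroring ywl_accept above:

```python
def wwp_accept(f, grad, x, d, alpha, delta, sigma):
    """Weak Wolfe-Powell conditions used in Step 3 of Algorithm 3."""
    gd = float(grad(x) @ d)
    armijo = f(x + alpha * d) <= f(x) + delta * alpha * gd    # sufficient decrease
    curvature = float(grad(x + alpha * d) @ d) >= sigma * gd  # weak curvature
    return armijo and curvature
```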

4.1. Problems and Experiment

The following are some notes.

Test Problems. The problems and the related initial points are listed in Table 1; detailed descriptions of the problems can be found in Andrei [12], and these problems are also used in other papers [13].

Experiments. All codes are run on a PC with an Intel(R) Xeon(R) E5507 CPU @ 2.27 GHz and 6.00 GB of memory under the Windows 7 operating system, and are written in MATLAB R2009a.

Parameters. The line search parameters $\delta$, $\delta_{1}$, and $\sigma$ in (5) and (6) and the constants $\mu_{1}$, $\mu_{2}$, and $\mu_{3}$ in (7) are fixed for all test problems.

Dimension. Large-scale dimensions of up to $n=30000$ variables are tested.

Stop Rules. The Himmelblau stop rule is used: if $|f(x_{k})|>e_{1}$, let $stop1=\frac{|f(x_{k})-f(x_{k+1})|}{|f(x_{k})|}$; otherwise, set $stop1=|f(x_{k})-f(x_{k+1})|$. If $\|g(x_{k})\|<\epsilon$ or $stop1<e_{2}$ holds, the program stops, where $e_{1}$, $e_{2}$, and $\epsilon$ are given positive tolerances.
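A minimal sketch of this stop rule, with the tolerances $e_{1}$, $e_{2}$, and $\epsilon$ supplied by the caller (the function name is illustrative):

```python
def should_stop(f_k, f_k1, g_norm, e1, e2, eps):
    """Stop rule of Section 4.1: relative or absolute decrease of f,
    or a small gradient norm."""
    if abs(f_k) > e1:
        stop1 = abs(f_k - f_k1) / abs(f_k)   # relative decrease of f
    else:
        stop1 = abs(f_k - f_k1)              # absolute decrease of f
    return g_norm < eps or stop1 < e2
```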

Other Cases. The line search technique accepts the current trial steplength if the number of search steps exceeds 6, and the algorithm stops if the total number of iterations exceeds 800.

The numerical results are listed in Table 2, where
"Number" is the index of the tested problem;
"Dim." is the problem dimension;
"NI" is the total number of iterations;
"CPU" is the CPU time in seconds;
"NFG" is the total number of function and gradient evaluations.

4.2. Results and Discussion

We use the performance profile tool of Dolan and Moré [14] to analyze the efficiency of the three given algorithms. Figures 1 and 2 show that Algorithm 1 performs best and is the most robust of the three methods, and that Algorithm 2 is better than Algorithm 3; this indicates that the given formula (7) is competitive with the normal three-term conjugate gradient formula (20) and that the YWL line search technique is more effective than the normal WWP technique, which coincides with the results of [9]. In Figure 3, Algorithm 1 is competitive with the other two algorithms and has the best robustness; however, Figure 3 also shows that Algorithm 1 is not as dominant there, and we think the reason lies in formula (7) or the YWL technique, since more information is needed and hence more CPU time is necessary.
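For readers who want to reproduce this comparison, the Dolan-Moré profile used in Figures 1-3 can be computed as in the following sketch, where t[p, s] holds the cost (NI, NFG, or CPU) of solver s on problem p and np.inf marks a failure; this is our illustration of the standard definition, not the authors' plotting code:

```python
import numpy as np

def performance_profile(t, taus):
    """Dolan-More profile rho_s(tau): the fraction of problems on which
    solver s is within a factor tau of the best solver."""
    r = t / t.min(axis=1, keepdims=True)     # performance ratios r_{p,s}
    return np.array([[np.mean(r[:, s] <= tau) for tau in taus]
                     for s in range(t.shape[1])])
```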

5. Conclusions

This paper proposes a modified HS three-term conjugate gradient algorithm for large-scale optimization problems, and the given algorithm has some good features.

(1) The modified HS three-term conjugate gradient direction possesses a trust region property, which makes the global convergence for general functions easy to obtain. By contrast, the normal HS formula, like many other conjugate gradient formulas, does not have this feature, and this may be the crucial point for establishing global convergence for general functions.

(2) The largest dimension of the test problems is 30000 variables, and the numerical results show that the presented algorithm is competitive with other similar methods. More experiments will be done in the future to further demonstrate the performance of the proposed algorithm.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Authors’ Contributions

Mr. Xiangrong Li and Dr. Songhua Wang wrote this paper in English. Dr. Zhongzhou Jin and Dr. Hongtruong Pham carried out the experiment. All the authors read and approved the final manuscript.

Acknowledgments

This work is supported by the National Natural Science Foundation of China (Grants nos. 11661009 and 11261006), the Guangxi Science Fund for Distinguished Young Scholars (no. 2015GXNSFGA139001), the Guangxi Natural Science Key Fund (no. 2017GXNSFDA198046), and the Basic Ability Promotion Project of Guangxi Young and Middle-Aged Teachers (no. 2017KY0019).