Mathematical Problems in Engineering

Mathematical Problems in Engineering / 2015 / Article

Research Article | Open Access

Volume 2015 |Article ID 839659 | https://doi.org/10.1155/2015/839659

Guanghui Zhou, Qin Ni, "A Spectral Dai-Yuan-Type Conjugate Gradient Method for Unconstrained Optimization", Mathematical Problems in Engineering, vol. 2015, Article ID 839659, 7 pages, 2015. https://doi.org/10.1155/2015/839659

A Spectral Dai-Yuan-Type Conjugate Gradient Method for Unconstrained Optimization

Academic Editor: Paolo Maria Mariano
Received01 Oct 2015
Accepted13 Dec 2015
Published20 Dec 2015

Abstract

A new spectral conjugate gradient method (SDYCG) is presented for solving unconstrained optimization problems in this paper. Our method provides a new expression of spectral parameter. This formula ensures that the sufficient descent condition holds. The search direction in the SDYCG can be viewed as a combination of the spectral gradient and the Dai-Yuan conjugate gradient. The global convergence of the SDYCG is also obtained. Numerical results show that the SDYCG may be capable of solving large-scale nonlinear unconstrained optimization problems.

1. Introduction

As well known, a great deal of issues, which are studied in scientific research fields, can be translated to unconstrained optimization problems. The spectral conjugate gradient (SCG) method does nice jobs among various algorithms for solving nonlinear optimization problems. The spectral conjugate gradient combines the spectral gradient and the conjugate gradient. To the SCG method, the choice of spectral parameter is crucially important. In this paper, we propose a new spectral conjugate gradient method based on the Dai-Yuan conjugate gradient method by providing a new spectral parameter. Our purpose is to obtain an efficient algorithm for the unconstrained optimization.

An unconstrained optimization problem is customarily expressed asThe nonlinear function considered in this paper is continuously differentiable; the gradient of is denoted by . We usually impose the following properties on function .(P1)The function is bounded below and is continuously differentiable in a neighbourhood of the level set , where is the starting point.(P2)The gradient of is Lipschitz continuous in ; that is, there exists a constant , such that for all , .

Generally, a sequence is obtained in an algorithm for solving (1) and has the following format:where is a search direction and is the step size. At each iterative point , we usually determine firstly and then compute by some principles.

There are different ways to determine the direction . In the classical steepest-descent method, . In the conjugate gradient (CG) method, is of the form where is a scalar parameter characterizing the conjugate gradient method. The best-known expressions of are Hestenes-Stiefel (HS) [1], Fletcher-Reeves (FR) [2], Polak-Ribiere-Polyak (PRP) [3, 4], and Dai-Yuan (DY) [5] formulas. They are defined by respectively, where denotes the Euclidean norm and .

There also are some approaches to determine the step size in (2). Unfortunately, the steepest-descent method performs poorly. Barzilai and Borwein improved the steepest-descent method greatly by providing a spectral choice of step size in [6]. Their algorithm has the form where or with . Many algorithms are proved convergent under the Wolfe condition; that is, the step size satisfieswith .

In recent years, some scholars developed a new method—spectral conjugate gradient (SCG) method—for solving (1). For example, Raydan introduced the spectral gradient method for large-scale unconstrained optimization in [7]. He combined a nonmonotone line search strategy that guarantees global convergence with the Barzilai and Borwein method. Utilizing spectral gradient and conjugate gradient ideas, Birgin and Martinez proposed a spectral conjugate gradient method in [8]. In their algorithm, the search direction has the formIn [8], the best combination of this formula, the scaling, and the initial choice of step-length is also studied. Following [8], some papers discussed the various choices of the spectral parameter based on different . For example, Du and Chen [9] gave a modified spectral FR conjugate gradient method with Wolfe-type line search based on FR formula. Their spectral parameters and are expressed asYu et al. [10] presented a modification of spectral Perry’s conjugate gradient formula, which possessed the sufficient descent property independent of line search condition. Their search direction is defined by (8) and has the formwhereLiu and Li [11] proposed a spectral DY-type projection method for nonlinear monotone systems of equations. The direction is also determined by (8) and the parameters are defined by

We will propose a new SCG method based on the Dai-Yuan-type conjugate gradient method in this paper. A new selection of is introduced in our algorithm such that the sufficient descent condition holds. In addition, the global convergence of the new method is obtained.

The present paper is organized as follows. In Section 2, we outline our new method for unconstrained nonlinear optimization, and we show that the sufficient descent condition holds under mild assumptions. The global convergence is proved in Section 3, while the numerical results compared with CG-DESCENT are given in Section 4. At last section, we draw some conclusions about our new spectral conjugate gradient method.

2. Spectral Dai-Yuan-Type Conjugate Gradient Method

In this paper, we consider the spectral conjugate gradient method in which the search direction is of the formwhereand the scalar parameter is defined by (4). This is a new spectral conjugate gradient method for solving problem (1) because the expression (14) of spectral parameter is completely different from those in other papers. The search direction (13) is a combination of the spectral gradient and the Dai-Yuan conjugate gradient. We hope that (14) may be an efficient choice.

In order to obtain the global convergence of our method, we assume that the step size satisfies the strong Wolfe condition; that is, the step size satisfies (6) andwith . It is easy to see that (7) holds if (15) holds.

The following is a detailed description of the spectral Dai-Yuan-type conjugate gradient (SDYCG) algorithm.

SDYCG Algorithm

Step 1 (initialization). Choose , set , and take .

Step 2 (check the convergence condition). Compute ; if , then stop.

Step 3 (form the search direction). If , then . Else, compute and by formulas (14) and (4), respectively; then compute by (13) and (14).

Step 4 (line search). Find which satisfies the strong Wolfe conditions (6) and (15).

Step 5 (compute the new point). Set and and go to Step 2.

The framework of the SDYCG algorithm is similar to other spectral conjugate gradient algorithms. However we choose a different spectral parameter (see (14)) which is the main difference between SDYCG and the others.

The global convergence of SDYCG algorithm will be given in the next section. Before that, we will show that the search direction (13) can ensure the sufficient descent condition.

Lemma 1. Suppose that the sequence is generated by the SDYCG algorithm; thenfor all .

Proof. Since is calculated by formula (13), we can get, if , that whereas when , from (14), we find Furthermore,From the second formula of (13), we obtainThe proof is completed.

This lemma gives the fact that the direction is a descent direction. Besides, possesses the following property.

Lemma 2. Suppose that the SDYCG algorithm is implemented with the step size that satisfies the strong Wolfe conditions (6) and (15). If for all , then

Proof. By (14) and , we haveWith (15) and (21), we get Therefore, inequality (20) follows. This completes the proof.

Inequality (20) gives the close relationship of the inner product of gradient and direction between the adjacent two iterations. It will play an important role in our global convergence analysis.

3. Convergence Analysis

Dai and Yuan stated in [5] that the following result had been essentially proved by Zoutendijk and Wolfe.

Lemma 3. Suppose that the function has the properties and . Assume that is a descent direction and is obtained by the Wolfe conditions (6) and (7). ThenOne customarily calls (23) the Zoutendijk condition.

Theorem 4. Suppose that the function has the properties and . Sequences , , and are generated by SDYCG algorithm. Then either for some or

Proof. Suppose that for all and (24) is not true. Then there exists a constant , such thatfor all of the iterations.
The second equality of (13) implieswe getDividing both sides by , we have Combining (20) in Lemma 2, we see the inequalitySumming both sides, we obtainSo, from (25) and (30), we getThis relation is equivalent toSumming over , we obtainFrom the SDYCG algorithm, the step size satisfies the strong Wolfe condition, so the Wolfe condition (7) holds. And the directions obtained by the algorithm are descent from Lemma 1. But the last equality contradicts the Zoutendijk condition (23). Hence, our original assertion (25) must be false, giving that either for some or (24) holds.

4. Numerical Results

In order to test the numerical performance of the SDYCG algorithm, we choose some unconstrained problems with the initial points from CUTEr library [12, 13]. They are listed in Table 1.


Number Function name

1 ENGVAL1
2 FLETCBV2
3 TOINTGSS
4 COSINE
5 ARWHEAD
6 EDENSCH
7 EG2
8 GENROSE
9 LIARWHD
10 Generalized White & Holst
11 Extended Wood
12 Extended quadratic penalty QP1
13 BDEXP
14 HIMMELBG
15 Hager
16 Extended TET
17 Diagonal 5
18 Extended Himmelblau
19 Diagonal 6
20 Extended DENSCHNF
21 LIARWHD
22 Extended BD1
23 Extended Hiebert
24 Extended Tridiagonal 2
25 QUARTC
26 Extended DENSCHNB
27 Extended Rosenbrock
28 Raydan 2
29 Diagonal 2
30 Diagonal 4
31 Extended Maratos
32 Quadratic QF1
33 Extended quadratic exponential EP1
34 DQDRTIC
35 NONSCOMP
36 Extended Freudenstein & Roth
37 Extended White & Holst
38 Raydan 1
39 Extended Tridiagonal 1
40 Extended Cliff
41 Extended Trigonometric
42 Extended Beale
43 Generalized Tridiagonal 1
44 Generalized PSC1
45 Extended PSC1
46 Extended Powell
47 BDQRTIC
48 FLETCBV3
49 FLETCHCR
50 FREUROTH
51 GENHUMPS
52 NONDIA
53 NONDQUAR
54 SROSENBR
55 TQUARTIC
56 Extended Penalty

The experiments are run on a personal computer with a 64-bit processor, 2.5 GHz of CPU, and 4 GB of RAM memory. All the codes are written in MATLAB language and are compiled with this software.

We would like to compare the SDYCG with the CG-DESCENT. The CG-DESCENT is a conjugate gradient algorithm with guaranteed descent proposed by Hager and Zhang in [14]. It has been proven an excellent algorithm in recent years.

To make the comparison as fair as possible, we use the criterion to terminate the executions and impose restriction on the number of iterations less than in both algorithms. All the step sizes satisfy the strong Wolfe conditions (6) and (15).

We use the performance profiles proposed by Dolan and Moré [15] to show the efficiency of comparisons. Performance profiles can be used as a tool for benchmarking and comparing optimization software. The performance profile for a solver is the (cumulative) distribution function for a performance metric. For example, if computing time is chosen as a metric, then we compute the ratio of the computing time of the solver versus the best time of all of the solvers. That is, for each method, we plot the fraction (-axis) of problems for which the method is within a factor (-axis) of the best time. The curve of a solver being above others means that it has the highest probability of being the optimal solver. We use a scale for to capture the performance of all the solvers.

In order to observe the numerical results of the SDYCG and the CG-DESCENT, we choose three different dimensions of each test function. The dimensions are , , and , respectively. According to the numerical results obtained in every dimension, we plot two figures based on CPU time and iterations, respectively.

We can find from Figure 1 that the SDYCG is similar to the CG-DESCENT when because their curves crosses each other. The predominance of the SDYCG appears in Figure 2 when . If the test dimension is chosen as , the SDYCG is better than the CG-DESCENT from the fact that its curve is almost completely above that of the CG-DESCENT in Figure 3.

Furthermore, we are interested in the robustness of our SDYCG algorithm. Ten problems are selected from CUTEr to be tested. The numerical results listed in Table 2 are obtained by changing the initial iteration point every time. “” represents the standard initial iteration point of the problem; “iter.” and “time(s)” indicate the iterative number and time (in seconds).


Number Function name
Iter. Time(s) Iter. Time(s)Iter. Time(s)

1 DQDRTIC 2450.3436 290 0.3951 338 0.4567
2 QUARTC 1 0.0012 28 0.1273 27 0.1482
3 Diagonal 65 0.0176 8 0.0267 4 0.0153
4 Extended DENSCHNB 13 0.0110 18 0.0159 16 0.0177
5 Extended DENSCHNF 22 0.0636 17 0.0519 24 0.0820
6 LIARWHD 12 0.0186 11 0.0193 17 0.0320
7 EDENSCH 30 0.1897 30 0.18753 30 0.1860
8 EG2 9 0.0404 9 0.0392 9 0.0404
9 ENGVAL1 28 0.0893 29 0.1064 31 0.1294
10FLETCBV2 53 0.1080 53 0.1377 53 0.1355

The conclusion that can be drawn is that the SDYCG is a robust algorithm and it may be capable of solving large-scale nonlinear unconstrained optimization problem.

5. Conclusions

We propose a new spectral conjugate gradient method for nonlinear unconstrained optimization. This method, which we call the SDYCG, is built based on the Dai-Yuan conjugate gradient method. A new spectral choice is provided in the search direction. Numerical results show that the SDYCG is comparable with the CG-DESCENT. The SDYCG algorithm may be capable of solving large-scale nonlinear unconstrained optimization problems.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (11471159), the Natural Science Foundation of Jiangsu Province (BK20141409), and the Foundation of Education Department of Anhui Province (2014jyxm161).

References

  1. M. R. Hestenes and E. Stiefel, “Methods of conjugate gradients for solving linear systems,” Journal of Research of the National Bureau of Standards, vol. 49, pp. 409–436 (1953), 1952. View at: Publisher Site | Google Scholar | Zentralblatt MATH | MathSciNet
  2. R. Fletcher and C. M. Reeves, “Function minimization by conjugate gradients,” The Computer Journal, vol. 7, pp. 149–154, 1964. View at: Publisher Site | Google Scholar | MathSciNet
  3. E. Polak and G. Ribiere, “Note sur la convergence de méthodes de directions conjuguées,” Revue Française d'Informatique et de Recherche Opérationnelle Série Rouge, vol. 3, pp. 35–43, 1969. View at: Google Scholar
  4. B. T. Polyak, “The conjugate gradient method in extremal problems,” USSR Computational Mathematics and Mathematical Physics, vol. 9, no. 4, pp. 94–112, 1969. View at: Publisher Site | Google Scholar | Zentralblatt MATH
  5. Y. H. Dai and Y. Yuan, “A nonlinear conjugate gradient method with a strong global convergence property,” SIAM Journal on Optimization, vol. 10, no. 1, pp. 177–182, 1999. View at: Publisher Site | Google Scholar | MathSciNet
  6. J. Barzilai and J. M. Borwein, “Two-point step size gradient methods,” IMA Journal of Numerical Analysis, vol. 8, no. 1, pp. 141–148, 1988. View at: Publisher Site | Google Scholar | Zentralblatt MATH | MathSciNet
  7. M. Raydan, “The Barzilai and BORwein gradient method for the large scale unconstrained minimization problem,” SIAM Journal on Optimization, vol. 7, no. 1, pp. 26–33, 1997. View at: Publisher Site | Google Scholar | Zentralblatt MATH | MathSciNet
  8. E. G. Birgin and J. M. Martinez, “A spectral conjugate gradient method for unconstrained optimization,” Applied Mathematics and Optimization, vol. 43, no. 2, pp. 117–128, 2001. View at: Publisher Site | Google Scholar | MathSciNet
  9. S.-Q. Du and Y.-Y. Chen, “Global convergence of a modified spectral FR conjugate gradient method,” Applied Mathematics and Computation, vol. 202, no. 2, pp. 766–770, 2008. View at: Publisher Site | Google Scholar | MathSciNet
  10. G. Yu, L. Guan, and W. Chen, “Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization,” Optimization Methods & Software, vol. 23, no. 2, pp. 275–293, 2008. View at: Publisher Site | Google Scholar | MathSciNet
  11. J. Liu and S. Li, “Spectral DY-type projection method for nonlinear monotone systems of equations,” Journal of Computational Mathematics, vol. 33, no. 4, pp. 341–355, 2015. View at: Publisher Site | Google Scholar | MathSciNet
  12. N. I. Gould, D. Orban, and P. L. Toint, “CUTEr and SifDec: a constrained and unconstrained testing environment, revisited,” ACM Transactions on Mathematical Software, vol. 29, no. 4, pp. 373–394, 2003. View at: Google Scholar
  13. N. Andrei, “An unconstrained optimization test functions collection,” Advanced Modeling and Optimization, vol. 10, no. 1, pp. 147–161, 2008. View at: Google Scholar | MathSciNet
  14. W. W. Hager and H. Zhang, “A new conjugate gradient method with guaranteed descent and an efficient line search,” SIAM Journal on Optimization, vol. 16, no. 1, pp. 170–192, 2005. View at: Publisher Site | Google Scholar | MathSciNet
  15. E. D. Dolan and J. J. Moré, “Benchmarking optimization software with performance profiles,” Mathematical Programming, vol. 91, no. 2, pp. 201–213, 2002. View at: Publisher Site | Google Scholar | MathSciNet

Copyright © 2015 Guanghui Zhou and Qin Ni. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


More related articles

 PDF Download Citation Citation
 Download other formatsMore
 Order printed copiesOrder
Views917
Downloads583
Citations

Related articles

We are committed to sharing findings related to COVID-19 as quickly as possible. We will be providing unlimited waivers of publication charges for accepted research articles as well as case reports and case series related to COVID-19. Review articles are excluded from this waiver policy. Sign up here as a reviewer to help fast-track new submissions.