A Modified Three-Term Type CD Conjugate Gradient Algorithm for Unconstrained Optimization Problems

Zhan Wang, Pengyuan Li, Xiangrong Li, Hongtruong Pham

Mathematical Problems in Engineering, vol. 2020, Article ID 4381515, 14 pages, 2020. https://doi.org/10.1155/2020/4381515

Guest Editor: Maojun Zhang
Received: 26 Jul 2020; Accepted: 13 Aug 2020; Published: 04 Sep 2020

Abstract

Conjugate gradient methods are well-known methods that are widely applied in many practical fields, and the CD conjugate gradient method is one of the classical types. In this paper, a modified three-term type CD conjugate gradient algorithm is proposed, with the following good features: (i) a modified three-term type CD conjugate gradient formula is presented; (ii) the given algorithm possesses the sufficient descent property and the trust region property; (iii) the algorithm is globally convergent with the modified weak Wolfe–Powell (MWWP) line search technique and a projection technique for general functions. Numerical experiments show that the modified three-term type CD conjugate gradient method is more competitive than the classical CD conjugate gradient method.

1. Introduction

Consider the unconstrained optimization problem
$$\min_{x \in \mathbb{R}^n} f(x),$$
where $f: \mathbb{R}^n \to \mathbb{R}$ is a continuously differentiable function. This kind of model is often used to solve problems in applied mathematics, economics, engineering, and so on. Generally, the following iteration formula is used to generate the next iteration point:
$$x_{k+1} = x_k + \alpha_k d_k,$$
where $x_{k+1}$ and $x_k$ denote the next and the current iteration points, respectively, $\alpha_k$ is a step length, and $d_k$ is the search direction. The search direction generated by the conjugate gradient (CG) method is defined by the following formula:
$$d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0,$$
where $g_k = \nabla f(x_k)$ is the gradient of $f$ at $x_k$ and $\beta_k$ is a parameter. Different choices of $\beta_k$ generate different CG methods [1-10]. There are six classical forms of $\beta_k$:
$$\beta_k^{PRP} = \frac{g_{k+1}^T y_k}{\|g_k\|^2}, \quad \beta_k^{HS} = \frac{g_{k+1}^T y_k}{d_k^T y_k}, \quad \beta_k^{LS} = \frac{g_{k+1}^T y_k}{-d_k^T g_k},$$
$$\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \quad \beta_k^{CD} = \frac{\|g_{k+1}\|^2}{-d_k^T g_k}, \quad \beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^T y_k},$$
where $y_k = g_{k+1} - g_k$ and $\|\cdot\|$ is the Euclidean norm. These formulas can be divided into two categories: one includes the PRP, HS, and LS methods, which have good numerical performance; the other includes the FR, CD, and DY methods, which have good theoretical convergence. Many scholars have applied these methods to nonlinear monotone equations and standard optimization problems, and good results have been achieved [11-16].
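To fix ideas, here is a minimal MATLAB sketch of one CG iteration using the classical CD parameter; the handle `fun` (returning the function value and gradient) and the step size `alpha` are assumptions supplied by the caller:

```matlab
% One conjugate gradient iteration with the classical CD parameter (sketch).
% fun:   handle returning [f, g] at a point; x, d: current point/direction;
% alpha: step size produced by some line search (assumed given here).
function [x_new, d_new] = cd_step(fun, x, d, alpha)
    [~, g] = fun(x);                       % gradient at the current point
    x_new = x + alpha * d;                 % x_{k+1} = x_k + alpha_k * d_k
    [~, g_new] = fun(x_new);               % gradient at the new point
    beta_cd = norm(g_new)^2 / (-(d' * g)); % CD formula for beta_k
    d_new = -g_new + beta_cd * d;          % d_{k+1} = -g_{k+1} + beta_k * d_k
end
```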

Zhang et al. [17] presented a modified PRP CG formula as follows:
$$d_{k+1} = -g_{k+1} + \frac{g_{k+1}^T y_k}{\|g_k\|^2} d_k - \frac{g_{k+1}^T d_k}{\|g_k\|^2} y_k, \qquad d_0 = -g_0,$$
which satisfies $g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2$ independently of the line search.

They proved that the modified PRP method is globally convergent with an Armijo-type line search.

Yuan et al. [18] proposed another modified PRP CG formula and obtained the global convergence of the modified PRP method with a modified weak Wolfe–Powell (MWWP) line search technique, which was proposed by Yuan et al. [19]:
$$f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^T d_k + \alpha_k \min\left\{-\delta_1 g_k^T d_k, \frac{\delta \alpha_k \|d_k\|^2}{2}\right\}, \quad (7)$$
$$g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k + \min\left\{-\delta_1 g_k^T d_k, \delta \alpha_k \|d_k\|^2\right\}, \quad (8)$$
where $\delta \in (0, 1/2)$, $\delta_1 \in (0, \delta)$, and $\sigma \in (\delta, 1)$. Yuan et al. [16] obtained the global convergence of the PRP method by using the above modified weak Wolfe–Powell line search technique together with a projection technique: with the trial point $z_k = x_k + \alpha_k d_k$,
$$x_{k+1} = x_k - \frac{g(z_k)^T (x_k - z_k)}{\|g(z_k)\|^2} g(z_k), \quad (9)$$
where $x_{k+1}$ is the next point. With this projection technique, any unsatisfactory iteration point generated by the normal PRP algorithm is projected onto a surface to overcome the failure to converge.
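For concreteness, a small MATLAB sketch that checks whether a trial step satisfies the two MWWP conditions (7) and (8) as written above; the handles `f` and `grad` and all parameter names are assumptions of the sketch:

```matlab
% Check the MWWP line search conditions (7)-(8) for a trial step (sketch).
% f, grad: objective and gradient handles; x: current point; d: direction;
% alpha: trial step; delta, delta1, sigma: line search parameters.
function ok = mwwp_ok(f, grad, x, d, alpha, delta, delta1, sigma)
    gd = grad(x)' * d;                                    % g_k' * d_k
    c7 = f(x + alpha * d) <= f(x) + delta * alpha * gd ...
         + alpha * min(-delta1 * gd, delta * alpha * norm(d)^2 / 2); % (7)
    c8 = grad(x + alpha * d)' * d >= sigma * gd ...
         + min(-delta1 * gd, delta * alpha * norm(d)^2);             % (8)
    ok = c7 && c8;
end
```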

Motivated by the above research, a modified three-term CD conjugate gradient algorithm is presented in this paper with (7), (8), and (9). Some good properties are obtained as follows: (i) a modified three-term type CD conjugate gradient formula is presented; (ii) the given algorithm possesses the sufficient descent property and the trust region property; (iii) the algorithm is globally convergent with the MWWP line search technique and the projection technique for general functions.

This paper is organized as follows: the next section introduces the modified CD formula and the corresponding algorithm; Section 3 gives the proof of the global convergence of the new algorithm; numerical experiments are given in Section 4; and some conclusions are presented in Section 5. Throughout this paper, $\|\cdot\|$ denotes the Euclidean norm, and $f(x_k)$ and $g(x_k)$ are abbreviated as $f_k$ and $g_k$, respectively.

2. Motivation and Algorithm

The convergence of the CD conjugate gradient method has been proved [4]; however, its numerical results are worse than those of the PRP method and others. Therefore, it is necessary to propose a new search direction to improve the numerical performance of the CD method. Meanwhile, the sufficient descent property is significant for obtaining the convergence of the conjugate gradient method:
$$g_k^T d_k \le -c \|g_k\|^2, \qquad c > 0. \quad (10)$$

Then, we also hope to propose a new method that possesses this property. Inspired by the above discussion, a modified three-term type CD conjugate gradient formula, denoted (11), is designed: it augments the classical CD direction with a third correction term and restricts the parameters so that both the sufficient descent property (13) and the trust region property (14) in Section 3 hold. An illustrative member of this family is sketched below. The steps of the given algorithm are listed as Algorithm 1 (displayed in Section 3, before Theorem 1).
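As an illustration only (an assumed example of a three-term CD-type direction, not necessarily the authors' exact formula (11)), one can take
$$d_{k+1} = -g_{k+1} + \frac{\|g_{k+1}\|^2 d_k - \left(g_{k+1}^T d_k\right) g_{k+1}}{\max\left\{-d_k^T g_k,\ \mu \|d_k\| \|g_{k+1}\|\right\}}, \qquad d_0 = -g_0, \quad \mu > 0,$$
for which $g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2$ holds by construction and $\|d_{k+1}\| \le (1 + 2/\mu)\|g_{k+1}\|$, mirroring (13) and (14).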

3. Convergence Analysis

In this section, we are going to analyse the convergence of the proposed algorithm. The following assumptions are needed.

Assumption 1. (i) The level set $\Omega = \{x \mid f(x) \le f(x_0)\}$ is bounded. (ii) $f$ is twice continuously differentiable and bounded below, and its gradient $g$ is Lipschitz continuous; that is, there exists a constant $L > 0$ such that
$$\|g(x) - g(y)\| \le L \|x - y\|, \qquad \forall x, y \in \mathbb{R}^n.$$

Lemma 1. Let the search direction $d_{k+1}$ be generated by formula (11); then, the following relations hold:
$$g_{k+1}^T d_{k+1} \le -c_1 \|g_{k+1}\|^2, \quad (13)$$
$$\|d_{k+1}\| \le c_2 \|g_{k+1}\|, \quad (14)$$
where the constants $c_1, c_2 > 0$.

Proof. According to the definition of $d_{k+1}$ in (11), computing the inner product $g_{k+1}^T d_{k+1}$ and using the restriction on the parameter of (11), we obtain $g_{k+1}^T d_{k+1} \le -c_1 \|g_{k+1}\|^2$; letting $c_1$ be the resulting constant, relation (13) is obtained.
Using the definition of the parameter of (11), we analyse the value of $\|d_{k+1}\|$ in two cases:

Case 1. The first branch in the definition of the parameter is active. Similar to Lemma 1 of [20], $\alpha_k \le M$ holds, where $M$ is a scalar; then a bound $\|d_{k+1}\| \le c_2' \|g_{k+1}\|$ follows in this case for some constant $c_2' > 0$.

Case 2. The second branch is active. Then a direct estimate of the second and third terms of (11) again bounds $\|d_{k+1}\|$ by a multiple $c_2'' \|g_{k+1}\|$. Letting $c_2 = \max\{c_2', c_2''\}$, we have $\|d_{k+1}\| \le c_2 \|g_{k+1}\|$; thus, (14) holds. The proof is complete.

Remark 1. The relation (14) shows that the optimization algorithm possesses the trust region feature.
The following theorem is obtained for the global convergence of Algorithm 1.

Algorithm 1: The modified three-term CD conjugate gradient algorithm.
Step 1: Given an initial point $x_0$ and constants $\delta \in (0, 1/2)$, $\delta_1 \in (0, \delta)$, $\sigma \in (\delta, 1)$, $\epsilon > 0$, let $d_0 = -g_0$ and $k := 0$.
 Step 2: If $\|g_k\| \le \epsilon$, then stop; otherwise, proceed to the next step.
 Step 3: Compute the step size $\alpha_k$ by the line search (7) and (8).
 Step 4: Let $z_k = x_k + \alpha_k d_k$.
 Step 5: If the acceptance test holds at $z_k$, let $x_{k+1} = z_k$, $g_{k+1} = g(x_{k+1})$, and go to Step 7; otherwise, go to Step 6.
 Step 6: Let $x_{k+1}$ be defined by the projection (9), $g_{k+1} = g(x_{k+1})$, and $f_{k+1} = f(x_{k+1})$.
 Step 7: If $\|g_{k+1}\| \le \epsilon$, stop; otherwise, proceed to the next step.
 Step 8: Calculate the search direction $d_{k+1}$ by (11).
 Step 9: Let $k := k + 1$, and go to Step 2.
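The accept-or-project branch of Steps 4-6 can be sketched in MATLAB as follows; the acceptance test used in Step 5 of the sketch is an assumption for illustration (the displayed algorithm defines the exact condition):

```matlab
% Steps 4-6 of Algorithm 1 (illustrative sketch).
% f, grad: objective and gradient handles; x: current point; d: direction;
% alpha: step size from the MWWP line search; delta: line search parameter.
function x_next = accept_or_project(f, grad, x, d, alpha, delta)
    z = x + alpha * d;                                % Step 4: trial point z_k
    if f(z) <= f(x) + delta * alpha * (grad(x)' * d)  % Step 5: assumed test
        x_next = z;                                   % accept z_k as x_{k+1}
    else
        gz = grad(z);                                 % Step 6: projection (9)
        x_next = x - ((gz' * (x - z)) / norm(gz)^2) * gz;
    end
end
```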

Theorem 1. Assume that $\{x_k\}$ and $\{d_k\}$ are generated by Algorithm 1 and that Lemma 1 holds. Then,
$$\lim_{k \to \infty} \|g_k\| = 0.$$

Proof. From the line search (8), we have the following:
$$g_{k+1}^T d_k \ge \sigma g_k^T d_k + \min\left\{-\delta_1 g_k^T d_k, \delta \alpha_k \|d_k\|^2\right\} \ge \sigma g_k^T d_k,$$
so that $(g_{k+1} - g_k)^T d_k \ge (\sigma - 1) g_k^T d_k$. According to (ii) of Assumption 1,
$$(g_{k+1} - g_k)^T d_k \le \|g_{k+1} - g_k\| \|d_k\| \le L \alpha_k \|d_k\|^2.$$
Then, we have the following:
$$\alpha_k \ge \frac{(\sigma - 1) g_k^T d_k}{L \|d_k\|^2} = \frac{(1 - \sigma)\left(-g_k^T d_k\right)}{L \|d_k\|^2}.$$
Using Lemma 1,
$$\alpha_k \ge \frac{(1 - \sigma) c_1 \|g_k\|^2}{L c_2^2 \|g_k\|^2} = \frac{(1 - \sigma) c_1}{L c_2^2} > 0.$$
From Assumption 1, line search (7), and the sufficient descent property (13),
$$f_{k+1} \le f_k + (\delta - \delta_1) \alpha_k g_k^T d_k \le f_k - (\delta - \delta_1) c_1 \alpha_k \|g_k\|^2.$$
Summing these inequalities from $k = 0$ to $\infty$,
$$(\delta - \delta_1) c_1 \sum_{k=0}^{\infty} \alpha_k \|g_k\|^2 \le f_0 - \inf_k f_k.$$
From Assumption 1, it is easy to know that $\{f_k\}$ is bounded below, so the series converges. Since $\alpha_k$ is bounded away from zero, $\lim_{k \to \infty} \|g_k\| = 0$. The proof is complete.

4. Numerical Results

This section reports numerical experiments on some classical optimization problems, on the nonlinear Muskingum model, and on image restoration problems. All tests are coded in MATLAB R2014a and run on a PC with a 2.50 GHz CPU and 4.00 GB of memory under the Windows 10 operating system.

4.1. Normal Unconstrained Optimization Problems

In this subsection, numerical experiments are performed on test problems from [20], all of which are listed in Table 1. We compare Algorithm 1 with the classical CD conjugate gradient method (called Algorithm 2) and the classical PRP conjugate gradient method (called Algorithm 3). The detailed experimental data are listed in Table 2. Figures 1-3 show the performance of these three algorithms with respect to CPU time, NI, and NFG. The columns of Tables 1 and 2 and the labels of Figures 1-3 have the following meanings (a MATLAB sketch of the stop rule follows this list):
(i) No.: the serial number of the problem.
(ii) Dim: the dimension of the variable $x$.
(iii) NI: the number of iterations.
(iv) NFG: the total number of function and gradient evaluations.
(v) CPU: the calculation time in seconds.
(vi) Dimension: the dimensions are 3000, 9000, and 15000.
(vii) Initialization: the line search parameters $\delta$, $\delta_1$, and $\sigma$ are chosen as above, and the initial search direction is $d_0 = -g_0$.
(viii) Stop rules: the following Himmelblau stop rule [21] is used: if $|f(x_k)| > e_1$, let $stop1 = |f(x_k) - f(x_{k+1})| / |f(x_k)|$; otherwise, let $stop1 = |f(x_k) - f(x_{k+1})|$. For every problem, if $\|g(x_k)\| < \epsilon$ or $stop1 < e_2$ is satisfied, the program is stopped. The program is also stopped when the number of iterations exceeds one thousand.
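For concreteness, the Himmelblau stop rule just described can be sketched in MATLAB as follows; the tolerances `e1`, `e2`, and `epsilon` are left as inputs, since their exact experimental values are not reproduced here:

```matlab
% Himmelblau-type stop rule (sketch).
% fk, fk1: f(x_k) and f(x_{k+1}); gnorm: ||g(x_k)||;
% e1, e2, epsilon: tolerances (assumed inputs).
function stop = himmelblau_stop(fk, fk1, gnorm, e1, e2, epsilon)
    if abs(fk) > e1
        stop1 = abs(fk - fk1) / abs(fk);   % relative decrease
    else
        stop1 = abs(fk - fk1);             % absolute decrease
    end
    stop = (gnorm < epsilon) || (stop1 < e2);
end
```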


No.  Test problem

1  Extended Freudenstein and Roth function
2  Extended trigonometric function
3  Extended Rosenbrock function
4  Extended White and Holst function
5  Extended Beale function
6  Extended penalty function
7  Perturbed quadratic function
8  Raydan 1 function
9  Raydan 2 function
10  Diagonal 1 function
11  Diagonal 2 function
12  Diagonal 3 function
13  Hager function
14  Generalized tridiagonal 1 function
15  Extended tridiagonal 1 function
16  Extended three exponential terms function
17  Generalized tridiagonal 2 function
18  Diagonal 4 function
19  Diagonal 5 function
20  Extended Himmelblau function
21  Generalized PSC1 function
22  Extended PSC1 function
23  Extended Powell function
24  Extended block diagonal BD1 function
25  Extended Maratos function
26  Extended Cliff function
27  Quadratic diagonal perturbed function
28  Extended Wood function
29  Extended Hiebert function
30  Quadratic QF1 function
31  Extended quadratic penalty QP1 function
32  Extended quadratic penalty QP2 function
33  Quadratic QF2 function
34  Extended EP1 function
35  Extended Tridiagonal-2 function
36  BDQRTIC function (CUTE)
37  TRIDIA function (CUTE)
38  ARWHEAD function (CUTE)
39  ARWHEAD function (CUTE)
40  NONDQUAR function (CUTE)
41  DQDRTIC function (CUTE)
42  EG2 function (CUTE)
43  DIXMAANA function (CUTE)
44  DIXMAANB function (CUTE)
45  DIXMAANC function (CUTE)
46  DIXMAANE function (CUTE)
47  Partial perturbed quadratic function
48  Broyden tridiagonal function
49  Almost perturbed quadratic function
50  Tridiagonal perturbed quadratic function
51  EDENSCH function (CUTE)
52  VARDIM function (CUTE)
53  STAIRCASE S1 function
54  LIARWHD function (CUTE)
55  Diagonal 6 function
56  DIXON3DQ function (CUTE)
57  DIXMAANF function (CUTE)
58  DIXMAANG function (CUTE)
59  DIXMAANH function (CUTE)
60  DIXMAANI function (CUTE)
61  DIXMAANJ function (CUTE)
62  DIXMAANK function (CUTE)
63  DIXMAANL function (CUTE)
64  DIXMAAND function (CUTE)
65  ENGVAL1 function (CUTE)
66  FLETCHCR function (CUTE)
67  COSINE function (CUTE)
68  Extended DENSCHNB function (CUTE)
69  DENSCHNF function (CUTE)
70  SINQUAD function (CUTE)
71  BIGGSB1 function (CUTE)
72  Partial perturbed quadratic PPQ2 function
73  Scaled quadratic SQ1 function


No.  Dim  Algorithm 1 (NI, NFG, CPU time)  Algorithm 2 (NI, NFG, CPU time)  Algorithm 3 (NI, NFG, CPU time)
(Each data row below lists No., Dim, and then NI, NFG, and CPU time in seconds for each algorithm in turn.)

130007410.48437520730.07812516650.109375
1900013890.32812520700.26562516620.28125
1150007410.203125421440.812520850.5625
2300016911381.46875832600.328125893610.484375
290001083961.34375912860.875892980.890625
2150001255902.765625932941.328125953061.453125
330009490.031251094150.203125321060.0625
390005260.078125913480.2656256290.015625
3150005260.015625963710.43759500.078125
43000432000.2031251675720.515625291040.109375
4900021910.203125512020.515625291090.21875
41500021910.359375451760.625431640.625
530002140.015625371300.109375883670.6875
59000411280.296875361250.28125291090.71875
5150003260.046875381310.421875531710.734375
63000923270.1718751133580.156251243910.3125
690001043400.3906251334200.406251444530.5625
6150001163700.656251514740.6406251454660.765625
7300042347462.4062596830231.203125100030411.265625
790001023730.359375100030752.453125100030623.09375
7150002729871.5781252147180.921875100030464.296875
8300018650.04687523740.0312525790.046875
8900017620.07812523740.0937525790.109375
81500017620.1562523740.12525790.171875
9300019650.031259280.0156253250.03125
990003260.0468759280.046875231000.09375
9150003260.0468759280.03125241100.234375
1030002140.0156252140.0156252140
109000214021402140
10150002140.031252140.0156252140.03125
1130003190.03125752630.1406254210
1190003190.031251755930.8281256360.0625
11150003190.14062522880.1406257370.078125
12300017570.0312515460.062517550.046875
12900016540.12516490.0937517560.1875
121500016540.12516490.10937517560.15625
1330002140231380.0781252140.03125
1390002140.031253180.0468752140
13150002140.0156252140.0156252140.0625
14300018850.56258270.15625371691.0625
1490007390.7968754140.21875452744.890625
14150007411.2968754140.359375503439.75
15300017590.484375521771.046875331750.796875
15900025831.328125501692.65625331752.359375
151500029952.5625411383.578125331753.796875
163000316013400.03125401510.109375
1690008230.04687517520.062524730.09375
161500017590.1562515460.0937527820.171875
1730002140.015625401220.1875531640.234375
179000361480.578125351070.375471490.546875
17150005420.1562527830.546875491551.078125
183000361630.078125312038445911.875
189000402290.156253120.04687539146753.65625
1815000513250.359375312039447115.421875
1930004300.0156253110.0156252140
1990004390.07812531102140.015625
19150004390.093753110.0468752140.015625
203000381340.0937522700.031253180
209000401420.1562530940.06253180.015625
2015000401370.1875381180.1253180.046875
21300021640.046875381210.09375271030.125
21900021640.265625381210.21875271030.1875
211500021640.1875381210.29687526920.234375
2230002140.0156256350.06256280.09375
2290002140.0468756350.156256280.171875
22150002140.093756350.3281256280.515625
233000231090.0781251806100.54687520710.09375
239000231090.281251906571.5100030037.625
2315000231090.3593752067072.7343751000300311.65625
2430009440.015625501910.1093756680.015625
2490009440.125251070.1256690.125
24150009440.125311290.256690.15625
253000627018530.04687533980.046875
2590009350.07812528830.046875511650.171875
25150006260.0625341010.0937515690.078125
263000862730.28125963050.171875953040.15625
2690001033510.593751073380.453125943010.40625
2615000964021.03125983130.671875953040.640625
27300024428091.62522940.03125291190.046875
27900019522902.40625642560.203125100030043.25
271500094637845.984375502110.29687521800.078125
2830009460.03125441510.09375361260.078125
2890009460.0625501780.15625361230.09375
28150009460.078125371240.171875341250.203125
293000318042504240.046875
299000318042404240.046875
29150003180.031254320.031254240.03125
30300013513880.609375100030630.921875100030131.125
30900059141994.0625100030661.9375100030132.328125
30150001073940.421875100030832.609375100030133.359375
313000321080.078125401250.0625441390.0625
31900031980.09375431360.109375451420.125
3115000331040.1875441390.21875481510.25
32300031940.078125331020.046875501530.078125
32900027840.140625641950.328125411280.234375
321500029900.265625451400.3125461430.34375
33300041304120390
339000260.046875260260
3315000260260260.015625
3430004120.0312541103160
3490004120.0156255140.0156257200.03125
34150004120.0156256170.031259260.09375
3530003170.0156258230.0312514410.03125
3590003170.0312512350.0312521620.09375
35150005140.0312515440.07812525740.140625
363000883441.7520810.3593757380.203125
369000351492.12517690.8757380.453125
3615000943949331232.7968757380.84375
37300038014130.734375100031111.32812523850.046875
37900032111831.37534411551.109375100030133.671875
3715000100036866.40625100030784.375100030054.90625
383000943740.18759300.0156255190.03125
389000215858122740.0468754140.03125
381500029411732.12517570.0781254140.046875
393000618018530.015625381130.046875
3990006180.0312519560.03125411220.109375
39150006180.062520590.09375431280.1875
403000592340.40625100031166.0468753190.046875
4090003180.06251000310916.031253170.046875
40150003170.0781251000310225.7656253170.078125
41300016960.062528980.0312516600.046875
419000231390.12523780.062516600.0625
4115000342420.3437519660.07812516600.078125
4230006470.01562542806420.03125
4290006470.0781254280.0468756420.03125
42150007600.1254280.06256420.0625
433000211880.29687519600.21875581860.421875
4390002140.062520630.359375682161.234375
431500024810.76562520630.59375722282.3125
4430002140.01562527840.203125662160.4375
4490002140.04687528870.5752411.328125
44150002140.07812528870.78125772472.3125
45300013400.09375431320.28125752280.515625
45900013400.328125451380.8125792401.484375
451500013400.375461411.34375812462.28125
463000866630.96875601940.40625241490.28125
4690002140.0468751535102.9062517810915.453125
46150002140.0781251545194.703125200135510.203125
4730007623013.8593757622913.51562520713.71875
4790005518892.2343753411153.96875114517.125
471500082304397.687586276382.312592306416.421875
48300012520.0937530970.14062523820.109375
4890001196652.71875351150.4375371330.5
481500017714858.859375331090.68751263832.546875
49300051654222.796875100030930.890625100030131.09375
4990001535600.5100031711.9375100030132.390625
491500048718002.109375100030722.890625100030133.453125
50300019416098.406251000302514.7656251000301315.203125
5090003203100431695608.4843751000301342.484375
501500024188820.4843751000310771.3751000301370.25
513000492180.2812516490.07812528850.125
5190002140.0312515460.17187528850.28125
51150002140.07812515460.23437528850.46875
5230001494740.4218751735480.468751825750.453125
5290001705411.1406251926071.156252066491.28125
52150001785652.0468752016362.031252156781.984375
5330003260.0312544114870.4531253260.046875
5390003270.01562543014441.03125561930.140625
53150002140.03125100030783.468752140.015625
54300038114030.76562521930.031253180.046875
549000481850.203125983540.31253180.03125
5415000100039787652380.3906253180.015625
55300019650.10937514430.0468753250.0625
5590003260.0937515460.171875231000.390625
55150003260.14062515460.34375231040.75
5630003160100031250.984375100029991.0625
5690003160.046875100031512.078125100029992.1875
56150003160.015625100031253.359375100029993.71875
5730002140.015625712260.513713982.203125
5790002140.046875672111.2343751026023.046875
57150002140.078125692211.984375210154610.84375
58300013400.093751093580.79687558419853.921875
58900013400.218751193982.23437522880.453125
581500013400.3906251494904.6562522880.6875
5930003180.015625481700.3437518920383.140625
5990003180.09375461630.875239268310.328125
59150003180.09375541871.71875256291217.8125
6030004437711103570.734375804660.859375
6090002140.0468751846193.515625977433.453125
60150002140.0781251846245.85937510410006.4375
6130002140.015625732360.51562517518912.875
6190002140.046875782551.45312518811935.796875
61150002140.109375692292.031251146494.96875
6230003180.015625862800.625271030.1875
6290003180.078125712401.281251267023.609375
62150003180.078125471601.390625754153.25
63300015460.1562513945511576271.1875
63900015460.2656252528184.734375723091.515625
631500015460.4531252217426.953125873633.046875
64300018550.140625732220.437524860.1875
64900018550.34375772341.32812524860.484375
641500018550.578125792402.32812524860.859375
6530003190.10937531940.515625331020.578125
6590003190.51562530911.437531961.5
65150003190.687530912.48437531962.484375
6630006300.2031254210.0781254190.078125
6690006300.54210.2343754190.265625
661500012047911.0156254210.4531254190.375
67300010640.093757690.0312511720.046875
6790007420.046875382580.79687511690.15625
6715000221360.890625241700.7187511670.234375
68300024800.07812522670.03125444910.328125
689000214023700.0468752140.03125
68150002140.062523700.0781252140.03125
6930005330531630.10937528880.0625
6990005330.046875511570.2187520710.140625
69150005330.046875511570.35937520710.171875
703000662771.125431670.67187513700.203125
70900010450.521901.0312515770.71875
7015000682454.968759340.5937525982.03125
7130003160421330.0468753160.046875
7190003160.015625421330.06253160
71150003160.015625421330.1253160.046875
72300020676541.437516358331.45312515602.78125
729000100036941734.984375100030361571.609375156024.125
7215000100036114739.8125163568751.625156066.703125
73300056261182.67187532010550.28125100030170.984375
739000100090698.703125100030481.90625100030172.375
731500079856886.53125100030412.71875100030173.078125

From the detailed experimental data in Table 2, it is clear that most of the problems can be solved quickly, and for most problems the proposed algorithm takes less CPU time. Progress has also been made in NI and NFG. Overall, the proposed method is promising compared with the other algorithms. To present the numerical results more directly, the performance profile technique of Dolan and Moré [22] is used. In Figure 1, the curve of Algorithm 1 is always above those of the other algorithms. In Figure 2, the curves show the same trend: Algorithm 1 solves the largest fraction of the test problems within any given factor of the best performance, ahead of Algorithms 2 and 3. Figure 3 shows a trend similar to Figure 2. All these figures show that the modified CD conjugate gradient algorithm is more robust and effective than the normal CD method and the PRP method. In summary, Algorithm 1 is more competitive than the others.
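For reference, the Dolan and Moré profile value $\rho_s(\tau)$ (the fraction of problems a solver finishes within a factor $\tau$ of the best solver) can be sketched in MATLAB as follows; the matrix `T` and the `Inf` failure encoding are assumptions of the sketch:

```matlab
% Dolan-More performance profile at factor tau (sketch).
% T: np-by-ns matrix of positive measures (e.g., CPU time); Inf = failure.
function rho = perf_profile(T, tau)
    r = bsxfun(@rdivide, T, min(T, [], 2)); % ratios to the best solver per problem
    rho = mean(r <= tau, 1);                % fraction of problems solved within tau
end
```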

4.2. The Muskingum Model

It is generally known that parameter estimation is a significant task in engineering applications. The nonlinear Muskingum model will be discussed as a common example of such an application in this subsection.

The nonlinear Muskingum model [23] is a least-squares parameter estimation problem in the variables below; a sketch of the standard model relations is given after the variable list.

The variables have the following meanings:
(i) $n$: the total number of observation times.
(ii) $x_1$: the storage time constant.
(iii) $I_i$: the observed inflow discharge.
(iv) $x_2$: the weighting factor.
(v) $Q_i$: the observed outflow discharge.
(vi) $x_3$: an additional parameter.
(vii) $\Delta t$: the time step at time $t_i$.
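As background, a sketch of the standard nonlinear Muskingum relations from the hydrology literature (the paper's exact displayed objective follows the cited references [23, 24]):
$$S_t = x_1 \left[x_2 I_t + (1 - x_2) Q_t\right]^{x_3}, \qquad \frac{\mathrm{d}S_t}{\mathrm{d}t} = I_t - Q_t,$$
where $S_t$ is the channel storage; the parameters $(x_1, x_2, x_3)$ are then estimated by minimizing the sum of squared deviations between the observed and routed outflows over the $n$ time steps, discretized with step $\Delta t$.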

In the experiment, the observed data of the flood run-off process from Chenggouwan and Linqing of Nanyunhe River in the Haihe Basin, Tianjin, China, are used. A fixed initial point is chosen; detailed data on $I_i$ and $Q_i$ for the years 1960, 1961, and 1964 were obtained (see [24] for details), and the time step $\Delta t$ was selected accordingly. The results of these three algorithms are listed in Table 3. The performance of the presented algorithm is shown in Figures 4-6.


Algorithm  $x_1$  $x_2$  $x_3$

BFGS [25]  10.8156  0.9826  1.0219
HIWO [23]  13.2813  0.8001  0.9933
Algorithm 1  11.1848  1.0000  0.9995

Some conclusions are obtained from this experiment: (1) from Figures 4-6, we conclude that approximations of the flood outflows can be calculated by using Algorithm 1, and Algorithm 1 is effective for the nonlinear Muskingum model; (2) the final parameter estimates ($x_1$, $x_2$, and $x_3$) of these three algorithms are interesting, and they are competitive with the final estimates of similar algorithms; and (3) the final estimates of Algorithm 1 differ from those of the BFGS method and the HIWO method, which implies that the nonlinear Muskingum model has different optimum approximation solutions.

4.3. Image Restoration Problems

In this subsection, the above algorithms are applied to image restoration problems, where the processing objects are original images corrupted by impulse noise; such problems are regarded as among the more difficult in optimization. The parameter settings are similar to those of the previous subsections, and the program is stopped when either of the prescribed stopping conditions holds. The following three images are selected as processing objects: Baboon (512×512), Barbara (512×512), and Lena (512×512). The detailed performance is shown in Figures 7-9, and the CPU time taken to process the images is listed in Table 4.
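As a small illustration of this setup (the file name and noise level below are assumptions for the sketch; `imnoise` requires the Image Processing Toolbox):

```matlab
% Corrupt a test image with 30% salt-and-pepper (impulse) noise (sketch).
img = im2double(imread('lena512.png'));      % assumed local test image
noisy = imnoise(img, 'salt & pepper', 0.30); % 30% impulse noise
imshowpair(img, noisy, 'montage');           % visual check of the corruption
```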


Noise level 1 (30% noise)  Baboon  Barbara  Lena  Total

Algorithm 1  4.438  4.390  4.656  13.484
Algorithm 2  4.563  5.234  5.406  15.203
Algorithm 3  5.156  6.313  6.781  18.250

Noise level 2  Baboon  Barbara  Lena  Total
Algorithm 1  8.609  7.891  8.141  24.641
Algorithm 2  9.219  10.688  10.547  30.454
Algorithm 3  11.297  14.391  14.859  40.547

Noise level 3  Baboon  Barbara  Lena  Total
Algorithm 1  15.750  10.578  11.984  38.312
Algorithm 2  13.891  14.438  14.703  43.032
Algorithm 3  14.859  19.922  20.375  55.156

It is easy to see that all the algorithms succeed on the image restoration problems. The results in Table 4 reveal that the CPU time of Algorithm 1 is lower than that of the other algorithms, at the 30% noise level as well as at the two higher noise levels.

5. Conclusion

In this paper, a modified three-term type CD conjugate gradient algorithm is presented with the following good features: (i) the sufficient descent property holds; (ii) the trust region feature also holds; (iii) the algorithm is globally convergent with the MWWP line search technique and the projection technique for general functions; and (iv) numerical results reveal that the new algorithm is more competitive than the normal CD algorithm and the PRP algorithm.

In recent years, there has been considerable research on other types of CG methods, while the CD method has received comparatively little study and should not be ignored. Much work remains for the future: whether this method is suitable for other line search techniques (such as the Armijo line search or nonmonotone line searches), and whether other modifications can further improve the numerical results of the CD method. All of these questions are worth studying in future work.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (Grant no. 11661009), the High Level Innovation Teams and Excellent Scholars Program in Guangxi Institutions of Higher Education (Grant no. [2019]52), and the Guangxi Natural Science Key Fund (no. 2017GXNSFDA198046). The authors would like to thank the editor and the referees for their valuable comments, which greatly improved this paper.

References

  1. M. Al-Baali, "Descent property and global convergence of the Fletcher-Reeves method with inexact line search," IMA Journal of Numerical Analysis, vol. 5, no. 1, pp. 121–124, 1985.
  2. Y. Dai and Y. Yuan, "A nonlinear conjugate gradient method with a strong global convergence property," SIAM Journal on Optimization, vol. 10, pp. 177–182, 2000.
  3. J. W. Daniel, "The conjugate gradient method for linear and nonlinear operator equations," SIAM Journal on Numerical Analysis, vol. 4, no. 1, pp. 10–26, 1967.
  4. R. Fletcher, Practical Methods of Optimization, Wiley, New York, NY, USA, 2nd edition, 1987.
  5. R. Fletcher and C. M. Reeves, "Function minimization by conjugate gradients," The Computer Journal, vol. 7, no. 2, pp. 149–154, 1964.
  6. M. R. Hestenes and E. Stiefel, "Methods of conjugate gradients for solving linear systems," Journal of Research of the National Bureau of Standards, vol. 49, no. 6, pp. 409–436, 1952.
  7. Y. Liu and C. Storey, "Efficient generalized conjugate gradient algorithms, part 1: theory," Journal of Optimization Theory and Applications, vol. 69, no. 1, pp. 129–137, 1991.
  8. E. Polak and G. Ribière, "Note sur la convergence de méthodes de directions conjuguées," Revue Française d'Informatique et de Recherche Opérationnelle, Série Rouge, vol. 3, no. 16, pp. 35–43, 1969.
  9. B. T. Polyak, "The conjugate gradient method in extremal problems," USSR Computational Mathematics and Mathematical Physics, vol. 9, no. 4, pp. 94–112, 1969.
  10. G. Yuan, X. Wang, and Z. Sheng, "Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions," Numerical Algorithms, vol. 84, no. 3, pp. 935–956, 2020.
  11. S.-Y. Liu, Y.-Y. Huang, and H.-W. Jiao, "Sufficient descent conjugate gradient methods for solving convex constrained nonlinear monotone equations," Abstract and Applied Analysis, vol. 2014, pp. 1–12, 2014.
  12. X. Y. Wang, S. J. Li, and X. P. Kou, "A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints," Calcolo, vol. 53, no. 2, pp. 133–145, 2016.
  13. G. Yuan and W. Hu, "A conjugate gradient algorithm for large-scale unconstrained optimization problems and nonlinear equations," Journal of Inequalities and Applications, vol. 113, no. 1, 2018.
  14. G. Yuan, T. Li, and W. Hu, "A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems," Applied Numerical Mathematics, vol. 147, pp. 129–141, 2020.
  15. G. Yuan, J. Lu, and Z. Wang, "The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems," Applied Numerical Mathematics, vol. 152, pp. 1–11, 2020.
  16. G. Yuan, Z. Wei, and Y. Yang, "The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions," Journal of Computational and Applied Mathematics, vol. 362, pp. 262–275, 2019.
  17. L. Zhang, W. Zhou, and D.-H. Li, "A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence," IMA Journal of Numerical Analysis, vol. 26, no. 4, pp. 629–640, 2006.
  18. G. Yuan, W. Hu, and Z. Sheng, "A conjugate gradient algorithm with Yuan-Wei-Lu line search," in International Conference on Cloud Computing and Security, Springer, Cham, Switzerland, 2017.
  19. G. Yuan, Z. Wei, and X. Lu, "Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search," Applied Mathematical Modelling, vol. 47, pp. 811–825, 2017.
  20. G. Yuan, Z. Sheng, B. Wang, W. Hu, and C. Li, "The global convergence of a modified BFGS method for nonconvex functions," Journal of Computational and Applied Mathematics, vol. 327, pp. 274–294, 2018.
  21. Y. Yuan and W. Sun, Theory and Methods of Optimization, Science Press of China, Beijing, China, 1999.
  22. E. D. Dolan and J. J. Moré, "Benchmarking optimization software with performance profiles," Mathematical Programming, vol. 91, no. 2, pp. 201–213, 2002.
  23. A. Ouyang, L.-B. Liu, Z. Sheng, and F. Wu, "A class of parameter estimation methods for nonlinear Muskingum model using hybrid invasive weed optimization algorithm," Mathematical Problems in Engineering, vol. 2015, pp. 1–15, 2015.
  24. A. Ouyang, Z. Tang, K. Li, A. Sallam, and E. Sha, "Estimating parameters of Muskingum model using an adaptive hybrid PSO algorithm," International Journal of Pattern Recognition and Artificial Intelligence, vol. 28, pp. 1–29, 2014.
  25. Z. W. Geem, "Parameter estimation for the nonlinear Muskingum model using the BFGS technique," Journal of Irrigation and Drainage Engineering, vol. 132, no. 5, pp. 474–478, 2006.

Copyright © 2020 Zhan Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

