Mathematical Problems in Engineering

Mathematical Problems in Engineering / 2020 / Article
Special Issue

Machine Learning and its Applications in Image Restoration

View this Special Issue

Research Article | Open Access

Volume 2020 |Article ID 7859286 | https://doi.org/10.1155/2020/7859286

Yulun Wu, Mengxiang Zhang, Yan Li, "Modified Three-Term Liu–Storey Conjugate Gradient Method for Solving Unconstrained Optimization Problems and Image Restoration Problems", Mathematical Problems in Engineering, vol. 2020, Article ID 7859286, 20 pages, 2020. https://doi.org/10.1155/2020/7859286

Modified Three-Term Liu–Storey Conjugate Gradient Method for Solving Unconstrained Optimization Problems and Image Restoration Problems

Academic Editor: Wenjie Liu
Received26 Jul 2020
Accepted28 Sep 2020
Published19 Oct 2020

Abstract

A new three-term conjugate gradient method is proposed in this article. The new method was able to solve unconstrained optimization problems, image restoration problems, and compressed sensing problems. The method is the convex combination of the steepest descent method and the classical LS method. Without any linear search, the new method has sufficient descent property and trust region property. Unlike previous methods, the information for the function is assigned to . Next, we make some reasonable assumptions and establish the global convergence of this method under the condition of using the modified Armijo line search. The results of subsequent numerical experiments prove that the new algorithm is more competitive than other algorithms and has a good application prospect.

1. Introduction

Consider the following unconstrained optimization:

According to the research of other scholars, there are abounding effective forms to solve unconstrained optimization problems. For example, there are the steepest descent method, Newton method, and conjugate gradient method [18]. The nonlinear conjugate gradient method is a very effective method for solving large-scale unconstrained optimization problems. The conjugate gradient method has attracted more and more attention [914] because it is easy to calculate and has low memory requirement and application in many fields [1518]. The conjugate gradient iteration formula of (1) is defined aswhere is the step size of the th iteration obtained by a certain line search rule and is search direction of step , which are two important factors for solving unconstrained optimization problems. Among them, is defined aswhere the parameter called the CG parameter is a scalar and . The method of calculating the CG parameters will affect the performance and stability of the whole algorithm, so CG parameters play an utterly significant part. Among the nonlinear conjugate gradient method, we set and let represent the Euclidean norm. Then, the classical methods include the following:HS method [19] (the parameter )FR method [20] (the parameter )PRP method [21, 22] (the parameter )LS method [23] (the parameter )CD method [24] (the parameter )DY method [25] the parameter ()

This paper mainly studies the Liu–Storey method. At the earliest, Liu and Storey proposed a conjugate gradient method to solve the optimization problem in [9]. In this paper, they demonstrate the convergence of the method when using Wolfe line search. The experimental results show that the algorithm is feasible. Later, it was called the LS method. The LS method is the same as the PRP method with exact linear search. Therefore, the LS method and the PRP method have similar forms. As is known to all, the HS method and the PRP method are supposed be the most effective two methods in practical calculation, and a lot of results have been obtained. Consequently, we hope that the theories and analysis techniques to the PRP method can also be applied to the LS method. However, the LS method does not have the nature of automatic descent. In order to surmount the disadvantage, Li [26] proposed an improved Liu–Storey method using Crippo-Lucide search technology, where the direction of search is computed aswhere , and

Nevertheless, this method assumes a minimum step size, and the main reason is the lack of direction of the trust zone. Therefore, a number of scholars have considered the combination of conjugate gradient method and trust region property. The three-term conjugate gradient algorithm is easy to converge because it automatically has sufficient descent. Therefore, this kind of method has been concerned by many scholars. The conjugate gradient method with three terms was first proposed by Zhang et al. [27]; they defined as

Recently, a three-term conjugate gradient algorithm was proposed in Yuan’s paper [28]. The search direction of this algorithm is a based on LS method and the gradient descent method. On the basis of Yuan’s research, a modified three-term conjugate gradient method is proposed, which has more functional information than the original method, where is computed bywhere , , constant , and

The vector not only has gradient information but also has functional information, with good theoretical results and numerical performance (see [29]). One might think that the resulting method is indeed better than the original one; that is why we use instead of .

According to the meaning of and , the following inequality is obtained:

Therefore,

In [28], Yuan et al. used modified Armijo linear search, which selects the largest as the step size in set such thatwhere is the trial step length, which is often set to 1, and are given constants. As is known to all, Armijo line search technology is a basic and the cheapest method. It is used in various algorithms [3032]. Basically, a number of other methods of line search can be regarded as a modification of Armijo line search. In an unpublished article by Yuan, a new modification of Armijo linear search was designed based on [16, 33], which is defined aswhere , , and satisfying (12). The modification of Armijo linear search is verified to be effective in improving the efficiency of the algorithm.

The following is an introduction to the paper: In the second section, we propose an improvement algorithm and establish two important properties (sufficient descent property and trust region property) of the algorithm without using linear search. Then, some other properties of the algorithm are given, and the global convergence of the algorithm is proved under appropriate assumptions. The third section gives numerical experiments on normal unconstrained optimization problems, image restoration problems, and compressive sensing problems. In the last section, the conclusion of this paper is given.

2. Conjugate Gradient Algorithm and Convergence Analysis

This section will give a new modified Liu–Storey method combining (7) and (12), as shown below:

Lemma 1. is generated by formula (7); then, it clears that

Proof. When , it is obvious that and .
When , we haveUsing (10), we have , and then, (13) holds. Also,Then, we haveSetting , we get the right half of inequality (14). By Cauchy–Schwartz inequality and (13), we obviously get the left half of inequality (14). So, we get the results we want to prove.
Then, we will focus on the global convergence of the algorithm. In order to achieve this goal, we have to make the following assumptions.

Assumption 1. (a)The level set is bounded(b)Function: is continuously differentiable and the gradient is Lipschitz continuous, and there exists a constant (Lipschitz constant) such that

Theorem 1. Assumption 1 is kept true, and then, there is a positive constant , which satisfies (12) in Algorithm 1.

Input: input parameters , , and .
Output: .
(1)For an initial solution compute ; set ;
(2)whiledo
(3) Find satisfying (12).
(4) Set .
(5) Compute by (7).
(6).
(7)End

Proof. We construct such a functionFrom Assumption 1, and are bounded, note that , and from (13), we have . Obviously, holds. Now, we discuss the following two scenarios:Case 1: if , then For all positive that are small enough, we can obtain the following inequalities:Case 2: if , then Same as Case 1, we haveSo, from (20) and (21), there exists, at least, a positive constant such that , which also implies that there must be, at least, one local minimum point such thatthat is,Thus, satisfies (12). The proof is completed. So, quite evidently, Algorithm 1 is well defined.

Theorem 2. Let Assumption 1 be satisfied and the iterative sequence be generated by Algorithm 1. After that, we obtain

Proof. We will use the contradiction to draw this conclusion. When (24) is not correct, there exists a constant such thatFrom the third step of the Algorithm (12), we havenamely, and summing up the two sides of the abovementioned inequalities and from Assumption 1 (ii), it is obtained thatThus, from (28), we haveNow, our certificates are divided into the following two cases:Case 1: the step size as . From (29), it is obvious that as . This contradicts (24). Case 2: the step size as .From Algorithm 1, for the obtained step size , it is clear that does not satisfy (12), and by (13) and (14),where ; then, according to the mean value theorem, there exists an such thatCombined with (30)–(32), we haveSo, we get the contradiction. Thus, result (24) is true, and this completes the proof.

3. Numerical Experiments

In this section, we have designed three experiments to study the computational efficiency and performance of the proposed algorithm. The first subsection is normal unconstrained optimization problems, the second subsection is the image restoration problem, and the third subsection is compressive sensing problems. All the programs are compiled with Matlab R2017a and implemented on an Intel(R) Core (TM) i7-4710MQ CPU @ 2.50 GHz, RAM 8.0 GB, and the Windows 10 operating system.

3.1. Normal Unconstrained Optimization Problems

In order to test the numerical performance of the TTMLS algorithm, the NLS algorithm [28], LS method with the normal WWP line search (LS-WWP), and PRP method with the normal WWP line search (PRP-WWP) are also experimented as the comparison group. The results can be seen in Tables 14. The data used in the experiment are as follows:Dimension: we choose 3000 dimensions, 6000 dimensions, and 9000 dimensions to test.Parameters: all the algorithms run with , , , , and .Stop rule (the Himmelblau stop rule [34]): if , let ; otherwise, let . If conditions or are satisfied or the iteration number of more than 1000 is satisfied, we stop the process, where .Symbol representation: NI: the iteration number. NFG: the total number of function and gradient evaluations. CPU: the CPU time in seconds.Text problems: we have tested 74 unconstrained optimization problems, and the list of problems can be seen in Yuan’s work [16].


NrDimTTMLS algorithm
NINFGCPU

130005260.203125
160005260.109375
190005260.234375
23000854023.640625
26000904266.171875
29000914349.125
330006320.0625
360006320.0625
390006320.046875
430005260.0625
460005260.15625
490005260.171875
530005260.015625
560005260.046875
590005260.0625
630004200.0625
660004200.0625
690004200.0625
73000783420.140625
76000803530.234375
79000803540.328125
8300010400.03125
8600010400.0625
8900010400.0625
930006200
960006200.0625
990006200
1030003140.0625
1060003140.0625
1090003140.0625
113000100029992.359375
116000100029994.625
119000100029996.859375
1230003140.0625
1260003140.0625
1290003140.0625
1330004170
1360004170.0625
1390004200.109375
1430007380.25
1460007380.375
1490007380.46875
1530007380.1875
1560007380.265625
1590007380.390625
1630004200
1660004200.046875
1690004200.0625
1730005260.125
1760005260.140625
1790005260.25
183000100030730.8125
186000100030731.421875
189000100030731.984375
19300010290.0625
19600010290.0625
19900010290.125
2030007380.0625
2060007380.0625
2090007380.109375
2130006320.1875
2160006320.265625
2190006320.34375
2230009500.15625
2260009500.25
2290009500.390625
2330006320.125
2360006320.125
2390006320.171875
2430002327140.4375
2460002437470.890625
2490002497651.296875
2530006320.0625
2560006320.0625
2590006320.0625
2630004140
2660004140
2690004140.0625
273000793460.125
276000803530.234375
279000813580.28125
2830005260
2860005260
2890005260.046875
2930007380.015625
2960007380.0625
2990007380.0625
303000763330.0625
306000783420.171875
309000783440.203125
3130005260.03125
3160005260
3190005260.0625
3230005260.125
3260005260.1875
3290005260.1875
3330006320
3360006320
3390006320.0625
3430005260.0625
3460005260.0625
3490005260.046875
3530006280
3560006280
3590006280
3630005260.078125
3660005260.125
3690005260.21875
373000823660.078125
376000843780.140625
379000853820.34375
3830005260
3860005260
3890005260.0625
3930005260.0625
3960005260.0625
3990005260.0625
4030005260.125
4060005260.15625
4090005260.25
413000733120.0625
416000733120.140625
419000733120.265625
423000372040.1875
426000301700.296875
429000432380.59375
4330005260.140625
4360005260.25
4390005260.359375
4430005260.1875
4460005260.25
4490005260.265625
4530005260.140625
4560005260.21875
4590005260.265625
4630005260.125
4660005260.25
4690005260.28125
473000823683.9375
4760008538611.29688
4790008739623.98438
4830006320.1875
4860006320.171875
4890006320.3125
493000783420.0625
496000803530.1875
499000803540.296875
503000783420.9375
506000803531.8125
509000803542.234375
5130007380.125
5160007380.28125
5190007380.375
5230004200.0625
5260004200.125
5290004200.0625
533000100030320.875
536000100030321.609375
539000100030332.4375
5430005260
5460005260
5490005260.046875
55300024740.109375
55600026800.375
55900026800.5
563000100030100.90625
566000100030101.6875
569000100030102.46875
5730005260.125
5760005260.21875
5790005260.265625
5830005260.125
5860005260.21875
5890005260.25
5930005260.125
5960005260.234375
5990005260.28125
6030005260.125
6060005260.25
6090005260.28125
6130005260.1875
6160005260.25
6190005260.3125
6230005260.1875
6260005260.25
6290005260.359375
6330005260.1875
6360005260.25
6390005260.3125
6430005260.140625
6460005260.25
6490005260.28125
6530006320.203125
6560006320.3125
6590006320.28125
6630006320.078125
6660006320.1875
6690006320.25
67300012610.0625
67600014740.09375
67900013690.1875
68300011620.0625
68600011620.0625
68900011620.0625
6930006320.03125
6960006320.0625
6990006320.0625
7030006320.1875
7060006320.25
7090006320.34375
713000100030101.015625
716000100030101.8125
719000100030102.59375
723000823684.1875
7260008638911.51563
7290008739623.92188
733000763330.09375
736000783420.15625
739000783440.234375
743000763330.0625
746000783420.125
749000783440.21875


NrDimNLS algorithm
NINFGCPU

130005260.125
160005260.140625
190005260.21875
23000924284.078125
260001024727.15625
29000974577.84375
330006320.0625
360006320
390006320
430005260.0625
460005260.109375
490005260.109375
530005260.0625
560005260.0625
590005260.0625
630004200
660004200.0625
690004200.0625
73000833610.125
76000853700.25
79000863770.34375
83000100030462.078125
8600010400.0625
8900010400.0625
930006200.0625
960006200
990006200.0625
1030003140
1060003140
1090003140.0625
113000100029992.8125
116000100029995.296875
119000100029997.8125
1230003140
1260003140.03125
1290003140
1330004170
1360004170.0625
1390004200.0625
1430007380.203125
1460007380.375
1490007380.453125
1530007380.1875
1560007380.3125
1590007380.34375
1630004200
1660004200
1690004200.0625
1730005260.125
1760005260.140625
1790005260.234375
183000100030750.75
186000100030751.234375
189000100030752.046875
19300010290
19600010290.0625
19900010290.1875
2030007380.0625
2060007380.0625
2090007380.0625
2130006320.125
2160006320.265625
2190006320.375
2230009500.1875
2260009500.25
2290009500.28125
2330006320.0625
2360006320.125
2390006320.125
24300050515351.109375
24600052916072.3125
24900054216463.515625
2530006320.0625
2560006320
2590006320.046875
2630004140
2660004140.0625
2690004140
273000843650.125
276000853700.3125
279000853720.359375
2830005260.0625
2860005260
2890005260.03125
2930007380
2960007380
2990007380.0625
303000833570.0625
306000833610.1875
309000843660.21875
3130005260.0625
3160005260
3190005260
3230005260.125
3260005260.125
3290005260.125
3330006320.0625
3360006320
3390006320.0625
3430005260.0625
3460005260
3490005260
3530007330
3560007330
3590007330.0625
3630005260.125
3660005260.125
3690005260.1875
373000873840.125
376000903980.25
379000893960.390625
3830005260.0625
3860005260
3890005260.0625
3930005260.0625
3960005260
3990005260.046875
4030005260.125
4060005260.140625
4090005260.25
413000793340.125
416000793340.171875
419000793340.25
423000462350.25
426000392040.4375
429000552720.71875
4330005260.125
4360005260.25
4390005260.3125
4430005260.125
4460005260.15625
4490005260.28125
4530005260.109375
4560005260.203125
4590005260.3125
4630005260.078125
4660005260.21875
4690005260.375
473000883906.078125
4760009140619.84375
4790009241541.73438
4830006320.125
4860006320.1875
4890006320.234375
493000833610.125
496000853700.171875
499000863770.1875
503000833611.125
506000853702.0625
509000863772.453125
5130007380.1875
5160007380.25
5190007380.296875
5230004200.0625
5260004200.0625
5290004200.015625
533000100030151.03125
536000100030161.8125
539000100030162.859375
5430005260
5460005260.0625
5490005260
55300024740.1875
55600026800.421875
55900026800.515625
56300078823740.875
56600078823741.59375
56900078823742.234375
5730005260.125
5760005260.203125
5790005260.28125
5830005260.125
5860005260.21875
5890005260.34375
5930005260.125
5960005260.25
5990005260.234375
6030005260.125
6060005260.234375
6090005260.34375
6130005260.109375
6160005260.203125
6190005260.28125
6230005260.125
6260005260.25
6290005260.359375
6330005260.125
6360005260.25
6390005260.265625
6430005260.1875
6460005260.25
6490005260.25
6530006320.1875
6560006320.28125
6590006320.328125
6630006320.125
6660006320.1875
6690006320.234375
67300022990.125
67600014650.125
67900021950.234375
68300011620.0625
68600011620.0625
68900011620.0625
6930006320
6960006320.0625
6990006320.0625
7030006320.125
7060006320.203125
7090006320.375
71300078823740.8125
71600078823741.46875
71900078823742.265625
723000883905.984375
7260009140619.34375
7290009241542.32813
733000833570.125
736000833610.09375
739000843660.203125
743000833570.0625
746000833610.15625
749000843660.203125


NrDimLS-WWP
NINFGCPU

130001000599820.5625
160001000598639.26563
190001000598652.96875
23000632280.375
26000662310.765625
29000612231.046875
3300032510760.3125
36000100052061.828125
39000100056852.65625
430001000593411.5
460001000583522.32813
490001000584131.6875
530001328190.4375
560001338260.765625
590001348321.203125
63000481980.125
66000512070.125
69000432020.1875
73000100059941.71875
76000100059943
79000100059944.375
83000331360.125
86000331360.125
89000321300.140625
930004190.0625
960004190
990004190.0625
103000290
106000290
109000290.0625
113000100030012.671875
116000100030015.046875
119000100030017.828125
123000181050.125
126000191110.25
129000191110.359375
1330003130
13600013660.1875
139000290
1430006330.1875
1460005270.265625
1490005270.28125
1530001000301019
1560001000301029.40625
1590001000301035.84375
16300022720.0625
16600015840.125
16900015810.15625
173000744321.625
176000744322.96875
179000744324.046875
1830007420.0625
1860008480
1890008480.0625
1930004180
1960004180.0625
1990004180.09375
20300014740
20600014740.046875
20900014740.0625
213000451500.0625
216000211110.125
219000201110.171875
22300012580
22600012580.125
22900012580.09375
2330001000595911.54688
2360001000595922.4375
2390001000595931.98438
243000593130.1875
246000622740.3125
2490007460.0625
25300033980.0625
256000441310.0625
259000100030731.921875
263000511940.125
266000501880.25
269000501880.375
273000100030131.15625
276000100030132.1875
279000100030133.140625
283000925030.0625
286000925030.1875
289000925030.1875
2930003150
2960003150
2990003150
303000100059941.28125
306000100059942.453125
309000100059943.328125
31300010510
31600012640.0625
31900012640.0625
32300010520.0625
326000100059355.921875
32900010540.0625
333000390.0625
336000390
339000260
3430003150
3460007200
3490008230.0625
35300015440
35600019560.046875
35900021620.0625
3630001000598817.95313
3660001000599732.17188
3690001000599143.54688
373000100059912.0625
376000100059883.84375
379000100059885.421875
3830006310
3860006310
3890006310
393000421250
396000441310.125
399000461370.09375
4030001000598225.26563
4060001000589841.26563
4090001000586755.25
413000100030030.96875
416000100030031.640625
419000100030032.421875
4230006260
4260006260.0625
4290007320.0625
4330008460.25
4360008460.4375
4390008460.5625
44300012680.328125
44600011620.46875
44900011620.6875
4530009500.25
4560009500.375
4590009500.5625
463000563336716.0625
466000708423733.29688
469000812486451.15625
47300014443511.59375
47600029388272.65625
4790004461341230.6875
483000512940.8125
486000502881.5
489000502882.078125
493000100059941.296875
496000100059942.328125
499000100059943.359375
5030001000599417.14063
5060001000599431.34375
5090001000599441.54688
51300014500.1875
51600013470.375
51900013470.40625
523000392080.25
526000422240.5
529000432270.71875
533000100030030.9375
536000100030031.609375
539000100030032.453125
543000100059551.4375
546000100059582.59375
549000100059553.96875
5530006320.0625
5560006320.1875
5590006320.1875
563000100029991.0625
566000100029992.0625
569000100029992.75
5730009500.25
5760009500.375
5790009500.578125
583000533141.546875
586000452662.1875
589000412422.734375
5930008440.25
5960008440.328125
5990008440.5
603000563336716.15625
606000710425233.78125
6090001000301343.10938
6130009500.25
6160009500.390625
6190009500.53125
6230006320.140625
6260006320.3125
6290006320.296875
6330006320.125
6360006320.265625
6390005260.265625
6430008440.1875
6460008440.328125
6490008440.5625
65300010460.1875
65600010460.34375
6590009430.453125
6630001000598916.51563
6660001000598929.70313
6690001000598940.28125
673000271080.0625
67600018780.109375
679000211120.9375
683000451360
686000471420.0625
689000481450.0625
693000361140.046875
696000371170.046875
699000371170.0625
7030001000370114.15625
7060001000312523.60938
7090001000318831.20313
713000100059961.375
716000100059962.546875
719000100059963.71875
7230001000301082
72600010003010246.3281
72900010003010519.7656
733000100059941.3125
736000100059942.46875
739000100059943.375
743000100059971.25
746000100059972.390625
749000100059943.21875


NrDimPRP-WWP
NINFGCPU

1300011670.3125
1600022890.625
1900021860.78125
23000702420.4375
26000802630.90625
29000762551.21875
33000331090
360007360
390006360
430007420.125
4600010560.25
4900012710.40625
5300019880.0625
5600019880.109375
5900019880.15625
63000702550.109375
66000742740.171875
69000853040.296875
73000100030441.0625
76000100030442.09375
79000100030442.8125
8300023970.0625
8600023970.125
8900023970.15625
930005250
960005250
990005250
103000290
106000290
109000290
11300066920241.765625
11600021860.125
119000100030187.546875
12300019700.125
12600019700.171875
12900019700.203125
1330003130
13600010540.1875
139000290.0625
1430006330.171875
1460005270.25
1490005270.359375
153000311380.75
156000552332.046875
159000391771.90625
16300010420
16600021660.125
16900022690.1875
173000401360.5625
176000411391.171875
179000441641.71875
1830003160
1860003160
1890003160
1930004210
1960004210.0625
1990004210.109375
203000331070
206000341100.046875
209000351130.125
21300026950.046875
21600019830.125
21900019830.15625
22300013670.0625
22600013670.0625
22900013670.140625
233000100030178.953125
2360001000302317.34375
23900029012088.171875
243000652220.125
246000832760.34375
249000923030.578125
25300033980
256000431280.0625
259000541820.125
263000501840.125
266000521900.25
269000531930.328125
273000301570.046875
276000352040.109375
279000462680.1875
283000241080.0625
286000241080
289000241080.0625
2930004190
2960004190
2990004220
303000100031130.75
306000100031161.125
309000100031411.984375
31300020810
31600020810.0625
31900020810.125
32300026880.0625
326000441420.1875
329000611930.359375
333000390
336000390
339000260.046875
3430003150
3460007200
3490007200.03125
35300014410
35600018530
35900021620.03125
36300019880.3125
366000271260.75
36900017990.75
373000100030151.3125
376000100032062.53125
379000100030983.609375
3830007400
3860005280
3890005280.0625
393000381130.0625
396000401190.0625
399000411220.109375
4030001000313422.75
4060001000328236.375
4090001000314442.73438
41300019730
41600019730
41900019730.0625
42300011430.0625
42600011460.0625
42900011460.0625
4330007400.203125
4360008460.4375
4390008460.46875
44300012680.3125
44600011620.453125
44900011620.71875
4530009500.25
4560009500.46875
4590009500.546875
4630002608356.0625
466000529161318.75
469000611185926.6875
473000461613.59375
4760005718113.54688
4790008434740.0625
483000531650.5625
486000461440.875
489000401260.953125
493000100031130.75
496000100031161.328125
499000100031401.96875
5030001000324511.32813
5060001000308619.51563
5090001000305424.78125
5130007370.125
5160007400.25
5190007400.4375
523000933550.53125
526000993731.171875
5290001074011.875
533000100030220.875
536000100030221.671875
539000100030222.34375
543000472160.0625
546000321650.0625
549000562720.203125
5530005250.0625
5560005250.125
5590005250.1875
563000100030120.984375
566000100030121.671875
569000100030122.5625
5730009500.1875
5760009500.390625
5790009500.609375
583000533141.5625
586000452662.234375
589000412422.65625
5930008440.1875
5960008440.40625
5990008440.46875
6030001034382.71875
606000330104812.03125
60900026591612.73438
6130009500.25
6160009500.34375
6190009500.59375
6230006320.1875
6260006320.203125
6290006320.375
6330006320.140625
6360006320.3125
6390005260.265625
6430008440.203125
6460008440.4375
6490008440.453125
65300012460.25
65600010400.265625
65900010400.4375
6630007270.0625
6660007270.125
6690007270.171875
673000241250.5625
676000251311.125
67900017730.09375
683000431300.0625
686000451360.046875
689000461390.09375
693000381250
696000391280.046875
699000401310.125
7030006310.0625
706000241100.75
7090006310.234375
713000100030090.875
716000100030091.703125
719000100030092.359375
7230001000310580.3125
72600010004018228.0625
72900010003182513.3125
733000100030190.875
736000100030191.5
739000100030192.09375
743000100030390.9375
746000100030571.359375
749000100031732.03125


30% noiseLenaBarbaraManBaboonTotal

TTMLS algorithm18.81324.094100.81322.922166.642
NLS algorithm19.54726.703108.28125.391179.922

45% noise
TTMLS algorithm38.34439.125205.12535.406318.000
NLS algorithm42.42141.688218.34439.422341.875

60% noise
TTMLS algorithm91.06384.203417.04779.656671.969
NLS algorithm97.93888.984438.15687.547712.625

Dolan and Moré [35] provided a way to analyze the efficiency of these algorithms. From Figures 13, we can see that the 3performance of the TTMLS algorithm, TTMLS algorithm, and NLS algorithm is significantly better than that of the LS-WWP method and PRP-WWP method. Figures 1 and 2 show that the TTMLS algorithm and NLS algorithm can better approximate to the target function than the LS-WWP algorithm and PRP-WWP algorithm; thus, the number of iterations and the total number of function and gradient evaluation are smaller. The reason is that the search direction of the TTMLS algorithm contains more function information. Also, the CPU time in Figure 4, TTMLS algorithm is basically the same as the NLS algorithm, which is better than the other two. To sum up, the proposed algorithm has significant advantages.

3.2. Image Restoration Problems

The image restoration problem is a difficult problem in the field of optimization. We will use the TTMLS algorithm and NLS algorithm to minimize to recover the original image from an image corrupted by impulse noise. Afterwards, we compare the performance of the two algorithms. The data used in the experiment are as follows:Parameters: all the algorithms run with , , , and .Stop rule: if or is satisfied, we stop the process.Symbol representation: CPU: the CPU time in seconds. Total: the total CPU time of the four pictures.The information of noise: , , and salt-and-pepper noise.Text problems: we restore the original image from the image destroyed by impulsive noise. The experiments chose Lena , Barbara , Man , and Baboon as the test images.

From Figures 46, we can Table 5 see that both algorithms can recover 30%, 45%, and 60% salt-and-pepper noise images very well. The data show that, for image restoration problems, the TTMLS algorithm has shorter CPU time than the NLS algorithm when the salt-and-pepper noise is 30%, 45%, and 60%. In conclusion, the TTMLS algorithm is promising and competitive.

3.3. Compressive Sensing Problems

The main work of this section is to accurately recover the image from a few of random projections by compressive sensing. The experimental method derives from the model proposed by Dai and Sha [36]. Then, the performance of the TTMLS algorithm and LS method with line search (12) is compared.

It is noted that the gradients and are square matrices in this experiment, and the matrix obtained by may appear as a singular matrix, which results in the invalidation of the algorithm. But, when we calculate , we only need the value of without knowing the information of this square matrix, so in this experiment, we set in . The data used in the experiment are as follows:Parameters: all the algorithms run with , , , and .Stop rule: if or the number of iterations exceeds 500 is satisfied, we stop the process.Symbol representation: PSNR: Peak Signal-to-Noise Ratio. It is an objective criterion for image evaluation.Text problems: compressive sensing problems. The experiments chose Camera man , Fruits , Lena , and Baboon as the test images.

From Figure 7 and Table 6, we can see that both algorithms are effective in compression sensing problems. Meanwhile, from the experimental data, we can see that the TTMLS algorithm has more advantages than the LS algorithm.


AlgorithmCamera manFruitsLenaPeppersTotal

TTMLS31.92933.05331.99633.075130.053
LS30.79231.48631.48931.925125.692

4. Conclusions

In this paper, based on the well-known LS method and combined with improved Armijo linear search, this paper presents a three-term conjugate gradient algorithm. Without any linear search, the search direction of the new three-term conjugate algorithm is proved to have two good properties: sufficient descent and trust region properties. Also, the global convergence of the algorithm is established. The numerical results indicate that the new algorithm is effective. The good performance of the algorithm in image restoration problems and compressive sensing problems also proves that the algorithm is competitive.

Data Availability

Data used in this study can be obtained from the corresponding author on reasonable request.

Conflicts of Interest

There are no potential conflicts of interest.

Acknowledgments

The authors want to thank the support of the funds. This work was supported by the High Level Innovation Teams and Excellent Scholars Program in Guangxi Institutions of Higher Education (Grant no. [2019] 32), the National Natural Science Foundation of China (Grant no. 11661009), the Guangxi Natural Science Foundation (No. 2020GXNSFAA159069), and the Guangxi Natural Science Key Foundation (No. 2017GXNSFDA198046).

References

  1. Y.-H. Dai, “New properties of a nonlinear conjugate gradient method,” Numerische Mathematik, vol. 89, no. 1, pp. 83–98, 2001. View at: Publisher Site | Google Scholar
  2. L. Grippo and S. Lucidi, “A globally convergent version of the Polak-Ribière conjugate gradient method,” Mathematical Programming, vol. 78, no. 3, pp. 375–391, 1997. View at: Publisher Site | Google Scholar
  3. J. Nocedal and S. Wright, “Numerical optimization,” in Springer Series in Operations Research, Springer, Berlin, Germany, 2nd edition, 2006. View at: Google Scholar
  4. Z. Shi, “Restricted PR conjugate gradient method and its global convergence,” Advances in Mathematics, vol. 31, pp. 47–55, 2002. View at: Google Scholar
  5. Z. Wei, S. Yao, and L. Liu, “The convergence properties of some new conjugate gradient methods,” Applied Mathematics and Computation, vol. 183, no. 2, pp. 1341–1350, 2006. View at: Publisher Site | Google Scholar
  6. G. Yuan, X. Wang, and Z. Sheng, “Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions,” Numerical Algorithms, vol. 84, no. 3, pp. 935–956, 2020. View at: Publisher Site | Google Scholar
  7. W. Zhou, “A short note on the global convergence of the unmodified PRP method,” Optimization Letters, vol. 7, no. 6, pp. 1367–1372, 2013. View at: Publisher Site | Google Scholar
  8. D. Li and M. Fukushima, “On the global convergence of the BFGS method for nonconvex unconstrained optimization problems,” SIAM Journal on Optimization, vol. 11, pp. 1054–1064, 1999. View at: Publisher Site | Google Scholar
  9. N. Andrei, “An unconstrained optimization test functions collection,” Advanced Modeling Optimization, vol. 10, pp. 147–161, 2008. View at: Google Scholar
  10. J. C. Gilbert and J. Nocedal, “Global convergence properties of conjugate gradient methods for optimization,” Siam Journal on Optimization, vol. 2, no. 1, pp. 21–42, 1992. View at: Publisher Site | Google Scholar
  11. X. Li and Q. Ruan, “A modified PRP conjugate gradient algorithm with trust region for optimization problems,” Numerical Functional Analysis and Optimization, vol. 32, no. 5, pp. 496–506, 2011. View at: Publisher Site | Google Scholar
  12. Z. Wei, G. Li, and Q. Li, “Global convergence of the Polak-Ribière-Polyak conjugate gradient method with an Armijo-type inexact line search for nonconvex unconstrained optimization problems,” Mathematics of Computation, 2008. View at: Google Scholar
  13. G. Yu, L. Guan, and Z. Wei, “Globally convergent Polak-Ribière-Polyak conjugate gradient methods under a modified Wolfe line search,” Applied Mathematics and Computation, vol. 215, no. 8, pp. 3082–3090, 2009. View at: Publisher Site | Google Scholar
  14. L. Zhang, W. Zhou, and D. Li, “Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search,” Numerische Mathematik, vol. 104, no. 4, pp. 561–572, 2006. View at: Publisher Site | Google Scholar
  15. F. Wen and X. Yang, “Skewness of return distribution and coefficient of risk premium,” Journal of Systems Science and Complexity, vol. 22, no. 3, pp. 360–371, 2009. View at: Publisher Site | Google Scholar
  16. G. Yuan, J. Lu, and Z. Wang, “The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems,” Applied Numerical Mathematics, vol. 152, pp. 1–11, 2020. View at: Publisher Site | Google Scholar
  17. G. Yuan, Z. Wei, and Y. Yang, “The global convergence of the Polak-Ribiere-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions,” Journal of Computational and Applied Mathematics, vol. 362, pp. 262–275, 2019. View at: Publisher Site | Google Scholar
  18. W. Zhou and F. Wang, “A PRP-based residual method for large-scale monotone nonlinear equations,” Applied Mathematics and Computation, vol. 261, pp. 1–7, 2015. View at: Publisher Site | Google Scholar
  19. M. R. Hestenes and E. Stiefel, “Methods of conjugate gradients for solving linear systems,” Journal of Research of the National Bureau of Standards, vol. 49, no. 6, pp. 409–436, 1952. View at: Publisher Site | Google Scholar
  20. R. Fletcher and C. Reeves, “Function minimization by conjugate gradients,” The Computer Journal, vol. 7, no. 2, pp. 149–154, 1964. View at: Publisher Site | Google Scholar
  21. E. Polak and G. Ribière, “Note sur la convergence de méthodes de directions conjuguées,” Revue Française d’informatique et de Recherche Opérationnelle. Série Rouge, vol. 3, no. 16, pp. 35–43, 1969. View at: Publisher Site | Google Scholar
  22. B. T. Polyak, “The conjugate gradient method in extremal problems,” USSR Computational Mathematics and Mathematical Physics, vol. 9, no. 4, pp. 94–112, 1969. View at: Publisher Site | Google Scholar
  23. Y. Liu and C. Storey, “Efficient generalized conjugate gradient algorithms, part 1: theory,” Journal of Optimization Theory and Applications, vol. 69, no. 1, pp. 129–137, 1991. View at: Publisher Site | Google Scholar
  24. R. Fletcher, Practical Method of Optimization Vol. I: Unconstrained Optimization, John Wiley & Sons, New York, NY, USA, 1987.
  25. Y. H. Dai and Y. Yuan, “A nonlinear conjugate gradient method with a strong global convergence property,” SIAM Journal on Optimization, vol. 10, no. 1, pp. 177–182, 1999. View at: Publisher Site | Google Scholar
  26. Z. Li, “A new Liu-Storey type nonlinear conjugate gradient method for unconstrained optimization problems,” Journal of Computational and Applied Mathematics, vol. 225, no. 1, pp. 146–157, 2009. View at: Publisher Site | Google Scholar
  27. L. Zhang, W. Zhou, and D.-H. Li, “A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence,” IMA Journal of Numerical Analysis, vol. 26, no. 4, pp. 629–640, 2006. View at: Publisher Site | Google Scholar
  28. G. Yuan, T. Li, and W. Hu, “A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems,” Applied Numerical Mathematics, vol. 147, pp. 129–141, 2020. View at: Publisher Site | Google Scholar
  29. G. Yuan and Z. Wei, “Convergence analysis of a modified BFGS method on convex minimizations,” Computational Optimization and Applications, vol. 47, no. 2, pp. 237–255, 2010. View at: Publisher Site | Google Scholar
  30. Z. Dai and F. Wen, “Global convergence of a modified hestenes-stiefel nonlinear conjugate gradient method with armijo line search,” Numerical Algorithms, vol. 59, no. 1, pp. 79–93, 2012. View at: Publisher Site | Google Scholar
  31. M. Li and A. Qu, “Some sufficient descent conjugate gradient methods and their global convergence,” Computational and Applied Mathematics, vol. 33, pp. 333–347, 2004. View at: Publisher Site | Google Scholar
  32. L. Zhang, W. Zhou, and D. Li, “Global convergence of the dy conjugate gradient method with armijo line search for unconstrained optimization problems,” Optimization Methods and Software, vol. 22, no. 3, pp. 511–517, 2007. View at: Publisher Site | Google Scholar
  33. G. Yuan, Z. Wei, and X. Lu, “Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search,” Applied Mathematical Modelling, vol. 47, pp. 811–825, 2017. View at: Publisher Site | Google Scholar
  34. Y. Yuan and W. Sun, Theory and Methods of Optimization, Science Press of China, Beijing, China, 1999.
  35. E. Dolan and J. Moré, “Benchmarking optimization software with performance profiles,” Mathematical Programming, vol. 91, no. 2, pp. 201–213, 2001. View at: Publisher Site | Google Scholar
  36. Q. Dai and W. Sha, “The physics of compressive sensing and the gradient-based recovery algorithms,” Mathematics, 2009. View at: Google Scholar

Copyright © 2020 Yulun Wu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


More related articles

 PDF Download Citation Citation
 Download other formatsMore
 Order printed copiesOrder
Views210
Downloads254
Citations

Related articles

Article of the Year Award: Outstanding research contributions of 2020, as selected by our Chief Editors. Read the winning articles.