Research Article  Open Access
Yingjie Zhou, Yulun Wu, Xiangrong Li, "A New Hybrid PRPFR Conjugate Gradient Method for Solving Nonlinear Monotone Equations and Image Restoration Problems", Mathematical Problems in Engineering, vol. 2020, Article ID 6391321, 13 pages, 2020. https://doi.org/10.1155/2020/6391321
A New Hybrid PRPFR Conjugate Gradient Method for Solving Nonlinear Monotone Equations and Image Restoration Problems
Abstract
A new hybrid PRPFR conjugate gradient method is presented in this paper, designed so that its search direction possesses both the sufficient descent property and the trust region property. The method can be viewed as a convex combination of the PRP method and the FR method, combined with the hyperplane projection technique. With an accelerated step length, global convergence is established under appropriate assumptions. Numerical experiments show that, compared with other methods, the PRPFR method is competitive for solving nonlinear equations and image restoration problems.
1. Introduction
Consider the following nonlinear equation: F(x) = 0, x ∈ S, where S ⊆ R^n is a nonempty, closed, and convex subset. The function F: R^n → R^n is monotone and continuously differentiable; that is, it satisfies (F(x) - F(y))^T (x - y) ≥ 0 for all x, y ∈ R^n.
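As a concrete illustration of the monotonicity condition, the short Python sketch below (not from the paper; the sample mapping F(x) = x + sin(x) is a hypothetical choice, monotone because its Jacobian I + diag(cos x) is positive semidefinite) checks the inequality for one pair of points.

```python
import numpy as np

def is_monotone_pair(F, x, y):
    """Check the monotonicity condition (F(x) - F(y))^T (x - y) >= 0
    for one pair of points x, y."""
    return float(np.dot(F(x) - F(y), x - y)) >= 0.0

# Sample monotone mapping: F(x) = x + sin(x), applied componentwise.
F = lambda v: v + np.sin(v)
```

Checking a single pair only gives numerical evidence, not a proof; monotonicity must hold for all pairs of points.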
This problem of monotone equations has many applications, such as compressive sensing [1] and chemical equilibrium problems [2]. The conjugate gradient algorithm generates the iterates via x_{k+1} = x_k + α_k d_k, where α_k is the step size generated by a suitable line search and d_k is a search direction. There are many methods to construct d_k, such as the Newton method [3], the quasi-Newton method [4], and the conjugate gradient method [5–13]. As is well known, the Newton method, the quasi-Newton method, and their related methods are very popular due to their local superlinear convergence. However, computing the Jacobian matrix, or an approximation to it, at each iteration is expensive when the dimension is very large.
Due to its simplicity, low storage requirements, efficiency, and nice convergence properties, the conjugate gradient method has become increasingly popular for solving nonlinear equations [14–18]. The search direction of the conjugate gradient method is usually defined as d_k = -g_k for k = 0 and d_k = -g_k + β_k d_{k-1} for k ≥ 1, where g_k = F(x_k) and β_k is a scalar parameter; a different β_k yields a different CG method. Famous CG methods include the FR method [7], the PRP method [10], the DY method [5], the LS method [9], the HS method [8], and the CD method [6], whose parameters are

β_k^FR = ||g_k||^2 / ||g_{k-1}||^2,  β_k^PRP = g_k^T y_{k-1} / ||g_{k-1}||^2,  β_k^DY = ||g_k||^2 / (d_{k-1}^T y_{k-1}),  β_k^LS = -g_k^T y_{k-1} / (d_{k-1}^T g_{k-1}),  β_k^HS = g_k^T y_{k-1} / (d_{k-1}^T y_{k-1}),  β_k^CD = -||g_k||^2 / (d_{k-1}^T g_{k-1}),

where y_{k-1} = g_k - g_{k-1} and ||·|| denotes the Euclidean norm.
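The classical FR and PRP choices can be sketched in a few lines; the following Python snippet (an illustrative sketch, not the paper's MATLAB code, with hypothetical function names) computes the two parameters and the resulting CG direction.

```python
import numpy as np

def beta_fr(g_new, g_old):
    """Fletcher-Reeves parameter: ||g_k||^2 / ||g_{k-1}||^2."""
    return np.dot(g_new, g_new) / np.dot(g_old, g_old)

def beta_prp(g_new, g_old):
    """Polak-Ribiere-Polyak parameter: g_k^T (g_k - g_{k-1}) / ||g_{k-1}||^2."""
    return np.dot(g_new, g_new - g_old) / np.dot(g_old, g_old)

def cg_direction(g_new, d_old, beta):
    """Conjugate gradient search direction d_k = -g_k + beta * d_{k-1}."""
    return -g_new + beta * d_old
```

Both parameters share the denominator ||g_{k-1}||^2, which is why hybrid schemes can switch or blend them cheaply at each iteration.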
In 1990, the first hybrid conjugate gradient method was proposed by Touati-Ahmed and Storey [19]; this method is globally convergent under the strong Wolfe line search. Dai and Yuan [20] proposed a hybrid CG method for unconstrained optimization and proved the global convergence of these hybrid computational schemes. Dong and Jiao [21] proposed a convex combination of the PRP and DY methods for solving nonlinear equations. Furthermore, the projection technique proposed by Solodov and Svaiter [22] motivated many scholars to further develop methods for solving (1) [23, 24].
Inspired by the convex combination proposed in [25], we propose a convex combination of a modified PRP method and a modified FR method that possesses the trust region property, which was not present in [25]. We also aim for the algorithm to inherit the convergence of the PRP method and the excellent performance of the FR method.
The main contributions of the hybrid CG method are as follows:
(i) The given direction is designed as a convex combination of the PRP and FR methods.
(ii) The given direction has the sufficient descent property.
(iii) The given direction has the trust region property.
(iv) The global convergence of the presented algorithm is proved.
(v) Numerical experiments show that the algorithm is competitive for nonlinear equations and image restoration problems.
In Section 2, we introduce the motivation and the PRPFR algorithm. In Section 3, we prove the sufficient descent property and the trust region property and give the convergence analysis. Section 4 reports the numerical experiments. The conclusion is given in the last section.
2. Algorithm
Motivated by the convex combination proposed in [25], we propose a method that is a convex combination of a modified PRP method and a modified FR method for solving problem (1) and image restoration problems. The classical PRP and FR conjugate gradient parameters were recalled above. For global convergence and good numerical performance, we define the modified parameters accordingly, where the quantity appearing in their definitions is a constant.
Next, we exploit the global convergence of one modified parameter and the excellent numerical behavior of the other by defining a new hybrid parameter, β_k^PRPFR, as a convex combination of the two.
One of the involved quantities is defined following Li and Fukushima [4], and another is proposed in [26]. We now propose a hybrid conjugate gradient search direction using the parameter defined in (13).
Remark 1. The bounds on the two modified parameters follow directly from their definitions.
Remark 2. By the definitions involved, the combination weight lies in [0, 1], so the hybrid parameter is indeed a convex combination of the two modified parameters.

Now, we introduce the hyperplane projection method of Solodov and Svaiter [22]. We first give the definition of the projection operator P_S onto the closed convex set S:

P_S[x] = argmin{ ||y - x|| : y ∈ S }.

This operator has a fundamental property: it is nonexpansive, that is, ||P_S[x] - P_S[y]|| ≤ ||x - y|| for all x, y ∈ R^n.

The hyperplane projection method is as follows. Let x_k be the current iterate, and let z_k = x_k + α_k d_k be obtained by a line search such that F(z_k)^T (x_k - z_k) > 0. By the monotonicity of F, we have F(z_k)^T (x* - z_k) ≤ 0 if x* is a solution. Then, the hyperplane

H_k = { x ∈ R^n : F(z_k)^T (x - z_k) = 0 }

strictly separates the current iterate x_k from the solutions of problem (1). The next iterate can be computed by projecting x_k onto H_k, that is,

x_{k+1} = P_S[ x_k - λ_k F(z_k) ],  with  λ_k = F(z_k)^T (x_k - z_k) / ||F(z_k)||^2.

As for the step size, we use an appropriate line search to improve performance. Andrei [27] presented an acceleration scheme that generates the step size in a multiplicative manner so as to improve the reduction of the function values along the iterations; the resulting accelerated step size is given in (23) and (24).
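The projection step is easy to implement in the unconstrained case S = R^n, where P_S is the identity; the following Python sketch (illustrative only, not the authors' MATLAB code) computes the next iterate from x_k, z_k, and F(z_k).

```python
import numpy as np

def hyperplane_projection_step(x, z, Fz):
    """Solodov-Svaiter step for S = R^n: project x onto the hyperplane
    H = {u : Fz^T (u - z) = 0}, which separates x from the solution set."""
    lam = np.dot(Fz, x - z) / np.dot(Fz, Fz)
    return x - lam * Fz
```

By construction the returned point lies on the separating hyperplane, i.e., Fz^T (x_new - z) = 0.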
Motivated by the above discussion, we propose the hybrid conjugate gradient scheme stated in Algorithm 1.
Algorithm 1. PRPFR conjugate gradient algorithm.
Step 0: given an initial point x_0 and the required positive constants, let k = 0.
Step 1: if ||F(x_k)|| = 0, stop. Otherwise, compute d_k by (11)–(14).
Step 2: choose the step size α_k such that the line search condition (25) holds.
Step 3: compute the acceleration quantity through (24); if the stated condition holds, update the step size by (23).
Step 4: set z_k = x_k + α_k d_k; if ||F(z_k)|| = 0, stop. Otherwise, compute the next iterate x_{k+1} using (22).
Step 5: let k = k + 1 and go to Step 1.
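To make the overall structure concrete, here is a minimal unconstrained Python sketch of a projection-type hybrid CG solver. It is an illustration under stated assumptions, not the paper's method: it omits the acceleration step of (23) and (24), uses a standard derivative-free backtracking line search, and substitutes the classical hybrid parameter max(0, min(β^PRP, β^FR)) for the paper's β^PRPFR of (13), whose exact formula is not reproduced here.

```python
import numpy as np

def hybrid_prp_fr_beta(F_new, F_old):
    """Classical hybrid parameter (Touati-Ahmed & Storey style),
    used here as a stand-in for the paper's beta_PRPFR."""
    b_prp = np.dot(F_new, F_new - F_old) / np.dot(F_old, F_old)
    b_fr = np.dot(F_new, F_new) / np.dot(F_old, F_old)
    return max(0.0, min(b_prp, b_fr))

def solve_monotone(F, x0, tol=1e-8, max_iter=1000, rho=0.5, sigma=1e-4):
    """Projection-type hybrid CG sketch for monotone F(x) = 0, S = R^n."""
    x = np.asarray(x0, float)
    Fx = F(x)
    d = -Fx
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        # Backtracking derivative-free line search: find alpha with
        # -F(x + alpha d)^T d >= sigma * alpha * ||d||^2.
        alpha = 1.0
        while True:
            z = x + alpha * d
            Fz = F(z)
            if -np.dot(Fz, d) >= sigma * alpha * np.dot(d, d) or alpha < 1e-12:
                break
            alpha *= rho
        if np.linalg.norm(Fz) <= tol:
            x, Fx = z, Fz
            break
        # Hyperplane projection step (Solodov & Svaiter).
        lam = np.dot(Fz, x - z) / np.dot(Fz, Fz)
        x_new = x - lam * Fz
        F_new = F(x_new)
        d = -F_new + hybrid_prp_fr_beta(F_new, Fx) * d
        x, Fx = x_new, F_new
    return x
```

On the monotone test mapping F(x) = x, the iterates contract geometrically toward the solution x* = 0.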
3. Convergence Analysis
We will establish the convergence of Algorithm 1 in this section. First, we give the following assumptions.
Assumption 1. The function F is Lipschitz continuous; that is, there exists a positive constant L such that ||F(x) - F(y)|| ≤ L ||x - y|| for all x, y ∈ R^n.
Assumption 2. The solution set of problem (1) is nonempty.
From Assumption 1, there exists a positive constant κ such that ||F(x_k)|| ≤ κ for all k.
Lemma 1. Let d_k be defined by (11)–(14); then d_k satisfies the sufficient descent condition, that is, F(x_k)^T d_k ≤ -c ||F(x_k)||^2 for some constant c > 0.
Proof. If k = 0, we have d_0 = -F(x_0), and the condition holds with c = 1. If k ≥ 1, the condition follows from (14).
Lemma 2. For d_k defined by (11)–(14), d_k satisfies the trust region property independently of the line search, that is, c_1 ||F(x_k)|| ≤ ||d_k|| ≤ c_2 ||F(x_k)|| for some positive constants c_1 and c_2.
Proof. According to equality (28) and the Cauchy–Schwarz inequality, we obtain the lower bound on ||d_k||. Furthermore, combining the bounds on the involved parameters yields the upper bound. The proof is complete.
Lemma 3. If x_k and z_k are generated by the PRPFR algorithm, then the step size α_k admits a positive lower bound.
Proof. By the line search (25), if α_k is not the initial trial step, then the previous trial step does not satisfy (25). Using the Lipschitz continuity of F and (28), we obtain the stated lower bound. The proof is complete.
Lemma 4 (see [22]). Suppose that x* satisfies F(x*) = 0, and let {x_k} be generated by the PRPFR algorithm. Then, ||x_{k+1} - x*||^2 ≤ ||x_k - x*||^2 - ||x_{k+1} - x_k||^2.
Lemma 5. Let {x_k} and {z_k} be generated by the PRPFR algorithm. Then, lim_{k→∞} ||x_{k+1} - x_k|| = 0 and lim_{k→∞} α_k ||d_k|| = 0.
Proof. Lemma 4 indicates that {x_k} is bounded and that the series Σ_k ||x_{k+1} - x_k||^2 converges, so ||x_{k+1} - x_k|| → 0. From (22) and (25), the second limit follows. The proof is complete.
Now, we establish the global convergence of the PRPFR algorithm.
Theorem 1. Let {x_k} be generated by the PRPFR algorithm. Then, lim inf_{k→∞} ||F(x_k)|| = 0.
Proof. We assume that (41) does not hold; then there exists ε > 0 such that ||F(x_k)|| ≥ ε for all k. Together with (30), this yields a lower bound on ||d_k||. According to (21)–(26) and (42), we obtain a chain of inequalities; the fourth inequality holds by the stated parameter bounds, and the last one implies that the sequence is bounded. By (30), we also obtain an upper bound on ||d_k||. Multiplying both sides of (33) by ||d_k||, we obtain an inequality that contradicts (38); therefore, (41) holds.
4. Numerical Experiments
In this section, we report numerical experiments that investigate the computational efficiency of the proposed hybrid conjugate gradient algorithm. The experiments are divided into two parts: normal nonlinear equations and image restoration problems. All tests are written in MATLAB 2018a and run on a PC with an AMD Ryzen 7 4800U CPU with Radeon Graphics at 1.80 GHz, 16 GB of RAM, and the Windows 10 operating system.
4.1. Normal Nonlinear Equations
In this section, we test some normal nonlinear equations. The concrete test problems come from [28] and are listed in Table 1. To compare the numerical performance of the PRPFR algorithm, we also run the experiments with the modified PRP algorithm [29] and the FR algorithm [7]. The columns of Tables 2–4 report the serial number of the problem, the dimension of x, the number of function evaluations, the number of iterations, the computation time in seconds, and the final function norm when the program is stopped.
According to Tables 2–4, the three methods can solve most of the test problems. However, the FR method cannot handle function 3 with 9000, 30000, and 90000 dimensions, while the PRPFR method solves these cases. To see the performance more directly, we use the technique introduced by Dolan and Moré [30] for comparing algorithms. In Figure 1, the PRPFR algorithm is clearly better than the FR algorithm and the modified PRP algorithm. In Figures 2 and 3, the PRPFR algorithm solves the largest fraction of the test problems within a given performance ratio, ahead of both the FR algorithm and the modified PRP algorithm. From Tables 2–4 and Figures 1–3, it is evident that the PRPFR algorithm performs better than the FR algorithm and the modified PRP algorithm on most problems; therefore, we conclude that the PRPFR algorithm is competitive with the FR algorithm and the modified PRP algorithm.
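The Dolan–Moré technique can be sketched compactly; the Python snippet below (illustrative, with hypothetical input data) computes, for each solver, the fraction of problems it solves within a factor tau of the best solver's cost.

```python
import numpy as np

def performance_profile(T, taus):
    """T: (n_problems, n_solvers) array of costs (np.inf marks a failure).
    Returns an array P with P[i, s] = fraction of problems on which
    solver s is within a factor taus[i] of the best solver."""
    T = np.asarray(T, dtype=float)
    best = np.min(T, axis=1, keepdims=True)   # best cost per problem
    ratios = T / best                          # performance ratios r_{p,s}
    return np.array([[np.mean(ratios[:, s] <= tau)
                      for s in range(T.shape[1])] for tau in taus])
```

Plotting P[:, s] against tau gives the curves shown in Figures 1–3: at tau = 1 the curve reads off how often a solver is best, and for large tau it reads off its overall success rate.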

4.2. Image Restoration Problems
Image restoration aims to recover the original image from an image corrupted by impulse noise; such problems are significant in the optimization field. The algorithm stops when either of two standard tolerance conditions is satisfied. The experiments use Lena, Barbara, and Man as the test images. Meanwhile, we compare the performance of the PRPFR algorithm with that of the modified PRP algorithm, where the step size is generated by Step 2 and Step 3 of the PRPFR algorithm. The detailed results are shown in Figures 4–6; both the PRPFR algorithm and the modified PRP algorithm are able to restore the noisy versions of these three images.
From Figures 4–6, both algorithms successfully restore the images at each tested level of impulse noise. According to the results in Table 5, the PRPFR algorithm is more effective than the modified PRP algorithm at every tested noise level.
5. Conclusion
In this paper, a new hybrid conjugate gradient algorithm that combines the PRP and FR methods is proposed, while using the projection technique. The direction has the sufficient descent and trust region properties automatically. Global convergence of the proposed algorithm is established under appropriate assumptions. The numerical experiments show that the proposed algorithm is competitive and efficient for solving nonlinear equations and image restoration problems.
For further research, some questions remain: (i) If the convex combination is applied to the quasi-Newton method, can better properties be obtained? (ii) Does this conjugate gradient method retain global convergence under other line search techniques? (iii) Can the proposed algorithm be applied to compressive sensing?
Data Availability
All data are included in the paper.
Conflicts of Interest
There are no potential conflicts of interest.
Acknowledgments
This work was supported by the High Level Innovation Teams and Excellent Scholars Program in Guangxi Institutions of Higher Education (Grant no. [2019]32), the National Natural Science Foundation of China (Grant no. 11661009), and the Guangxi Natural Science Key Foundation (Grant no. 2017GXNSFDA198046).
References
Y. Xiao and H. Zhu, "A conjugate gradient method to solve convex constrained monotone equations with applications in compressive sensing," Journal of Mathematical Analysis and Applications, vol. 405, no. 1, pp. 310–319, 2013.
K. Meintjes and A. P. Morgan, "Chemical equilibrium systems as numerical test problems," ACM Transactions on Mathematical Software, vol. 16, no. 2, pp. 143–151, 1990.
P. N. Brown and Y. Saad, "Convergence theory of nonlinear Newton-Krylov algorithms," SIAM Journal on Optimization, vol. 4, no. 2, pp. 297–330, 1994.
D.-H. Li and M. Fukushima, "A modified BFGS method and its global convergence in nonconvex minimization," Journal of Computational and Applied Mathematics, vol. 129, no. 1-2, pp. 15–35, 2001.
Y. Dai and Y. Yuan, "A nonlinear conjugate gradient method with a strong global convergence property," SIAM Journal on Optimization, vol. 10, pp. 177–182, 2000.
R. Fletcher, "Unconstrained optimization," Practical Methods of Optimization, vol. 1, 1987.
R. Fletcher and C. Reeves, "Function minimization by conjugate gradients," The Computer Journal, vol. 7, no. 2, pp. 149–154, 1964.
M. R. Hestenes and E. Stiefel, "Methods of conjugate gradients for solving linear systems," Journal of Research of the National Bureau of Standards, vol. 49, no. 6, pp. 409–432, 1952.
Y. Liu and C. Storey, "Efficient generalized conjugate gradient algorithms, part 1: theory," Journal of Optimization Theory and Applications, vol. 69, no. 1, pp. 129–137, 1991.
E. Polak and G. Ribière, "Note sur la convergence de méthodes de directions conjuguées," Revue Française d'Informatique et de Recherche Opérationnelle, Série Rouge, vol. 3, no. 16, pp. 35–43, 1969.
G. Yuan, J. Lu, and Z. Wang, "The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems," Applied Numerical Mathematics, vol. 152, pp. 1–11, 2020.
G. Yuan, X. Wang, and Z. Sheng, "Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions," Numerical Algorithms, vol. 84, no. 3, pp. 935–956, 2020.
G. Yuan, Z. Wei, and Y. Yang, "The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions," Journal of Computational and Applied Mathematics, vol. 362, pp. 262–275, 2019.
A. B. Abubakar and P. Kumam, "A descent Dai-Liao conjugate gradient method for nonlinear equations," Numerical Algorithms, vol. 81, no. 1, pp. 197–210, 2019.
A. B. Abubakar and P. Kumam, "An improved three-term derivative-free method for solving nonlinear equations," Computational and Applied Mathematics, vol. 37, no. 5, pp. 6760–6773, 2018.
Z. Dai and H. Zhu, "A modified Hestenes-Stiefel-type derivative-free method for large-scale nonlinear monotone equations," Mathematics, vol. 8, no. 2, p. 168, 2020.
G. Yuan, T. Li, and W. Hu, "A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems," Applied Numerical Mathematics, vol. 147, pp. 129–141, 2020.
G. Yuan and M. Zhang, "A three-terms Polak-Ribière-Polyak conjugate gradient algorithm for large-scale nonlinear equations," Journal of Computational and Applied Mathematics, vol. 286, pp. 186–195, 2015.
D. Touati-Ahmed and C. Storey, "Efficient hybrid conjugate gradient techniques," Journal of Optimization Theory and Applications, vol. 64, no. 2, pp. 379–397, 1990.
Y. Dai and Y. Yuan, "An efficient hybrid conjugate gradient method for unconstrained optimization," Annals of Operations Research, vol. 103, pp. 33–47, 2001.
J. Dong and B. Jiao, "A new hybrid PRP-DY conjugate gradient method," in Proceedings of the 2010 Third International Joint Conference on Computational Science and Optimization, vol. 2, pp. 70–74, 2010.
M. V. Solodov and B. F. Svaiter, "A globally convergent inexact Newton method for systems of monotone equations," in Reformulation: Nonsmooth, Piecewise Smooth, Semismooth and Smoothing Methods, pp. 355–369, 1998.
X. Y. Wang, S. J. Li, and X. P. Kou, "A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints," Calcolo, vol. 53, no. 2, pp. 133–145, 2016.
C. Wang, Y. Wang, and C. Xu, "A projection method for a system of nonlinear monotone equations with convex constraints," Mathematical Methods of Operations Research, vol. 66, no. 1, pp. 33–46, 2007.
K. Zahra and A. Ali, "A new hybrid conjugate gradient method for large-scale unconstrained optimization problem with nonconvex objective function," Computational and Applied Mathematics, vol. 38, p. 186, 2019.
E. G. Birgin and J. M. Martínez, "A spectral conjugate gradient method for unconstrained optimization," Applied Mathematics and Optimization, vol. 43, no. 2, pp. 117–128, 2001.
N. Andrei, "Another conjugate gradient algorithm with guaranteed descent and conjugacy conditions for large-scale unconstrained optimization," Journal of Optimization Theory and Applications, vol. 159, no. 1, pp. 159–182, 2013.
G. Yuan, Z. Wei, and S. Lu, "Limited memory BFGS method with backtracking for symmetric nonlinear equations," Mathematical and Computer Modelling, vol. 54, no. 1-2, pp. 367–377, 2011.
L. Zhang, W. Zhou, and D.-H. Li, "A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence," IMA Journal of Numerical Analysis, vol. 26, no. 4, pp. 629–640, 2006.
E. D. Dolan and J. J. Moré, "Benchmarking optimization software with performance profiles," Mathematical Programming, vol. 91, no. 2, pp. 201–213, 2002.
Copyright
Copyright © 2020 Yingjie Zhou et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.