Special Issue: Numerical and Analytical Methods for Variational Inequalities and Related Problems with Applications
Research Article | Open Access
A Hybrid Gradient-Projection Algorithm for Averaged Mappings in Hilbert Spaces
It is well known that the gradient-projection algorithm (GPA) is very useful in solving constrained convex minimization problems. In this paper, we combine a general iterative method with the gradient-projection algorithm to propose a hybrid gradient-projection algorithm and prove that the sequence generated by the hybrid gradient-projection algorithm converges in norm to a minimizer of the constrained convex minimization problem which also solves a variational inequality.
Let $H$ be a real Hilbert space and $C$ a nonempty closed and convex subset of $H$. Consider the following constrained convex minimization problem:
$$\min_{x\in C} f(x), \tag{1.1}$$
where $f: C \to \mathbb{R}$ is a real-valued convex and continuously Fréchet differentiable function. The gradient $\nabla f$ satisfies the following Lipschitz condition:
$$\|\nabla f(x)-\nabla f(y)\| \le L\|x-y\|, \quad x, y \in C, \tag{1.2}$$
where $L>0$. Assume that the minimization problem (1.1) is consistent, and let $S$ denote its solution set.
It is well known that the gradient-projection algorithm is very useful in dealing with constrained convex minimization problems and has been studied extensively ([1-5] and the references therein). It has recently been applied to solve split feasibility problems [6-10]. Levitin and Polyak [1] consider the following gradient-projection algorithm:
$$x_{n+1}=P_C\big(x_n-\gamma_n\nabla f(x_n)\big), \quad n\ge 0. \tag{1.3}$$
Let the step sizes $\gamma_n$ satisfy
$$0<\liminf_{n\to\infty}\gamma_n\le\limsup_{n\to\infty}\gamma_n<\frac{2}{L}. \tag{1.4}$$
It is proved that the sequence $\{x_n\}$ generated by (1.3) converges weakly to a minimizer of (1.1).
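To make the scheme concrete, here is a minimal numerical sketch of the GPA (1.3) in Python. The quadratic objective, ball constraint, and constant step size $\gamma=1/L$ (which satisfies (1.4)) are illustrative assumptions of ours, not taken from the paper.

```python
import numpy as np

def project_ball(x, radius=1.0):
    # Euclidean projection P_C onto the closed ball C = {x : ||x|| <= radius}
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def gradient_projection(grad, project, x0, step, n_iter=5000):
    # GPA (1.3): x_{n+1} = P_C(x_n - gamma_n * grad f(x_n)),
    # here with the constant step gamma_n = step in (0, 2/L), so (1.4) holds.
    x = x0
    for _ in range(n_iter):
        x = project(x - step * grad(x))
    return x

# Illustrative instance: f(x) = 0.5 * ||A x - b||^2 minimized over the unit ball
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 4))
b = A @ (2.0 * np.ones(4))          # the unconstrained minimizer lies outside the ball
L = np.linalg.norm(A.T @ A, 2)      # Lipschitz constant of grad f(x) = A^T (A x - b)
grad = lambda x: A.T @ (A @ x - b)

x_min = gradient_projection(grad, project_ball, np.zeros(4), 1.0 / L)
# Fixed-point residual: x* solves (1.1) iff x* = P_C(x* - gamma * grad f(x*))
residual = np.linalg.norm(project_ball(x_min - grad(x_min) / L) - x_min)
```

The residual measures how far `x_min` is from being a fixed point of the mapping $P_C(I-\gamma\nabla f)$, which characterizes minimizers of (1.1).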
Since the Lipschitz continuity of the gradient of $f$ implies that $\nabla f$ is $\frac{1}{L}$-inverse strongly monotone (ism) [12, 13], its complement $I-\gamma\nabla f$ (for $0<\gamma<2/L$) is an averaged mapping. Recall that a mapping is nonexpansive if and only if it is Lipschitz with Lipschitz constant not more than one; that a mapping is an averaged mapping if and only if it can be expressed as a proper convex combination of the identity mapping and a nonexpansive mapping, that is, $T=(1-\alpha)I+\alpha S$ with $\alpha\in(0,1)$ and $S$ nonexpansive; and that a mapping $T$ is said to be $\nu$-inverse strongly monotone if and only if $\langle Tx-Ty, x-y\rangle\ge\nu\|Tx-Ty\|^2$ for all $x,y$, where the number $\nu>0$. Recall also that the composite of finitely many averaged mappings is averaged. That is, if each of the mappings $\{T_i\}_{i=1}^N$ is averaged, then so is the composite $T_1\cdots T_N$. In particular, an averaged mapping is a nonexpansive mapping. As a result, the GPA can be rewritten as the composite of a projection and an averaged mapping, which is again an averaged mapping.
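The last remark is easy to illustrate numerically. The sketch below (with an ad hoc quadratic $f$ and ball constraint chosen by us) samples point pairs and checks that the composite $P_C(I-\gamma\nabla f)$ is nonexpansive for $\gamma\in(0,2/L)$:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 5))
b = rng.standard_normal(8)
grad = lambda x: A.T @ (A @ x - b)      # grad f for f(x) = 0.5 * ||A x - b||^2
L = np.linalg.norm(A.T @ A, 2)          # Lipschitz constant of grad f

def proj_ball(x, r=2.0):
    # Projection onto C = {x : ||x|| <= r}; projections are nonexpansive
    n = np.linalg.norm(x)
    return x if n <= r else x * (r / n)

gamma = 1.5 / L                          # any gamma in (0, 2/L): I - gamma*grad f is averaged
T = lambda x: proj_ball(x - gamma * grad(x))

# ||T x - T y|| <= ||x - y|| should hold for every sampled pair
max_ratio = 0.0
for _ in range(1000):
    x, y = rng.standard_normal(5), rng.standard_normal(5)
    max_ratio = max(max_ratio, np.linalg.norm(T(x) - T(y)) / np.linalg.norm(x - y))
```

Since the composite is averaged, hence nonexpansive, `max_ratio` never exceeds one.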
Generally speaking, in infinite-dimensional Hilbert spaces, the GPA has only weak convergence. Xu [11] provided a modification of the GPA so that strong convergence is guaranteed. He considered the following hybrid gradient-projection algorithm:
$$x_{n+1}=\theta_n h(x_n)+(1-\theta_n)P_C\big(x_n-\lambda_n\nabla f(x_n)\big), \quad n\ge 0.$$
On the other hand, Tian [16] introduced the following general iterative algorithm:
$$x_{n+1}=\alpha_n\gamma h(x_n)+(I-\mu\alpha_n F)Tx_n, \quad n\ge 0, \tag{1.8}$$
where $T$ is a nonexpansive mapping, $F$ is a $k$-Lipschitzian and $\eta$-strongly monotone operator with $k,\eta>0$ and $0<\mu<2\eta/k^2$, and $h$ is a contraction with coefficient $0<\rho<1$. He proved that if $\{\alpha_n\}$ satisfies appropriate conditions, then the sequence $\{x_n\}$ generated by (1.8) converges strongly to the unique solution of the variational inequality
$$\langle(\mu F-\gamma h)x^*, x-x^*\rangle\ge 0, \quad \forall x\in\operatorname{Fix}(T).$$
In this paper, motivated and inspired by the research work in this direction, we combine the iterative method (1.8) with the gradient-projection algorithm (1.3) and consider the following hybrid gradient-projection algorithm:
$$x_{n+1}=\theta_n\gamma h(x_n)+(I-\theta_n\mu F)P_C\big(x_n-\lambda_n\nabla f(x_n)\big), \quad n\ge 0. \tag{1.10}$$
We will prove that if the sequence $\{\theta_n\}$ of parameters and the sequence $\{\lambda_n\}$ of parameters satisfy appropriate conditions, then the sequence $\{x_n\}$ generated by (1.10) converges in norm to a minimizer of (1.1) which solves the variational inequality
$$\langle(\mu F-\gamma h)x^*, x-x^*\rangle\ge 0, \quad \forall x\in S, \tag{1.11}$$
where $S$ is the solution set of the minimization problem (1.1).
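For intuition, the following Python sketch instantiates scheme (1.10) under assumptions of our own choosing: $F=I$ (which is 1-Lipschitzian and 1-strongly monotone), $\mu=1$, $\gamma=1/2$, $h(x)=x/2$, a constant step $\lambda_n=1/L$, and $\theta_n=(n+2)^{-1/2}$. On a well-conditioned strongly convex toy problem the hybrid iterates approach the unique minimizer, which plain GPA also finds:

```python
import numpy as np

def proj_ball(x, r=1.0):
    # Projection onto the unit ball
    n = np.linalg.norm(x)
    return x if n <= r else x * (r / n)

def hybrid_gpa(grad, project, h, x0, step, mu=1.0, gamma=0.5, n_iter=40000):
    # Scheme (1.10) with F = I:
    #   x_{n+1} = theta_n*gamma*h(x_n) + (I - theta_n*mu*F) P_C(x_n - lambda_n*grad f(x_n))
    x = x0
    for n in range(n_iter):
        theta = (n + 2) ** -0.5          # theta_n -> 0 while sum theta_n diverges
        y = project(x - step * grad(x))  # the gradient-projection step
        x = theta * gamma * h(x) + y - theta * mu * y
    return x

# Toy instance: f(x) = 0.5 * ||A x - b||^2 over the unit ball, A well conditioned
rng = np.random.default_rng(0)
A = np.eye(4) + 0.1 * rng.standard_normal((4, 4))
b = A @ (2.0 * np.ones(4))               # pushes the constrained minimizer to the boundary
L = np.linalg.norm(A.T @ A, 2)
grad = lambda x: A.T @ (A @ x - b)
h = lambda x: 0.5 * x                    # a contraction with coefficient rho = 1/2

x_hybrid = hybrid_gpa(grad, proj_ball, h, np.zeros(4), 1.0 / L)

# Plain GPA reference; both converge to the same (unique) minimizer here
x_plain = np.zeros(4)
for _ in range(5000):
    x_plain = proj_ball(x_plain - grad(x_plain) / L)
gap = np.linalg.norm(x_hybrid - x_plain)
```

The parameters respect the constraints used later ($\mu<2\eta/k^2$ and $\gamma\rho<\tau=\mu(\eta-\mu k^2/2)$); because $\theta_n\to 0$ only slowly, the hybrid scheme needs more iterations than plain GPA to reach comparable accuracy.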
This section collects some lemmas which will be used in the proofs for the main results in the next section. Some of them are known; others are not hard to derive.
Throughout this paper, we write $x_n\rightharpoonup x$ to indicate that the sequence $\{x_n\}$ converges weakly to $x$, while $x_n\to x$ means that $\{x_n\}$ converges strongly to $x$. $\omega_w(x_n)$ is the weak $\omega$-limit set of the sequence $\{x_n\}$.
Lemma 2.1 (see [17]). Assume that $\{a_n\}$ is a sequence of nonnegative real numbers such that
$$a_{n+1}\le(1-\gamma_n)a_n+\gamma_n\delta_n+\beta_n, \quad n\ge 0,$$
where $\{\gamma_n\}$ and $\{\beta_n\}$ are sequences in $(0,1)$ and $\{\delta_n\}$ is a sequence in $\mathbb{R}$ such that (i) $\sum_{n=0}^{\infty}\gamma_n=\infty$; (ii) either $\limsup_{n\to\infty}\delta_n\le 0$ or $\sum_{n=0}^{\infty}\gamma_n|\delta_n|<\infty$; (iii) $\sum_{n=0}^{\infty}\beta_n<\infty$. Then $\lim_{n\to\infty}a_n=0$.
Lemma 2.2 (see [18]). Let $C$ be a closed and convex subset of a Hilbert space $H$, and let $T: C\to C$ be a nonexpansive mapping with $\operatorname{Fix}(T)\ne\emptyset$. If $\{x_n\}$ is a sequence in $C$ weakly converging to $x$ and if $\{(I-T)x_n\}$ converges strongly to $y$, then $(I-T)x=y$.
Lemma 2.3. Let $H$ be a Hilbert space, and let $C$ be a nonempty closed and convex subset of $H$, $h: C\to C$ a contraction with coefficient $0<\rho<1$, and $F: C\to C$ a $k$-Lipschitzian continuous operator and $\eta$-strongly monotone operator with $k,\eta>0$. Then, for $0<\gamma<\mu\eta/\rho$,
$$\langle x-y, (\mu F-\gamma h)x-(\mu F-\gamma h)y\rangle\ge(\mu\eta-\gamma\rho)\|x-y\|^2, \quad x, y\in C.$$
That is, $\mu F-\gamma h$ is strongly monotone with coefficient $\mu\eta-\gamma\rho$.
Lemma 2.4. Let $C$ be a closed and convex subset of a real Hilbert space $H$, and let $x\in H$ and $y\in C$ be given. Then, $y=P_C x$ if and only if there holds the inequality
$$\langle x-y, z-y\rangle\le 0, \quad \forall z\in C.$$
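The characterization in Lemma 2.4 is easy to probe numerically. The sketch below (our own illustration, taking $C$ to be the closed unit ball of $\mathbb{R}^5$) samples points $x\in H$ and $z\in C$ and checks the inequality for $y=P_C x$:

```python
import numpy as np

rng = np.random.default_rng(2)

def proj_ball(x, r=1.0):
    # Nearest-point projection onto C = {z : ||z|| <= r}
    n = np.linalg.norm(x)
    return x if n <= r else x * (r / n)

# Lemma 2.4: y = P_C(x)  iff  <x - y, z - y> <= 0 for every z in C
worst = -np.inf
for _ in range(1000):
    x = 3.0 * rng.standard_normal(5)        # a point of H = R^5
    y = proj_ball(x)                        # its nearest point in C
    z = proj_ball(rng.standard_normal(5))   # an arbitrary point of C
    worst = max(worst, float(np.dot(x - y, z - y)))
```

Geometrically, the inequality says that $C$ lies entirely in the half-space on the far side of the supporting hyperplane through $y$ with normal $x-y$.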
3. Main Results
Let $H$ be a real Hilbert space, and let $C$ be a nonempty closed and convex subset of $H$. Assume that the minimization problem (1.1) is consistent, and let $S$ denote its solution set. Assume that the gradient $\nabla f$ satisfies the Lipschitz condition (1.2). Since $C$ is a closed convex subset, the nearest point projection $P_C$ from $H$ onto $C$ is well defined. Recall also that a contraction on $C$ is a self-mapping $h$ of $C$ such that $\|h(x)-h(y)\|\le\rho\|x-y\|$ for all $x,y\in C$, where $\rho\in[0,1)$ is a constant. Let $F$ be a $k$-Lipschitzian and $\eta$-strongly monotone operator on $C$ with $k,\eta>0$. Denote by $\Pi$ the collection of all contractions on $C$, namely,
$$\Pi=\{h \mid h \text{ is a contraction on } C\}.$$
Now given $h\in\Pi$ with coefficient $\rho\in[0,1)$ and $s\in(0,1)$, let $0<\mu<2\eta/k^2$ and $0<\gamma<\mu(\eta-\mu k^2/2)/\rho=\tau/\rho$. Assume that $\lambda_s$ is continuous with respect to $s$ and, in addition, $\lambda_s\in[a,b]\subset(0,2/L)$. Consider a mapping $X_s$ on $C$ defined by
$$X_s x = s\gamma h(x)+(I-s\mu F)P_C\big(x-\lambda_s\nabla f(x)\big), \quad x\in C.$$
It is easy to see that $X_s$ is a contraction. Setting $V_s:=P_C(I-\lambda_s\nabla f)$, it is obvious that $V_s$ is a nonexpansive mapping. We can rewrite $X_s x$ as
$$X_s x = s\gamma h(x)+(I-s\mu F)V_s x.$$
First observe that, for $s\in(0,1)$, $X_s$ is a contraction with coefficient $1-s(\tau-\gamma\rho)$. Hence, $X_s$ has a unique fixed point, denoted $x_s$, which uniquely solves the fixed-point equation
$$x_s = s\gamma h(x_s)+(I-s\mu F)P_C\big(x_s-\lambda_s\nabla f(x_s)\big). \tag{3.6}$$
The next proposition summarizes the properties of $\{x_s\}$.
Proposition 3.1. Let $x_s$ be defined by (3.6). (i) $\{x_s\}$ is bounded for $s\in(0,1)$. (ii) $\lim_{s\to 0}\|x_s-V_s x_s\|=0$. (iii) The curve $s\mapsto x_s$ is continuous on $(0,1)$.
Proof. (i) Take a $\tilde{x}\in S$; then we have
$$\|x_s-\tilde{x}\| \le s\|\gamma h(x_s)-\mu F\tilde{x}\|+(1-s\tau)\|x_s-\tilde{x}\|.$$
It follows that
$$\|x_s-\tilde{x}\| \le \frac{\|\gamma h(\tilde{x})-\mu F\tilde{x}\|}{\tau-\gamma\rho}.$$
Hence, $\{x_s\}$ is bounded.
(ii) By the definition of $x_s$, we have $x_s-V_s x_s = s\big(\gamma h(x_s)-\mu F V_s x_s\big)$. Since $\{x_s\}$ is bounded, so are $\{h(x_s)\}$ and $\{F V_s x_s\}$, and hence $\|x_s-V_s x_s\|\to 0$ as $s\to 0$.
(iii) Take $s, s_0\in(0,1)$. Comparing the fixed-point equations (3.6) for $x_s$ and $x_{s_0}$, and using the boundedness of $\{x_s\}$ together with the continuity of $\lambda_s$ in $s$, we obtain $\|x_s-x_{s_0}\|\to 0$ as $s\to s_0$. This means $x_s$ is continuous in $s$.
Our main result in the following shows that $x_s$ converges in norm to a minimizer of (1.1) which solves some variational inequality.

Theorem 3.2. Let $x_s$ be defined by (3.6). Then $x_s$ converges in norm, as $s\to 0$, to a minimizer $x^*$ of (1.1) which solves the variational inequality
$$\langle(\mu F-\gamma h)x^*, x-x^*\rangle\ge 0, \quad \forall x\in S. \tag{3.12}$$
Proof. We first note the uniqueness of solutions of the variational inequality (3.12). By Lemma 2.3, $\mu F-\gamma h$ is strongly monotone, so the variational inequality (3.12) has only one solution. Let $x^*\in S$ denote the unique solution of (3.12).
To prove that $x_s\to x^*$ as $s\to 0$, we write, for a given $\tilde{x}\in S$,
$$x_s-\tilde{x} = s\big(\gamma h(x_s)-\mu F\tilde{x}\big)+(I-s\mu F)V_s x_s-(I-s\mu F)V_s\tilde{x}.$$
It follows that
$$\|x_s-\tilde{x}\|^2 \le (1-s\tau)\|x_s-\tilde{x}\|^2+s\langle\gamma h(x_s)-\mu F\tilde{x}, x_s-\tilde{x}\rangle.$$
Hence,
$$\|x_s-\tilde{x}\|^2 \le \frac{1}{\tau}\langle\gamma h(x_s)-\mu F\tilde{x}, x_s-\tilde{x}\rangle. \tag{3.16}$$
Since $\{x_s\}$ is bounded as $s\to 0$, we see that if $\{s_n\}$ is a sequence in $(0,1)$ such that $s_n\to 0$ and $x_{s_n}\rightharpoonup\bar{x}$, then by (3.16), $x_{s_n}\to\bar{x}$. We may further assume that $\lambda_{s_n}\to\lambda\in[a,b]$ due to condition (1.4). Notice that $P_C(I-\lambda\nabla f)$ is nonexpansive. It turns out that
$$\|x_{s_n}-P_C(I-\lambda\nabla f)x_{s_n}\| \le \|x_{s_n}-V_{s_n}x_{s_n}\|+|\lambda_{s_n}-\lambda|\,\|\nabla f(x_{s_n})\|.$$
From the boundedness of $\{x_s\}$ and $\lim_{s\to 0}\|x_s-V_s x_s\|=0$, we conclude that $\lim_{n\to\infty}\|x_{s_n}-P_C(I-\lambda\nabla f)x_{s_n}\|=0$. Since $x_{s_n}\rightharpoonup\bar{x}$, by Lemma 2.2, we obtain $\bar{x}=P_C(I-\lambda\nabla f)\bar{x}$. This shows that $\bar{x}\in S$.
We next prove that $\bar{x}$ is a solution of the variational inequality (3.12). Since
$$x_s = s\gamma h(x_s)+(I-s\mu F)V_s x_s,$$
we can derive that
$$(\mu F-\gamma h)x_s = -\frac{1}{s}(I-V_s)x_s+\mu\big(F x_s-F V_s x_s\big).$$
Therefore, for $\tilde{x}\in S$,
$$\langle(\mu F-\gamma h)x_s, x_s-\tilde{x}\rangle = -\frac{1}{s}\langle(I-V_s)x_s-(I-V_s)\tilde{x}, x_s-\tilde{x}\rangle+\mu\langle F x_s-F V_s x_s, x_s-\tilde{x}\rangle.$$
Since $V_s$ is nonexpansive, we obtain that $I-V_s$ is monotone, that is,
$$\langle(I-V_s)x-(I-V_s)y, x-y\rangle\ge 0, \quad x, y\in C.$$
Taking the limit through $s=s_n\to 0$ ensures that $\bar{x}$ is a solution to (3.12); that is to say,
$$\langle(\mu F-\gamma h)\bar{x}, \tilde{x}-\bar{x}\rangle\ge 0, \quad \forall\tilde{x}\in S.$$
Hence $\bar{x}=x^*$ by uniqueness. Therefore, $x_s\to x^*$ as $s\to 0$. The variational inequality (3.12) can be written as
$$\langle(I-\mu F+\gamma h)x^*-x^*, x-x^*\rangle\le 0, \quad \forall x\in S.$$
So, by Lemma 2.4, it is equivalent to the fixed-point equation
$$P_S(I-\mu F+\gamma h)x^* = x^*.$$
Taking $\mu=1$ and $F=I$ in Theorem 3.2, we get the following.
Corollary 3.3. We have that $x_s$ converges in norm as $s\to 0$ to a minimizer $x^*$ of (1.1) which solves the variational inequality
$$\langle(I-\gamma h)x^*, x-x^*\rangle\ge 0, \quad \forall x\in S.$$
Equivalently, we have $P_S(\gamma h)x^*=x^*$.
Taking $\mu=1$, $F=I$, and $\gamma=1$ in Theorem 3.2, we get the following.
Corollary 3.4. Let $x^*$ be the unique fixed point of the contraction $P_S h$. Then, $x_s$ converges in norm as $s\to 0$ to $x^*$, the unique solution of the variational inequality
$$\langle(I-h)x^*, x-x^*\rangle\ge 0, \quad \forall x\in S.$$
Finally, we consider the following hybrid gradient-projection algorithm:
$$x_{n+1}=\theta_n\gamma h(x_n)+(I-\theta_n\mu F)P_C\big(x_n-\lambda_n\nabla f(x_n)\big), \quad n\ge 0. \tag{3.29}$$
Assume that the sequence $\{\lambda_n\}$ satisfies the condition (1.4) and, in addition, that the following conditions are satisfied for $\{\lambda_n\}$ and $\{\theta_n\}\subset(0,1)$: (i) $\theta_n\to 0$; (ii) $\sum_{n=0}^{\infty}\theta_n=\infty$; (iii) $\sum_{n=0}^{\infty}|\theta_{n+1}-\theta_n|<\infty$; (iv) $\sum_{n=0}^{\infty}|\lambda_{n+1}-\lambda_n|<\infty$.
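As a finite-horizon sanity check (not a proof), one can compute partial sums for concrete candidate schedules; $\theta_n=(n+2)^{-1/2}$ and $\lambda_n=1+2^{-n-1}$ (taking $L=1$, so $\lambda_n\in(1,1.5]\subset(0,2/L)$ and (1.4) holds) are our own illustrative choices:

```python
# Finite-horizon check of conditions (i)-(iv) for illustrative schedules
# theta_n = (n + 2)**-0.5 and lambda_n = 1 + 2**-(n + 1), with L = 1.
N = 10000
theta = [(n + 2) ** -0.5 for n in range(N)]
lam = [1.0 + 2.0 ** -(n + 1) for n in range(N)]

theta_last = theta[-1]              # (i): theta_n tends to 0
sum_theta = sum(theta)              # (ii): partial sums grow like 2*sqrt(N)
# (iii): the total variation telescopes for a decreasing sequence
var_theta = sum(abs(theta[n + 1] - theta[n]) for n in range(N - 1))
# (iv): the steps change by a summable geometric amount
var_lam = sum(abs(lam[n + 1] - lam[n]) for n in range(N - 1))
```

Because both schedules are monotone, the variation sums (iii) and (iv) telescope and stay bounded while the partial sums of $\theta_n$ grow without bound, as conditions (i)-(iv) require.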
Theorem 3.5. Assume that the minimization problem (1.1) is consistent and the gradient $\nabla f$ satisfies the Lipschitz condition (1.2). Let $\{x_n\}$ be generated by algorithm (3.29) with the sequences $\{\theta_n\}$ and $\{\lambda_n\}$ satisfying the above conditions. Then, the sequence $\{x_n\}$ converges in norm to the $x^*$ that is obtained in Theorem 3.2.
Proof. (1) The sequence $\{x_n\}$ is bounded. Setting $V_n:=P_C(I-\lambda_n\nabla f)$, we can rewrite (3.29) as $x_{n+1}=\theta_n\gamma h(x_n)+(I-\theta_n\mu F)V_n x_n$. Indeed, we have, for $\tilde{x}\in S$,
$$\|x_{n+1}-\tilde{x}\| \le \big(1-(\tau-\gamma\rho)\theta_n\big)\|x_n-\tilde{x}\|+\theta_n\|\gamma h(\tilde{x})-\mu F\tilde{x}\|,$$
so that, by induction,
$$\|x_n-\tilde{x}\| \le \max\Big\{\|x_0-\tilde{x}\|, \frac{\|\gamma h(\tilde{x})-\mu F\tilde{x}\|}{\tau-\gamma\rho}\Big\}, \quad n\ge 0.$$
In particular, $\{x_n\}$ is bounded.
(2) We prove that $\|x_{n+1}-x_n\|\to 0$ as $n\to\infty$. Let $M>0$ be a constant bounding the relevant quantities, in particular $\sup_n\{\gamma\|h(x_n)\|+\mu\|F V_n x_n\|\}$ and $\sup_n\|\nabla f(x_n)\|$. We compute
$$\|V_n x_n-V_{n-1}x_{n-1}\| \le \|x_n-x_{n-1}\|+M|\lambda_n-\lambda_{n-1}|.$$
Combining the last two estimates, we can obtain
$$\|x_{n+1}-x_n\| \le \big(1-(\tau-\gamma\rho)\theta_n\big)\|x_n-x_{n-1}\|+M\big(|\theta_n-\theta_{n-1}|+|\lambda_n-\lambda_{n-1}|\big). \tag{3.36}$$
Apply Lemma 2.1 to (3.36), using conditions (i)-(iv), to conclude that $\|x_{n+1}-x_n\|\to 0$ as $n\to\infty$.
(3) We prove that $\omega_w(x_n)\subset S$. Let $\hat{x}\in\omega_w(x_n)$, and assume that $x_{n_j}\rightharpoonup\hat{x}$ for some subsequence $\{x_{n_j}\}$ of $\{x_n\}$. We may further assume that $\lambda_{n_j}\to\lambda\in[a,b]$ due to condition (1.4). Set $V:=P_C(I-\lambda\nabla f)$. Notice that $V$ is nonexpansive and $\operatorname{Fix}(V)=S$. It turns out that
$$\|x_{n_j}-V x_{n_j}\| \le \|x_{n_j}-x_{n_j+1}\|+\|x_{n_j+1}-V_{n_j}x_{n_j}\|+\|V_{n_j}x_{n_j}-V x_{n_j}\| \to 0.$$
So Lemma 2.2 guarantees that $\omega_w(x_n)\subset\operatorname{Fix}(V)=S$.
(4) We prove that $x_n\to x^*$ as $n\to\infty$, where $x^*$ is the unique solution of the variational inequality (3.12). First observe that, by (3), there is a subsequence $\{x_{n_j}\}$ of $\{x_n\}$ such that
$$\limsup_{n\to\infty}\langle(\gamma h-\mu F)x^*, x_n-x^*\rangle \le 0. \tag{3.38}$$
We now compute
$$\|x_{n+1}-x^*\|^2 \le \big(1-(\tau-\gamma\rho)\theta_n\big)\|x_n-x^*\|^2+2\theta_n\langle(\gamma h-\mu F)x^*, x_{n+1}-x^*\rangle. \tag{3.39}$$
Applying Lemma 2.1 to the inequality (3.39), together with (3.38), we get $\|x_n-x^*\|\to 0$ as $n\to\infty$.
Corollary 3.6 (see [11]). Let $\{x_n\}$ be generated by the following algorithm:
$$x_{n+1}=\theta_n h(x_n)+(1-\theta_n)P_C\big(x_n-\lambda_n\nabla f(x_n)\big), \quad n\ge 0.$$
Assume that the sequence $\{\lambda_n\}$ satisfies the conditions (1.4) and (iv) and that $\{\theta_n\}\subset(0,1)$ satisfies the conditions (i)-(iii). Then $\{x_n\}$ converges in norm to the $x^*$ obtained in Corollary 3.4.
Ming Tian is supported in part by the Fundamental Research Funds for the Central Universities (the Special Fund of Science in Civil Aviation University of China, no. ZXH2012K001) and by the Science Research Foundation of Civil Aviation University of China (no. 2012KYM03).
- E. S. Levitin and B. T. Poljak, “Minimization methods in the presence of constraints,” Žurnal Vyčislitel' noĭ Matematiki i Matematičeskoĭ Fiziki, vol. 6, pp. 787–823, 1966.
- P. H. Calamai and J. J. Moré, “Projected gradient methods for linearly constrained problems,” Mathematical Programming, vol. 39, no. 1, pp. 93–116, 1987.
- B. T. Polyak, Introduction to Optimization, Translations Series in Mathematics and Engineering, Optimization Software, New York, NY, USA, 1987.
- M. Su and H. K. Xu, “Remarks on the gradient-projection algorithm,” Journal of Nonlinear Analysis and Optimization, vol. 1, pp. 35–43, 2010.
- Y. Yao and H.-K. Xu, “Iterative methods for finding minimum-norm fixed points of nonexpansive mappings with applications,” Optimization, vol. 60, no. 6, pp. 645–658, 2011.
- Y. Censor and T. Elfving, “A multiprojection algorithm using Bregman projections in a product space,” Numerical Algorithms, vol. 8, no. 2–4, pp. 221–239, 1994.
- C. Byrne, “A unified treatment of some iterative algorithms in signal processing and image reconstruction,” Inverse Problems, vol. 20, no. 1, pp. 103–120, 2004.
- J. S. Jung, “Strong convergence of composite iterative methods for equilibrium problems and fixed point problems,” Applied Mathematics and Computation, vol. 213, no. 2, pp. 498–505, 2009.
- G. Lopez, V. Martin, and H.-K. Xu, “Iterative algorithms for the multiple-sets split feasibility problem,” in Biomedical Mathematics: Promising Directions in Imaging, Therapy Planning and Inverse Problems, Y. Censor, M. Jiang, and G. Wang, Eds., pp. 243–279, Medical Physics, Madison, Wis, USA, 2009.
- P. Kumam, “A hybrid approximation method for equilibrium and fixed point problems for a monotone mapping and a nonexpansive mapping,” Nonlinear Analysis, vol. 2, no. 4, pp. 1245–1255, 2008.
- H.-K. Xu, “Averaged mappings and the gradient-projection algorithm,” Journal of Optimization Theory and Applications, vol. 150, no. 2, pp. 360–378, 2011.
- P. Kumam, “A new hybrid iterative method for solution of equilibrium problems and fixed point problems for an inverse strongly monotone operator and a nonexpansive mapping,” Journal of Applied Mathematics and Computing, vol. 29, no. 1-2, pp. 263–280, 2009.
- H. Brezis, Operateur Maximaux Monotones et Semi-Groups de Contractions dans les Espaces de Hilbert, North-Holland, Amsterdam, The Netherlands, 1973.
- P. L. Combettes, “Solving monotone inclusions via compositions of nonexpansive averaged operators,” Optimization, vol. 53, no. 5-6, pp. 475–504, 2004.
- Y. Yao, Y.-C. Liou, and R. Chen, “A general iterative method for an infinite family of nonexpansive mappings,” Nonlinear Analysis, vol. 69, no. 5-6, pp. 1644–1654, 2008.
- M. Tian, “A general iterative algorithm for nonexpansive mappings in Hilbert spaces,” Nonlinear Analysis, vol. 73, no. 3, pp. 689–694, 2010.
- H.-K. Xu, “Iterative algorithms for nonlinear operators,” Journal of the London Mathematical Society. Second Series, vol. 66, no. 1, pp. 240–256, 2002.
- K. Goebel and W. A. Kirk, Topics in Metric Fixed Point Theory, vol. 28 of Cambridge Studies in Advanced Mathematics, Cambridge University Press, Cambridge, UK, 1990.
Copyright © 2012 Ming Tian and Min-Min Li. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.