Abstract

We consider the split feasibility problem (SFP) in Hilbert spaces. Inspired by the extragradient method presented by Ceng and Ansari for the split feasibility problem, the subgradient extragradient method proposed by Censor, and the variant extragradient-type method presented by Yao for variational inequalities, we suggest an extragradient-type algorithm for the SFP and prove its strong convergence under suitable conditions in infinite-dimensional Hilbert spaces.

1. Introduction

The convex feasibility problem (CFP) is to find a common point in the intersection of finitely many convex sets. A popular approach to the CFP is the projection algorithm, which employs orthogonal projections onto the sets; see [1]. When there are only two sets and constraints are imposed on the solutions in the domain of a linear operator as well as in this operator's range, the problem is called a split feasibility problem (SFP) [2]. In other words, the SFP is to find a point $x^* \in C$ such that $Ax^* \in Q$, where $C$ and $Q$ are nonempty closed convex subsets of real Hilbert spaces $H_1$ and $H_2$, respectively, and $A : H_1 \to H_2$ is a bounded linear operator. The SFP serves as a model for many inverse problems where constraints are imposed on the solutions in the domain of a linear operator as well as in this operator's range. There are a number of significant applications of the SFP in intensity-modulated radiation therapy, signal processing, image reconstruction, and others [2–4].

Various algorithms have been invented to solve the SFP; see [5–14] and the references therein. In particular, Byrne [5] introduced the so-called CQ algorithm, which takes an arbitrary initial point $x_0$ and defines the iterative step as $x_{k+1} = P_C\bigl(x_k - \gamma A^{*}(I - P_Q)Ax_k\bigr)$, where $0 < \gamma < 2/L$, $L$ is the spectral radius of $A^{*}A$, and $P_C$ denotes the metric projection onto the set $C$; that is, $P_C(x) = \arg\min_{y \in C}\|x - y\|$.
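As a concrete illustration, the CQ iteration above can be sketched in a few lines of Python. The particular instance below ($C = [0,1]^2$, $Q = [2,3]^2$, $A = 2I$) is an assumption chosen for this toy example, not data from the paper:

```python
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi]^n."""
    return np.clip(x, lo, hi)

def cq_algorithm(A, x0, proj_C, proj_Q, gamma, iters=100):
    """CQ iteration: x_{k+1} = P_C(x_k - gamma * A^T (I - P_Q) A x_k)."""
    x = x0.astype(float)
    for _ in range(iters):
        Ax = A @ x
        # gradient of (1/2) ||(I - P_Q) A x||^2
        grad = A.T @ (Ax - proj_Q(Ax))
        x = proj_C(x - gamma * grad)
    return x

# assumed toy instance: C = [0,1]^2, Q = [2,3]^2, A = 2I
A = 2.0 * np.eye(2)
gamma = 1.0 / np.linalg.norm(A.T @ A, 2)   # gamma < 2 / rho(A^T A)
x = cq_algorithm(A, np.array([5.0, -3.0]),
                 lambda v: project_box(v, 0.0, 1.0),
                 lambda v: project_box(v, 2.0, 3.0),
                 gamma)
# the unique solution of this instance is x* = (1, 1): x* in C and A x* = (2, 2) in Q
```

For this instance the iteration reaches the solution $(1,1)$ after a single projected gradient step and then remains fixed.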

The extragradient algorithm was first introduced by Korpelevich [15] for computing a solution of a variational inequality and exhibits fast convergence. Subsequently, Nadezhkina and Takahashi [16] applied the method to finding a common element of the set of fixed points of a nonexpansive mapping and the set of solutions of a variational inequality. Ceng et al. [17] proposed an extragradient method, and Yao et al. [18] presented a subgradient extragradient method, to solve the split feasibility problem, but all these algorithms have only weak convergence in infinite-dimensional Hilbert spaces. Hence, to obtain strong convergence, Censor et al. [19] presented a variant extragradient-type method and Censor et al. [20] proposed a subgradient extragradient method, both of which possess strong convergence for solving the variational inequality. Motivated by the works given above, in this paper we construct an extragradient-type method to solve the split feasibility problem. Strong convergence of the algorithm is proved under suitable conditions in infinite-dimensional Hilbert spaces.

The paper is organized as follows. Section 2 reviews some preliminaries. Section 3 gives a variant extragradient-type algorithm and shows its convergence. Section 4 gives some conclusions.

2. Preliminaries

Throughout the paper, $I$ denotes the identity operator and $\operatorname{Fix}(T)$ denotes the set of fixed points of an operator $T$; that is, $\operatorname{Fix}(T) = \{x : Tx = x\}$.

In this section, we review some concepts and basic results that will be used later.

Definition 1. Let $T$ be a mapping from a set $D \subseteq H$ into $H$. Then consider the following:
(a) $T$ is said to be nonexpansive if $\|Tx - Ty\| \le \|x - y\|$ for all $x, y \in D$;
(b) $T$ is said to be firmly nonexpansive if $\|Tx - Ty\|^2 \le \langle Tx - Ty,\, x - y\rangle$ for all $x, y \in D$;
(c) $T$ is said to be contractive on $D$ if there exists $\rho \in (0, 1)$ such that $\|Tx - Ty\| \le \rho\|x - y\|$ for all $x, y \in D$;
(d) $T$ is said to be monotone if $\langle Tx - Ty,\, x - y\rangle \ge 0$ for all $x, y \in D$;
(e) $T$ is said to be $\nu$-inverse strongly monotone ($\nu$-ism) if there exists a real number $\nu > 0$ such that $\langle Tx - Ty,\, x - y\rangle \ge \nu\|Tx - Ty\|^2$ for all $x, y \in D$.

The lemma below collects basic properties of the projection operator [21].

Lemma 2. Let $C \subseteq H$ be nonempty, closed, and convex. Then, for all $x, y \in H$ and $z \in C$,
(1) $\langle x - P_C x,\, z - P_C x\rangle \le 0$;
(2) $\|P_C x - P_C y\|^2 \le \langle P_C x - P_C y,\, x - y\rangle$;
(3) $\|P_C x - z\|^2 \le \|x - z\|^2 - \|x - P_C x\|^2$.

From (2) of Lemma 2, we know that the projection operator $P_C$ is monotone and nonexpansive and that $I - P_C$ is nonexpansive.
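The firm nonexpansiveness of the metric projection, $\|P_C x - P_C y\|^2 \le \langle P_C x - P_C y,\, x - y\rangle$, together with the nonexpansiveness of $P_C$ and $I - P_C$, can be checked numerically. The Euclidean unit ball used as $C$ below is an assumed example set:

```python
import numpy as np

rng = np.random.default_rng(0)

def proj_ball(x, r=1.0):
    """Euclidean projection onto the closed ball of radius r centered at 0."""
    nrm = np.linalg.norm(x)
    return x if nrm <= r else (r / nrm) * x

for _ in range(1000):
    x, y = rng.normal(size=3), rng.normal(size=3)
    px, py = proj_ball(x), proj_ball(y)
    # firm nonexpansiveness: ||Px - Py||^2 <= <Px - Py, x - y>
    assert np.dot(px - py, px - py) <= np.dot(px - py, x - y) + 1e-12
    # hence P_C is nonexpansive ...
    assert np.linalg.norm(px - py) <= np.linalg.norm(x - y) + 1e-12
    # ... and so is I - P_C
    assert np.linalg.norm((x - px) - (y - py)) <= np.linalg.norm(x - y) + 1e-12
```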

The lemmas below are necessary for the convergence analysis in the next section.

Lemma 3 (see [22]). Let $C$ be a nonempty closed convex subset of a real Hilbert space $H$, let $F : C \to H$ be $\nu$-inverse strongly monotone, and let $\lambda > 0$ be a constant. Then, for all $x, y \in C$, one has
$$\|(I - \lambda F)x - (I - \lambda F)y\|^2 \le \|x - y\|^2 + \lambda(\lambda - 2\nu)\|Fx - Fy\|^2.$$

In particular, if $0 < \lambda \le 2\nu$, then $I - \lambda F$ is nonexpansive.
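Lemma 3 can likewise be verified numerically. A symmetric positive semidefinite matrix $M$ defines a $\nu$-ism operator $F(x) = Mx$ with $\nu = 1/\lambda_{\max}(M)$; the matrix below is an assumption chosen for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.normal(size=(3, 3))
M = B.T @ B                        # symmetric positive semidefinite
nu = 1.0 / np.linalg.norm(M, 2)    # F(x) = Mx is nu-inverse strongly monotone
F = lambda x: M @ x

for lam in [0.5 * nu, nu, 2 * nu, 3 * nu]:
    for _ in range(200):
        x, y = rng.normal(size=3), rng.normal(size=3)
        # the inequality of Lemma 3
        lhs = np.linalg.norm((x - lam * F(x)) - (y - lam * F(y))) ** 2
        rhs = (np.linalg.norm(x - y) ** 2
               + lam * (lam - 2 * nu) * np.linalg.norm(F(x) - F(y)) ** 2)
        assert lhs <= rhs + 1e-9
        # for lam <= 2*nu, I - lam*F is nonexpansive
        if lam <= 2 * nu:
            assert np.sqrt(lhs) <= np.linalg.norm(x - y) + 1e-9
```

The check runs over several step sizes $\lambda$, including $\lambda > 2\nu$, where the inequality still holds but nonexpansiveness is no longer guaranteed.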

Lemma 4 (see [9]). Let $\{x_n\}$ and $\{z_n\}$ be bounded sequences in a Banach space and let $\{\beta_n\}$ be a sequence in $[0, 1]$ which satisfies the condition $0 < \liminf_{n\to\infty}\beta_n \le \limsup_{n\to\infty}\beta_n < 1$. Suppose that $x_{n+1} = \beta_n x_n + (1 - \beta_n)z_n$ for all $n \ge 0$ and $\limsup_{n\to\infty}\bigl(\|z_{n+1} - z_n\| - \|x_{n+1} - x_n\|\bigr) \le 0$; then $\lim_{n\to\infty}\|z_n - x_n\| = 0$.

Lemma 5 (see [23, Lemma ]). Let $\{a_n\}$ be a sequence of nonnegative real numbers satisfying the condition $a_{n+1} \le (1 - \gamma_n)a_n + \gamma_n\delta_n$, $n \ge 0$, where $\{\gamma_n\} \subset (0, 1)$ and $\{\delta_n\}$ are sequences of real numbers such that
(i) $\sum_{n=0}^{\infty}\gamma_n = \infty$, or, equivalently, $\prod_{n=0}^{\infty}(1 - \gamma_n) = 0$;
(ii) $\limsup_{n\to\infty}\delta_n \le 0$, or
(ii)′ $\sum_{n=0}^{\infty}\gamma_n\delta_n$ is convergent.
Then, $\lim_{n\to\infty}a_n = 0$.
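A quick numerical illustration of the recursion $a_{n+1} \le (1 - \gamma_n)a_n + \gamma_n\delta_n$: with the assumed choices $\gamma_n = 1/(n+2)$ (so $\sum_n \gamma_n = \infty$) and $\delta_n = 1/(n+1) \to 0$, the sequence $a_n$ is driven to zero, albeit slowly (roughly at rate $\log n / n$):

```python
# iterate a_{n+1} = (1 - gamma_n) a_n + gamma_n * delta_n starting from a_0 = 1
a = 1.0
for n in range(200000):
    gamma = 1.0 / (n + 2)   # sum of gamma_n diverges
    delta = 1.0 / (n + 1)   # delta_n -> 0, so limsup delta_n <= 0
    a = (1 - gamma) * a + gamma * delta
assert a < 1e-3  # a_n has decayed toward 0, as Lemma 5 predicts
```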

3. Algorithm and Its Convergence Analysis

In this section, we present the formal statement of our proposed algorithm. Denote the solution set of the SFP by $\Gamma = \{x^* \in C : Ax^* \in Q\}$.

3.1. Variant Extragradient-Type Method

Now, we give the extragradient-type algorithm.

Algorithm 6. Let $C$ be a nonempty, closed, and convex subset of a real Hilbert space $H_1$. Consider the sequences , , , and . For an arbitrary initial point $x_0$, let $x_n$ be the current iterate. Define a mapping as . For , compute

3.2. Convergence Analysis

In this section, we give the convergence analysis of Algorithm 6.

Theorem 7. Suppose that . Assume that the algorithm parameters , , , and satisfy the following conditions:(C1) and ;(C2) and ;(C3), , and ;(C4).Then, the sequences generated by Algorithm 6 converge strongly to a point in .

Proof. Picking , we divide the proof into several steps.
(1) First, we prove that , , , and are all bounded.
By conditions and , since and , we have , as . Hence, we may assume that, for all , ; then, . By the property of the projection, we know for any . Hence, that is, Thus, by (12b) and (14), we have Since , from Lemma 3, we know that is nonexpansive. From the above inequality, we have By (C3), we obtain . So, is nonexpansive. Therefore, . By induction, we get . Then, is bounded, and so are , , and .
(2) We claim that and .
Let . From the property of the projection, we know that is nonexpansive. Therefore, we can rewrite in (12c) as then where . It follows that Hence, Again, by using the nonexpansivity of and , we have By (12a), (12b), and (12c), we have Therefore, Since and , we derive that Note that , , and are bounded. Therefore, From (20) and (28), by Lemma 4, we obtain Hence, From (12b), (12c), Lemma 3, and the convexity of the norm, we deduce Therefore, we have Since , , and , we deduce By the property (ii) of the metric projection , we have ; hence, , where is some constant satisfying . It follows that ; therefore , which implies that Since , , and , we derive
(3) We show that .
By the property of the projection , we have then Hence Therefore, Since , we have Furthermore which implies By condition (C4) and and being bounded, we have ; hence, . We apply Lemma 5 to inequality (44) to deduce that .
The following example of the SFP will show that our algorithm is feasible.
Example. Let , and .
Select as a starting point for the algorithm. We take , , , and . Then, for the method, That is, the method obtains the approximate point in seven steps.

4. Conclusion

In this paper, we presented a strongly convergent method for solving the split feasibility problem in Hilbert spaces, inspired by methods for solving variational inequalities. Our results improve and extend previous algorithms for the split feasibility problem and related problems. Extension of this method to the multiple-set split feasibility problem is underway.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by Natural Science Foundation of Shanghai (14ZR1429200), Innovation Program of Shanghai Municipal Education Commission (15ZZ074), and National Science Foundation of China (71572113).