Research Article | Open Access

# Steepest-Descent Approach to Triple Hierarchical Constrained Optimization Problems

**Academic Editor:** Jong Kyu Kim

#### Abstract

We introduce and analyze a hybrid steepest-descent algorithm by combining Korpelevich's extragradient method, the steepest-descent method, and the averaged mapping approach to the gradient-projection algorithm. It is proven that under appropriate assumptions, the proposed algorithm converges strongly to the unique solution of a triple hierarchical constrained optimization problem (THCOP) over the common fixed point set of finitely many nonexpansive mappings, with constraints of finitely many generalized mixed equilibrium problems (GMEPs), finitely many variational inclusions, and a convex minimization problem (CMP) in a real Hilbert space.

#### 1. Introduction

Let $H$ be a real Hilbert space with inner product $\langle\cdot,\cdot\rangle$ and norm $\|\cdot\|$; let $C$ be a nonempty closed convex subset of $H$ and let $P_C$ be the metric projection of $H$ onto $C$. Let $T$ be a nonlinear mapping on $C$. We denote by $\mathrm{Fix}(T)$ the set of fixed points of $T$ and by $\mathbf{R}$ the set of all real numbers. A mapping $T$ is called $L$-Lipschitz continuous if there exists a constant $L\ge 0$ such that $\|Tx-Ty\|\le L\|x-y\|$ for all $x,y\in C$. In particular, if $L=1$ then $T$ is called a nonexpansive mapping; if $L\in[0,1)$ then $T$ is called a contraction.

Let $A:C\to H$ be a nonlinear mapping on $C$. The classical variational inequality problem (VIP) [1] is to find a point $x^*\in C$ such that $\langle Ax^*,x-x^*\rangle\ge 0$ for all $x\in C$. The solution set of VIP (2) is denoted by $\mathrm{VI}(C,A)$.

In 1976, Korpelevich [2] proposed an iterative algorithm for solving the VIP (2) in Euclidean space $\mathbf{R}^n$:
$$y_n=P_C(x_n-\tau Ax_n),\qquad x_{n+1}=P_C(x_n-\tau Ay_n),\quad n\ge 0,$$
with $\tau>0$ a given number, which is known as the extragradient method. See, for example, [3–7] and the references therein.
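For illustration, the extragradient iteration above can be sketched in NumPy. The operator, feasible set, and parameters below are illustrative choices of ours, not the setting of this paper: a rotation operator is monotone and Lipschitz but is not a gradient, which is exactly the situation where the extra (corrector) projection step pays off.

```python
import numpy as np

def extragradient(A, project, x0, tau, iters=2000):
    """Korpelevich's extragradient method for the VIP:
    find x* in C with <A(x*), x - x*> >= 0 for all x in C."""
    x = x0
    for _ in range(iters):
        y = project(x - tau * A(x))   # predictor: one gradient-projection step
        x = project(x - tau * A(y))   # corrector: re-evaluate the operator at y
    return x

# Rotation operator A(x) = Mx is monotone and 1-Lipschitz but not a gradient;
# plain gradient projection can fail to converge here, extragradient does not.
M = np.array([[0.0, -1.0], [1.0, 0.0]])
A = lambda x: M @ x
project = lambda x: np.clip(x, -1.0, 1.0)   # metric projection onto [-1,1]^2
x_star = extragradient(A, project, np.array([0.9, -0.4]), tau=0.1)
# The unique solution of this particular VIP is the origin.
```

With step size $\tau<1/L$ ($L=1$ here) the iterates spiral into the origin; the predictor step alone would slowly spiral outward.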

Let $\varphi:C\to\mathbf{R}$ be a real-valued function, let $A:C\to H$ be a nonlinear mapping, and let $\Theta:C\times C\to\mathbf{R}$ be a bifunction. In 2008, Peng and Yao [8] introduced the following generalized mixed equilibrium problem (GMEP) of finding $x\in C$ such that
$$\Theta(x,y)+\varphi(y)-\varphi(x)+\langle Ax,y-x\rangle\ge 0,\quad\forall y\in C.$$
We denote the set of solutions of GMEP (4) by $\mathrm{GMEP}(\Theta,\varphi,A)$.

In [8], Peng and Yao assumed that $\Theta:C\times C\to\mathbf{R}$ is a bifunction satisfying conditions (A1)–(A4) and $\varphi:C\to\mathbf{R}$ is a lower semicontinuous and convex function with restriction (B1) or (B2), where:
(A1) $\Theta(x,x)=0$ for all $x\in C$;
(A2) $\Theta$ is monotone; that is, $\Theta(x,y)+\Theta(y,x)\le 0$ for any $x,y\in C$;
(A3) $\Theta$ is upper-hemicontinuous; that is, for each $x,y,z\in C$, $\limsup_{t\to 0^+}\Theta(tz+(1-t)x,y)\le\Theta(x,y)$;
(A4) $\Theta(x,\cdot)$ is convex and lower semicontinuous for each $x\in C$;
(B1) for each $x\in H$ and $r>0$, there exist a bounded subset $D_x\subset C$ and $y_x\in C$ such that, for any $z\in C\setminus D_x$, $\Theta(z,y_x)+\varphi(y_x)-\varphi(z)+\frac{1}{r}\langle y_x-z,z-x\rangle<0$;
(B2) $C$ is a bounded set.

Given a positive number $r>0$, let $T_r^{(\Theta,\varphi)}:H\to C$ be the solution mapping of the auxiliary mixed equilibrium problem; that is, for each $x\in H$,
$$T_r^{(\Theta,\varphi)}(x)=\Big\{y\in C:\Theta(y,z)+\varphi(z)-\varphi(y)+\frac{1}{r}\langle y-x,z-y\rangle\ge 0,\ \forall z\in C\Big\}.$$

Let $f:C\to\mathbf{R}$ be a convex and continuously Fréchet differentiable functional. Consider the convex minimization problem (CMP) of minimizing $f$ over the constraint set $C$:
$$\min_{x\in C}f(x)$$
(assuming the existence of minimizers). We denote by $\Gamma$ the set of minimizers of CMP (8).

On the other hand, let $B:C\to H$ be a single-valued mapping of $C$ into $H$ and let $R:C\to 2^H$ be a set-valued mapping with $D(R)=C$. Consider the following variational inclusion: find a point $x\in C$ such that
$$0\in Bx+Rx.$$
We denote by $\mathrm{I}(B,R)$ the solution set of the variational inclusion (9). Let a set-valued mapping $R:D(R)\subset H\to 2^H$ be maximal monotone. We define the resolvent operator $J_{R,\lambda}:H\to\overline{D(R)}$ associated with $R$ and $\lambda$ as follows:
$$J_{R,\lambda}x=(I+\lambda R)^{-1}x,\quad\forall x\in H,$$
where $\lambda$ is a positive number.
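For intuition, the resolvent has a closed form in simple cases. The following sketch uses an example of our choosing, not one from the paper: on the real line, for $R=\partial|\cdot|$ the resolvent $J_{R,\lambda}=(I+\lambda R)^{-1}$ is the soft-thresholding map, and its firm nonexpansiveness (see Lemma 14 below) can be checked directly.

```python
import numpy as np

def resolvent_abs(x, lam):
    """Resolvent J_{R,lam} = (I + lam*R)^{-1} for R = subdifferential of |t|.
    Solving x in y + lam*sign(y) for y yields soft-thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

x, y, lam = 3.0, -0.5, 1.0
jx, jy = resolvent_abs(x, lam), resolvent_abs(y, lam)
# Firm nonexpansiveness on the line: <Jx - Jy, x - y> >= |Jx - Jy|^2.
assert (jx - jy) * (x - y) >= (jx - jy) ** 2
```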

Let $S,T:C\to C$ be two nonexpansive mappings. In 2009, Yao et al. [9] considered the following hierarchical VIP: find hierarchically a fixed point of $T$ which is a solution to the VIP for the monotone mapping $I-S$; namely, find $\tilde{x}\in\mathrm{Fix}(T)$ such that
$$\langle(I-S)\tilde{x},x-\tilde{x}\rangle\ge 0,\quad\forall x\in\mathrm{Fix}(T).$$
The solution set of the hierarchical VIP (11) is denoted by $\Lambda$. It is not hard to check that solving the hierarchical VIP (11) is equivalent to the fixed point problem of the composite mapping $P_{\mathrm{Fix}(T)}S$; that is, find $\tilde{x}\in C$ such that $\tilde{x}=P_{\mathrm{Fix}(T)}S\tilde{x}$. The authors [9] introduced and analyzed a two-step iterative algorithm for solving the hierarchical VIP (11).

In this paper, we introduce and study the following triple hierarchical constrained optimization problem (THCOP) with constraints of the CMP (8), finitely many GMEPs, and finitely many variational inclusions.

*Problem I*. Let , and be three positive integers. Assume that:
(i) is a convex and continuously Fréchet differentiable functional with -Lipschitz continuous gradient , is a nonexpansive mapping, and is -inverse-strongly monotone for and ;
(ii) is -inverse-strongly monotone and is -strongly monotone and -Lipschitz continuous;
(iii) is a bifunction from to satisfying (A1)–(A4), and is a lower semicontinuous and convex functional with restriction (B1) or (B2) for ;
(iv) is a maximal monotone mapping and is -inverse-strongly monotone for ;
(v) with .
Then the objective is to

Motivated and inspired by the above facts, we introduce and analyze a hybrid iterative algorithm combining Korpelevich's extragradient method, the steepest-descent method, and the gradient-projection algorithm obtained by the averaged mapping approach. It is proven that, under mild conditions, the proposed algorithm converges strongly to the unique solution of the THCOP (13). The results obtained here improve and extend existing results in this field.

#### 2. Preliminaries

Throughout this paper, we assume that $H$ is a real Hilbert space whose inner product and norm are denoted by $\langle\cdot,\cdot\rangle$ and $\|\cdot\|$, respectively. Let $C$ be a nonempty closed convex subset of $H$. We write $x_n\rightharpoonup x$ to indicate that the sequence $\{x_n\}$ converges weakly to $x$ and $x_n\to x$ to indicate that the sequence $\{x_n\}$ converges strongly to $x$. Moreover, we use $\omega_w(x_n)$ to denote the weak $\omega$-limit set of the sequence $\{x_n\}$; that is,
$$\omega_w(x_n)=\{x\in H:x_{n_i}\rightharpoonup x\ \text{for some subsequence}\ \{x_{n_i}\}\ \text{of}\ \{x_n\}\}.$$

*Definition 1. *A mapping $A:C\to H$ is called:
(i) monotone if $\langle Ax-Ay,x-y\rangle\ge 0$ for all $x,y\in C$;
(ii) $\eta$-strongly monotone if there exists a constant $\eta>0$ such that $\langle Ax-Ay,x-y\rangle\ge\eta\|x-y\|^2$ for all $x,y\in C$;
(iii) $\alpha$-inverse-strongly monotone if there exists a constant $\alpha>0$ such that $\langle Ax-Ay,x-y\rangle\ge\alpha\|Ax-Ay\|^2$ for all $x,y\in C$.

It is obvious that if $A$ is $\alpha$-inverse-strongly monotone, then $A$ is monotone and $\frac{1}{\alpha}$-Lipschitz continuous. Moreover, we also have that, for all $u,v\in C$ and $\lambda>0$,
$$\|(I-\lambda A)u-(I-\lambda A)v\|^2\le\|u-v\|^2+\lambda(\lambda-2\alpha)\|Au-Av\|^2.$$
So, if $\lambda\le 2\alpha$, then $I-\lambda A$ is a nonexpansive mapping from $C$ to $H$.
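This threshold $\lambda\le 2\alpha$ can be checked numerically. The sketch below uses an illustrative operator of our choosing: for $A(x)=Qx$ with $Q$ symmetric positive semidefinite, $A$ is $\alpha$-inverse-strongly monotone with $\alpha=1/\lambda_{\max}(Q)$, so $I-\lambda A$ should be nonexpansive up to the boundary value $\lambda=2\alpha$.

```python
import numpy as np

# A(x) = Qx with Q symmetric PSD is alpha-ism with alpha = 1/lambda_max(Q);
# here lambda_max(Q) = 2, so alpha = 0.5 and the threshold is lam = 2*alpha = 1.
Q = np.diag([1.0, 2.0])
alpha = 1.0 / np.max(np.diag(Q))
lam = 2 * alpha                      # boundary case lam <= 2*alpha

rng = np.random.default_rng(0)
for _ in range(100):
    u, v = rng.normal(size=2), rng.normal(size=2)
    lhs = np.linalg.norm((u - lam * Q @ u) - (v - lam * Q @ v))
    assert lhs <= np.linalg.norm(u - v) + 1e-12   # I - lam*A is nonexpansive
```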

The metric projection from $H$ onto $C$ is the mapping $P_C:H\to C$ which assigns to each point $x\in H$ the unique point $P_Cx\in C$ satisfying the property
$$\|x-P_Cx\|=\inf_{y\in C}\|x-y\|.$$

Some important properties of projections are gathered in the following proposition.

Proposition 2. *For given $x\in H$ and $z\in C$:*
(i) *$z=P_Cx$ if and only if $\langle x-z,y-z\rangle\le 0$ for all $y\in C$;*
(ii) *$z=P_Cx$ if and only if $\|x-z\|^2\le\|x-y\|^2-\|y-z\|^2$ for all $y\in C$;*
(iii) *$\langle P_Cx-P_Cy,x-y\rangle\ge\|P_Cx-P_Cy\|^2$ for all $y\in H$. (This implies that $P_C$ is nonexpansive and monotone.)*
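The variational characterization in Proposition 2(i) is easy to verify numerically. A small sketch, with the closed ball as an illustrative choice of $C$ (closed-form projection):

```python
import numpy as np

def proj_ball(x, radius=1.0):
    """Metric projection of x onto the closed ball C = {y : ||y|| <= radius}."""
    nx = np.linalg.norm(x)
    return x if nx <= radius else (radius / nx) * x

rng = np.random.default_rng(1)
x = rng.normal(size=3) * 3.0
z = proj_ball(x)
# Proposition 2(i): z = P_C(x)  <=>  <x - z, y - z> <= 0 for every y in C.
for _ in range(200):
    y = proj_ball(rng.normal(size=3) * 3.0)   # sample feasible points of C
    assert np.dot(x - z, y - z) <= 1e-10
```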

Next we list some elementary conclusions for the mixed equilibrium problem (MEP) of finding $x\in C$ such that $\Theta(x,y)+\varphi(y)-\varphi(x)\ge 0$ for all $y\in C$, whose solution set is denoted by $\mathrm{MEP}(\Theta,\varphi)$.

Proposition 3 (see [10]). *Assume that $\Theta:C\times C\to\mathbf{R}$ satisfies (A1)–(A4) and let $\varphi:C\to\mathbf{R}$ be a proper lower semicontinuous and convex function. Assume that either (B1) or (B2) holds. For $r>0$ and $x\in H$, define a mapping $T_r^{(\Theta,\varphi)}:H\to C$ as follows:
$$T_r^{(\Theta,\varphi)}(x)=\Big\{y\in C:\Theta(y,z)+\varphi(z)-\varphi(y)+\frac{1}{r}\langle y-x,z-y\rangle\ge 0,\ \forall z\in C\Big\},$$
for all $x\in H$. Then the following hold:*
(i) *for each $x\in H$, $T_r^{(\Theta,\varphi)}(x)$ is nonempty and single-valued;*
(ii) *$T_r^{(\Theta,\varphi)}$ is firmly nonexpansive; that is, for any $x,y\in H$, $\|T_r^{(\Theta,\varphi)}x-T_r^{(\Theta,\varphi)}y\|^2\le\langle T_r^{(\Theta,\varphi)}x-T_r^{(\Theta,\varphi)}y,x-y\rangle$;*
(iii) *$\mathrm{Fix}(T_r^{(\Theta,\varphi)})=\mathrm{MEP}(\Theta,\varphi)$;*
(iv) *$\mathrm{MEP}(\Theta,\varphi)$ is closed and convex;*
(v) *$\|T_s^{(\Theta,\varphi)}x-T_t^{(\Theta,\varphi)}x\|^2\le\frac{s-t}{s}\langle T_s^{(\Theta,\varphi)}x-T_t^{(\Theta,\varphi)}x,T_s^{(\Theta,\varphi)}x-x\rangle$ for all $s,t>0$ and $x\in H$.*

In the following, we recall some facts and tools in a real Hilbert space .

Lemma 4. *Let $X$ be a real inner product space. Then there holds the inequality
$$\|x+y\|^2\le\|x\|^2+2\langle y,x+y\rangle,\quad\forall x,y\in X.$$*

Lemma 5. *Let $H$ be a real Hilbert space. Then the following hold:*
(a) *$\|x-y\|^2=\|x\|^2-\|y\|^2-2\langle x-y,y\rangle$ for all $x,y\in H$;*
(b) *$\|\lambda x+\mu y\|^2=\lambda\|x\|^2+\mu\|y\|^2-\lambda\mu\|x-y\|^2$ for all $x,y\in H$ and $\lambda,\mu\in[0,1]$ with $\lambda+\mu=1$;*
(c) *if $\{x_n\}$ is a sequence in $H$ such that $x_n\rightharpoonup x$, it follows that
$$\limsup_{n\to\infty}\|x_n-y\|^2=\limsup_{n\to\infty}\|x_n-x\|^2+\|x-y\|^2,\quad\forall y\in H.$$*

*Definition 6. *A mapping $T:H\to H$ is said to be an averaged mapping if it can be written as the average of the identity $I$ and a nonexpansive mapping; that is,
$$T=(1-\alpha)I+\alpha S,$$
where $\alpha\in(0,1)$ and $S:H\to H$ is nonexpansive. More precisely, when the last equality holds, we say that $T$ is $\alpha$-averaged. Thus firmly nonexpansive mappings (particularly, projections) are $\frac{1}{2}$-averaged mappings.
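For example, since a projection $P$ is firmly nonexpansive and hence $\frac12$-averaged, writing $P=\frac12(I+N)$ forces $N=2P-I$ (the reflection across $C$), which must be nonexpansive. A quick numerical sketch, with a box projection chosen for illustration:

```python
import numpy as np

proj = lambda x: np.clip(x, 0.0, 1.0)   # projection onto the box [0,1]^2
N = lambda x: 2 * proj(x) - x           # reflection: P = (I + N)/2

rng = np.random.default_rng(2)
for _ in range(100):
    x, y = rng.normal(size=2) * 5, rng.normal(size=2) * 5
    # N must be nonexpansive for P to be 1/2-averaged
    assert np.linalg.norm(N(x) - N(y)) <= np.linalg.norm(x - y) + 1e-12
```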

Lemma 7 (see [11]). *Let $T:H\to H$ be a given mapping.*
(i) *$T$ is nonexpansive if and only if the complement $I-T$ is $\frac{1}{2}$-ism.*
(ii) *If $T$ is $\nu$-ism, then for $\gamma>0$, $\gamma T$ is $\frac{\nu}{\gamma}$-ism.*
(iii) *$T$ is averaged if and only if the complement $I-T$ is $\nu$-ism for some $\nu>\frac{1}{2}$. Indeed, for $\alpha\in(0,1)$, $T$ is $\alpha$-averaged if and only if $I-T$ is $\frac{1}{2\alpha}$-ism.*

Lemma 8 (see [11]). *Let $S,T,V:H\to H$ be given operators.*
(i) *If $T=(1-\alpha)S+\alpha V$ for some $\alpha\in(0,1)$ and if $S$ is averaged and $V$ is nonexpansive, then $T$ is averaged.*
(ii) *$T$ is firmly nonexpansive if and only if the complement $I-T$ is firmly nonexpansive.*
(iii) *If $T=(1-\alpha)S+\alpha V$ for some $\alpha\in(0,1)$ and if $S$ is firmly nonexpansive and $V$ is nonexpansive, then $T$ is averaged.*
(iv) *The composite of finitely many averaged mappings is averaged. That is, if each of the mappings $\{T_i\}_{i=1}^N$ is averaged, then so is the composite $T_1\cdots T_N$. In particular, if $T_1$ is $\alpha_1$-averaged and $T_2$ is $\alpha_2$-averaged, where $\alpha_1,\alpha_2\in(0,1)$, then the composite $T_1T_2$ is $\alpha$-averaged, where $\alpha=\alpha_1+\alpha_2-\alpha_1\alpha_2$.*
(v) *If the mappings $\{T_i\}_{i=1}^N$ are averaged and have a common fixed point, then
$$\bigcap_{i=1}^N\mathrm{Fix}(T_i)=\mathrm{Fix}(T_1\cdots T_N).$$
The notation $\mathrm{Fix}(T)$ denotes the set of all fixed points of the mapping $T$; that is, $\mathrm{Fix}(T)=\{x\in H:Tx=x\}$.*

Let $f:C\to\mathbf{R}$ be a convex functional with $L$-Lipschitz continuous gradient $\nabla f$. It is well known that the gradient-projection algorithm (GPA) generates a sequence $\{x_n\}$ determined by the gradient $\nabla f$ and the metric projection $P_C$:
$$x_{n+1}:=P_C(x_n-\lambda\nabla f(x_n)),\quad n\ge 0,$$
or more generally,
$$x_{n+1}:=P_C(x_n-\lambda_n\nabla f(x_n)),\quad n\ge 0,$$
where, in both (26) and (27), the initial guess $x_0$ is taken from $C$ arbitrarily, and the parameters $\lambda$ or $\lambda_n$ are positive real numbers. The convergence of algorithms (26) and (27) depends on the behavior of the gradient $\nabla f$.
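A minimal sketch of the GPA (26) with a constant step size, on an illustrative quadratic of our choosing (the setting of the paper is more general); a constant step $0<\lambda<2/L$ suffices when $\nabla f$ is $L$-Lipschitz:

```python
import numpy as np

def gpa(grad, project, x0, lam, iters=1000):
    """Gradient-projection algorithm x_{n+1} = P_C(x_n - lam * grad f(x_n))."""
    x = x0
    for _ in range(iters):
        x = project(x - lam * grad(x))
    return x

# Minimize f(x) = ||x - b||^2 / 2 over the box C = [0,1]^2;
# grad f(x) = x - b is 1-Lipschitz, so any lam in (0, 2) works.
b = np.array([2.0, -0.5])
grad = lambda x: x - b
project = lambda x: np.clip(x, 0.0, 1.0)
x_min = gpa(grad, project, np.zeros(2), lam=0.5)
# The constrained minimizer is the projection of b onto the box, i.e. (1, 0).
```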

Lemma 9 (see [12, Demiclosedness principle]). *Let $C$ be a nonempty closed convex subset of a real Hilbert space $H$. Let $T$ be a nonexpansive self-mapping on $C$. Then $I-T$ is demiclosed. That is, whenever $\{x_n\}$ is a sequence in $C$ weakly converging to some $x\in C$ and the sequence $\{(I-T)x_n\}$ strongly converges to some $y$, it follows that $(I-T)x=y$. Here $I$ is the identity operator of $H$.*

Lemma 10. *Let $A:C\to H$ be a monotone mapping. In the context of the variational inequality problem, the characterization of the projection (see Proposition 2(i)) implies that $u\in\mathrm{VI}(C,A)$ if and only if $u=P_C(u-\lambda Au)$ for some $\lambda>0$.*

Let $C$ be a nonempty closed convex subset of a real Hilbert space $H$. We introduce some notation. Let $\lambda$ be a number in $(0,1]$ and let $\mu>0$. Associated with a nonexpansive mapping $T:C\to H$, we define the mapping $T^\lambda:C\to H$ by
$$T^\lambda x:=Tx-\lambda\mu F(Tx),\quad\forall x\in C,$$
where $F:H\to H$ is an operator such that, for some positive constants $\kappa,\eta>0$, $F$ is $\kappa$-Lipschitzian and $\eta$-strongly monotone on $H$; that is, $F$ satisfies the conditions
$$\|Fx-Fy\|\le\kappa\|x-y\|,\qquad\langle Fx-Fy,x-y\rangle\ge\eta\|x-y\|^2,$$
for all $x,y\in H$.

Lemma 11 (see [13, Lemma 3.1]). *$T^\lambda$ is a contraction provided $0<\mu<\frac{2\eta}{\kappa^2}$; that is,
$$\|T^\lambda x-T^\lambda y\|\le(1-\lambda\tau)\|x-y\|,\quad\forall x,y\in C,$$
where $\tau=1-\sqrt{1-\mu(2\eta-\mu\kappa^2)}\in(0,1]$.*
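A quick numerical sanity check of Lemma 11 in its simplest instance (the identity choices below are ours, for illustration): with $T=I$ and $F=I$ we have $\kappa=\eta=1$, and for $\mu=1\in(0,2\eta/\kappa^2)$ the formula gives $\tau=1$, so $T^\lambda$ should be a $(1-\lambda)$-contraction.

```python
import numpy as np

# Simplest instance: T = I (nonexpansive), F = I (eta = kappa = 1), mu = 1.
eta, kappa, mu, lam = 1.0, 1.0, 1.0, 0.5
tau = 1 - np.sqrt(1 - mu * (2 * eta - mu * kappa**2))   # tau = 1 here
T_lam = lambda x: x - lam * mu * x                       # T^lam x = Tx - lam*mu*F(Tx)

rng = np.random.default_rng(3)
for _ in range(100):
    x, y = rng.normal(size=3), rng.normal(size=3)
    # contraction bound from Lemma 11 with coefficient 1 - lam*tau
    assert np.linalg.norm(T_lam(x) - T_lam(y)) <= (1 - lam * tau) * np.linalg.norm(x - y) + 1e-12
```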

Lemma 12 (see [13]). *Let $\{a_n\}$ be a sequence of nonnegative numbers satisfying the conditions
$$a_{n+1}\le(1-\gamma_n)a_n+\gamma_n\delta_n,\quad\forall n\ge 1,$$
where $\{\gamma_n\}$ and $\{\delta_n\}$ are sequences of real numbers such that*
(i) *$\{\gamma_n\}\subset[0,1]$ and $\sum_{n=1}^{\infty}\gamma_n=\infty$, or equivalently, $\prod_{n=1}^{\infty}(1-\gamma_n)=0$;*
(ii) *$\limsup_{n\to\infty}\delta_n\le 0$, or $\sum_{n=1}^{\infty}\gamma_n|\delta_n|<\infty$.*
*Then $\lim_{n\to\infty}a_n=0$.*

Recall that a Banach space $X$ is said to satisfy Opial's property [12] if, for any given sequence $\{x_n\}\subset X$ which converges weakly to an element $x\in X$, there holds the inequality
$$\liminf_{n\to\infty}\|x_n-x\|<\liminf_{n\to\infty}\|x_n-y\|,\quad\forall y\in X,\ y\ne x.$$
It is well known that every Hilbert space satisfies Opial's property [12].

Finally, recall that a set-valued mapping $T:D(T)\subset H\to 2^H$ is called monotone if, for all $x,y\in D(T)$, $f\in Tx$ and $g\in Ty$ imply
$$\langle f-g,x-y\rangle\ge 0.$$
A set-valued mapping $T$ is called maximal monotone if $T$ is monotone and $(I+\lambda T)D(T)=H$ for each $\lambda>0$, where $I$ is the identity mapping of $H$. We denote by $G(T)$ the graph of $T$. It is known that a monotone mapping $T$ is maximal if and only if, for $(x,f)\in H\times H$, $\langle f-g,x-y\rangle\ge 0$ for every $(y,g)\in G(T)$ implies $f\in Tx$. Let $A:C\to H$ be a monotone, $L$-Lipschitz-continuous mapping and let $N_Cv$ be the normal cone to $C$ at $v\in C$; that is,
$$N_Cv=\{w\in H:\langle v-u,w\rangle\ge 0,\ \forall u\in C\}.$$
Define
$$\widetilde{T}v=\begin{cases}Av+N_Cv,& v\in C,\\ \emptyset,& v\notin C.\end{cases}$$
Then $\widetilde{T}$ is maximal monotone and $0\in\widetilde{T}v$ if and only if $v\in\mathrm{VI}(C,A)$.

Let $R:D(R)\subset H\to 2^H$ be a maximal monotone mapping and let $\lambda,\mu>0$ be two positive numbers.

Lemma 13 (see [14]). *There holds the resolvent identity
$$J_{R,\lambda}x=J_{R,\mu}\Big(\frac{\mu}{\lambda}x+\Big(1-\frac{\mu}{\lambda}\Big)J_{R,\lambda}x\Big),\quad\forall x\in H.$$
For $\lambda,\mu>0$, there holds the relation
$$\|J_{R,\lambda}x-J_{R,\mu}y\|\le\|x-y\|+|\lambda-\mu|\Big(\frac{1}{\lambda}\|J_{R,\lambda}x-y\|+\frac{1}{\mu}\|x-J_{R,\mu}y\|\Big),\quad\forall x,y\in H.$$*

Based on Huang [15], there holds the following property for the resolvent operator $J_{R,\lambda}:H\to\overline{D(R)}$.

Lemma 14. *$J_{R,\lambda}$ is single-valued and firmly nonexpansive; that is,
$$\langle J_{R,\lambda}x-J_{R,\lambda}y,x-y\rangle\ge\|J_{R,\lambda}x-J_{R,\lambda}y\|^2,\quad\forall x,y\in H.$$
Consequently, $J_{R,\lambda}$ is nonexpansive and monotone.*

Lemma 15 (see [16]). *Let $R$ be a maximal monotone mapping with $D(R)=C$. Then for any given $\lambda>0$, $u\in C$ is a solution of the variational inclusion (9) if and only if $u$ satisfies
$$u=J_{R,\lambda}(u-\lambda Bu).$$*

Lemma 16 (see [17]). *Let $R$ be a maximal monotone mapping with $D(R)=C$ and let $B:C\to H$ be a strongly monotone, continuous, and single-valued mapping. Then, for each $z\in H$, the equation $z\in(B+\lambda R)x$ has a unique solution $x_\lambda$ for $\lambda>0$.*

Lemma 17 (see [16]). *Let $R$ be a maximal monotone mapping with $D(R)=C$ and let $B:C\to H$ be a monotone, continuous, and single-valued mapping. Then $(I+\lambda(R+B))C=H$ for each $\lambda>0$. In this case, $R+B$ is maximal monotone.*

#### 3. Main Results

In this section, we introduce and analyze a hybrid steepest-descent algorithm for finding a solution of the THCOP (13) with constraints of several problems: the CMP (8), finitely many GMEPs, and finitely many variational inclusions in a real Hilbert space. This algorithm is based on Korpelevich's extragradient method, the steepest-descent method, and the averaged mapping approach to the gradient-projection algorithm. We prove the strong convergence of the proposed algorithm to the unique solution of THCOP (13) under suitable conditions. Throughout this paper, let $\{T_i\}_{i=1}^N$ be nonexpansive mappings with $N$ an integer. We write $T_n:=T_{n\bmod N}$ for integer $n\ge 1$, with the mod function taking values in the set $\{1,2,\ldots,N\}$ (i.e., if $n=jN+q$ for some integers $j\ge 0$ and $0\le q<N$, then $T_n=T_N$ if $q=0$ and $T_n=T_q$ if $0<q<N$).

The following is to state and prove the main result in this paper.

Theorem 18. *Let be a nonempty closed convex subset of a real Hilbert space and let be a convex and continuously Fréchet differentiable functional with -Lipschitz continuous gradient . Let be three integers. Let be a bifunction from to satisfying (A1)–(A4), a lower semicontinuous and convex functional with restriction (B1) or (B2), and -inverse-strongly monotone for . Let be a maximal monotone mapping and let be -inverse-strongly monotone for . Let be a finite family of nonexpansive mappings on . Let be -inverse-strongly monotone and let be -strongly monotone and -Lipschitz continuous. Assume that with . Let , , , , and where and . For arbitrarily given , let be a sequence generated by*
**
where (here is nonexpansive and for each ). Assume that
**
and that the following conditions are satisfied:*(i)* and for all ;*(ii)* or ;*(iii)* or ;*(iv)* or ;*(v)* or for ;*(vi)* or for .** Then the following hold:*(a)* is bounded;*(b)*;*(c)* provided ;*(d)* converges strongly to the unique element of provided .*

*Proof. *Let $p\in\Omega$. Since $\nabla f$ is $L$-Lipschitzian, it follows that $\nabla f$ is $\frac{1}{L}$-ism. By Lemma 7(ii), we know that, for $\lambda>0$, $\lambda\nabla f$ is $\frac{1}{\lambda L}$-ism. So by Lemma 7(iii), we deduce that $I-\lambda\nabla f$ is $\frac{\lambda L}{2}$-averaged. Now since the projection $P_C$ is $\frac{1}{2}$-averaged, it is easy to see from Lemma 8(iv) that the composite $P_C(I-\lambda\nabla f)$ is $\frac{2+\lambda L}{4}$-averaged for $\lambda\in(0,\frac{2}{L})$. Hence we obtain that, for each $n\ge 0$, $P_C(I-\lambda_n\nabla f)$ is $\frac{2+\lambda_n L}{4}$-averaged for each $\lambda_n\in(0,\frac{2}{L})$. Therefore, we can write
$$P_C(I-\lambda_n\nabla f)=\frac{2-\lambda_n L}{4}I+\frac{2+\lambda_n L}{4}S_n,$$
where $S_n$ is nonexpansive for each $n\ge 0$. Since $\nabla f$ is $L$-Lipschitz continuous, we get
Putting , for all , we have
Put
for all and ,
for all , , and , where is the identity mapping on . Then we have that and .

We divide the rest of the proof into several steps.

*Step 1*. We prove that $\{x_n\}$ is bounded.

Indeed, utilizing (18) and Proposition 3(ii), we have
Utilizing (18) and Lemma 14 we have
Combining (50) and (51), we have
Since is -inverse strongly monotone and , we have
Utilizing Lemma 11, we deduce from (52), , and that for all
where . So, by induction we obtain
Hence is bounded. Since is -inverse strongly monotone, it is known that is -Lipschitz continuous. Thus, from (52), we get
Consequently, the boundedness of ensures the boundedness of , and . From and the nonexpansivity of , it follows that is bounded. Since is -Lipschitz continuous, is also bounded.

*Step 2*. We prove that .

Indeed, utilizing (18) and (40), we obtain that
where