Abstract and Applied Analysis
Volume 2012 (2012), Article ID 605389, 18 pages
http://dx.doi.org/10.1155/2012/605389
Review Article

Approximate Iteration Algorithm with Error Estimate for Fixed Point of Nonexpansive Mappings

Yongfu Su
Department of Mathematics, Tianjin Polytechnic University, Tianjin 300387, China

Received 12 July 2012; Accepted 7 August 2012

Academic Editor: Xiaolong Qin

Copyright © 2012 Yongfu Su. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The purpose of this article is to present a general viscosity iteration process $\{x_n\}$ defined by $x_{n+1}=(I-\alpha_n A)Tx_n+\beta_n\gamma f(x_n)+(\alpha_n-\beta_n)x_n$ and to study the convergence of $\{x_n\}$, where $T$ is a nonexpansive mapping and $A$ is a strongly positive linear operator. If $\{\alpha_n\}$ and $\{\beta_n\}$ satisfy appropriate conditions, then the iteration sequence $\{x_n\}$ converges strongly to the unique solution $x^*\in F(T)$ of the variational inequality $\langle(A-\gamma f)x^*,x-x^*\rangle\ge 0$ for all $x\in F(T)$. Meanwhile, an approximate iteration algorithm is presented which can be used to calculate fixed points of nonexpansive mappings and solutions of variational inequalities; an error estimate is also given. The results presented in this paper extend, generalize, and improve the results of Xu, of Marino and Xu, and of some others.

1. Introduction

Iteration methods for nonexpansive mappings have recently been applied to solve convex minimization problems; see, for example, [1–4] and the references therein. A typical problem is to minimize a quadratic function over the set of fixed points of a nonexpansive mapping on a real Hilbert space $H$:
$$\min_{x\in F(T)}\ \frac{1}{2}\langle Ax,x\rangle-\langle x,b\rangle,\tag{1.1}$$
where $F(T)$ is the fixed point set of a nonexpansive mapping $T$ on $H$, $b$ is a given point in $H$, and $A:H\to H$ is a strongly positive operator, that is, there exists a constant $\delta>0$ with the property
$$\langle Ax,x\rangle\ge\delta\|x\|^2,\quad\forall x\in H.\tag{1.2}$$
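For a concrete finite-dimensional illustration (not part of the original argument), take $H=\mathbb{R}^d$ and let $A$ be a symmetric positive definite matrix. Then
$$\langle Ax,x\rangle=x^{\top}Ax\ \ge\ \lambda_{\min}(A)\,\|x\|^2,\qquad\forall x\in\mathbb{R}^d,$$
so (1.2) holds with $\delta=\lambda_{\min}(A)$, and (1.1) becomes a quadratic program over the closed convex set $F(T)$.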

Recall that $T:H\to H$ is nonexpansive if $\|Tx-Ty\|\le\|x-y\|$ for all $x,y\in H$. Throughout the rest of this paper, we denote by $F(T)$ the fixed point set of $T$ and assume that $F(T)$ is nonempty. It is well known that $F(T)$ is closed and convex (cf. [5]). In [4] (see also [2]), it is proved that the sequence $\{x_n\}$ defined by the iteration method below, with the initial guess $x_0$ chosen arbitrarily,
$$x_{n+1}=(I-\alpha_n A)Tx_n+\alpha_n b,\quad n\ge 0,\tag{1.3}$$
converges strongly to the unique solution of the minimization problem (1.1) provided the sequence $\{\alpha_n\}$ satisfies certain conditions.
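As an illustration only (not taken from the paper), the following Python sketch runs iteration (1.3) in $H=\mathbb{R}^2$ with a symmetric positive definite matrix $A$, the nonexpansive mapping $T$ chosen as the metric projection onto a box, and $\alpha_n=1/(n+4)$; all concrete choices here are assumptions made for the example.

```python
import numpy as np

# Assumed example data: H = R^2, A symmetric positive definite,
# T = metric projection onto the box [0,1]^2 (projections onto closed convex sets are nonexpansive).
A = np.array([[2.0, 0.5],
              [0.5, 1.5]])
b = np.array([1.0, -1.0])

def T(x):
    # Projection onto [0,1]^2; its fixed point set F(T) is the box itself.
    return np.clip(x, 0.0, 1.0)

def iterate_1_3(x0, n_iter=5000):
    """Iteration (1.3): x_{n+1} = (I - alpha_n A) T x_n + alpha_n b."""
    x = x0.copy()
    for n in range(n_iter):
        alpha = 1.0 / (n + 4)        # alpha_n -> 0, sum alpha_n = infinity, sum |alpha_{n+1}-alpha_n| < infinity
        Tx = T(x)
        x = Tx - alpha * (A @ Tx) + alpha * b
    return x

x_star = iterate_1_3(np.array([5.0, -3.0]))
print(x_star)  # approximates the minimizer of (1/2)<Ax,x> - <x,b> over the box, as described for (1.1)
```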

On the other hand, Moudafi [6] introduced the viscosity approximation method for nonexpansive mappings (see [7] for further developments in both Hilbert and Banach spaces). Let $f$ be a contraction on $H$. Starting with an arbitrary initial guess $x_0\in H$, define a sequence $\{x_n\}$ recursively by
$$x_{n+1}=(1-\alpha_n)Tx_n+\alpha_n f(x_n),\quad n\ge 0,\tag{1.4}$$
where $\{\alpha_n\}$ is a sequence in $(0,1)$. It is proved in [6, 7] that, under certain appropriate conditions imposed on $\{\alpha_n\}$, the sequence $\{x_n\}$ generated by (1.4) converges strongly to the unique solution $x^*$ in $F(T)$ of the variational inequality
$$\langle(I-f)x^*,x-x^*\rangle\ge 0,\quad x\in F(T).\tag{1.5}$$

Recently (2006), Marino and Xu [2] combined the iteration method (1.3) with the viscosity approximation method (1.4) and considered the following general iteration method:
$$x_{n+1}=(I-\alpha_n A)Tx_n+\alpha_n\gamma f(x_n),\quad n\ge 0.\tag{1.6}$$
They proved that if the sequence $\{\alpha_n\}$ of parameters satisfies appropriate conditions, then the sequence $\{x_n\}$ generated by (1.6) converges strongly to the unique solution of the variational inequality
$$\langle(A-\gamma f)x^*,x-x^*\rangle\ge 0,\quad x\in F(T),\tag{1.7}$$
which is the optimality condition for the minimization problem
$$\min_{x\in F(T)}\ \frac{1}{2}\langle Ax,x\rangle-h(x),\tag{1.8}$$
where $h$ is a potential function for $\gamma f$ (i.e., $h'(x)=\gamma f(x)$ for $x\in H$).
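To recall why (1.7) is the optimality condition for (1.8), here is the standard first-order argument (stated under the additional assumption, as in [2], that $A$ is self-adjoint):
$$\nabla\Bigl(\tfrac12\langle Ax,x\rangle-h(x)\Bigr)=Ax-h'(x)=Ax-\gamma f(x)=(A-\gamma f)x,$$
and the first-order condition for $x^*$ to minimize a differentiable function over the closed convex set $F(T)$ reads
$$\Bigl\langle\nabla\Bigl(\tfrac12\langle Ax^*,x^*\rangle-h(x^*)\Bigr),\,x-x^*\Bigr\rangle\ge 0,\quad\forall x\in F(T),$$
which is exactly (1.7).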

The purpose of this paper is to present a general viscosity iteration process $\{x_n\}$ defined by
$$x_{n+1}=(I-\alpha_n A)Tx_n+\beta_n\gamma f(x_n)+(\alpha_n-\beta_n)x_n\tag{1.9}$$
and to study the convergence of $\{x_n\}$, where $T$ is a nonexpansive mapping and $A$ is a strongly positive linear operator. If $\{\alpha_n\}$ and $\{\beta_n\}$ satisfy appropriate conditions, then the iteration sequence $\{x_n\}$ converges strongly to the unique solution $x^*\in F(T)$ of the variational inequality (1.7). Meanwhile, an approximate iteration algorithm
$$x_{n+1}=(I-sA)Tx_n+t\gamma f(x_n)+(s-t)x_n\tag{1.10}$$
is presented which can be used to calculate fixed points of nonexpansive mappings and solutions of variational inequalities; a convergence rate estimate is also given. The results presented in this paper extend, generalize, and improve the results of Xu [7], Marino and Xu [2], and some others.

2. Preliminaries

This section collects some lemmas which will be used in the proofs for the main results in the next section. Some of them are known; others are not hard to derive.

Lemma 2.1 (see [3]). Assume that $\{a_n\}$ is a sequence of nonnegative real numbers such that
$$a_{n+1}\le(1-\lambda_n)a_n+\delta_n,\tag{2.1}$$
where $\{\lambda_n\}$ is a sequence in $(0,1)$ and $\{\delta_n\}$ is a sequence in $(-\infty,+\infty)$ such that (i) $\sum_{n=1}^{\infty}\lambda_n=\infty$; (ii) $\limsup_{n\to\infty}\delta_n/\lambda_n\le 0$, or $\sum_{n=1}^{\infty}|\delta_n|<\infty$. Then $\lim_{n\to\infty}a_n=0$.

Lemma 2.2 (see [5]). Let $H$ be a Hilbert space, $K$ a closed convex subset of $H$, and $T:K\to K$ a nonexpansive mapping with nonempty fixed point set $F(T)$. If $\{x_n\}$ is a sequence in $K$ converging weakly to $x$ and if $\{(I-T)x_n\}$ converges strongly to $y$, then $(I-T)x=y$.

The following lemma is not hard to prove.

Lemma 2.3. Let $H$ be a Hilbert space, $K$ a closed convex subset of $H$, $f:H\to H$ a contraction with coefficient $0<h<1$, and $A$ a strongly positive bounded linear operator with coefficient $\delta>0$. Then, for $0<\gamma<\delta/h$,
$$\langle x-y,(A-\gamma f)x-(A-\gamma f)y\rangle\ge(\delta-\gamma h)\|x-y\|^2,\quad x,y\in H.\tag{2.2}$$
That is, $A-\gamma f$ is strongly monotone with coefficient $\delta-\gamma h$.
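Indeed, (2.2) follows directly by combining the strong positivity of $A$ with the Cauchy–Schwarz inequality and the contraction property of $f$:
$$\langle x-y,(A-\gamma f)x-(A-\gamma f)y\rangle=\langle x-y,A(x-y)\rangle-\gamma\langle x-y,f(x)-f(y)\rangle\ge\delta\|x-y\|^2-\gamma h\|x-y\|^2=(\delta-\gamma h)\|x-y\|^2.$$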

Recall that the metric (nearest point) projection $P_K$ from a real Hilbert space $H$ onto a closed convex subset $K$ of $H$ is defined as follows: given $x\in H$, $P_Kx$ is the unique point in $K$ with the property
$$\|x-P_Kx\|=\min_{y\in K}\|x-y\|.\tag{2.3}$$
$P_K$ is characterized as follows.

Lemma 2.4. Let $K$ be a closed convex subset of a real Hilbert space $H$. Given $x\in H$ and $y\in K$, then $y=P_Kx$ if and only if the following inequality holds:
$$\langle x-y,y-z\rangle\ge 0,\quad\forall z\in K.\tag{2.4}$$

Lemma 2.5. Assume that $A$ is a strongly positive bounded linear operator on a Hilbert space $H$ with coefficient $\delta>0$ and $0<\rho\le\|A\|^{-1}$. Then $\|I-\rho A\|\le 1-\rho\delta$.

Proof. Recall the standard result in functional analysis that if $V$ is a bounded linear self-adjoint operator on $H$, then
$$\|V\|=\sup\{|\langle Vx,x\rangle|: x\in H,\ \|x\|=1\}.\tag{2.5}$$
Now for $x\in H$ with $\|x\|=1$, we see that
$$\langle(I-\rho A)x,x\rangle=1-\rho\langle Ax,x\rangle\ge 1-\rho\|A\|\ge 0\tag{2.6}$$
(i.e., $I-\rho A$ is positive). It follows that
$$\|I-\rho A\|=\sup\{\langle(I-\rho A)x,x\rangle: x\in H,\ \|x\|=1\}=\sup\{1-\rho\langle Ax,x\rangle: x\in H,\ \|x\|=1\}\le 1-\rho\delta.\tag{2.7}$$

The following lemma is also not hard to prove by induction.

Lemma 2.6. Assume that $\{a_n\}$ is a sequence of nonnegative real numbers such that
$$a_{n+1}\le(1-\lambda_n)a_n+(\lambda_n+\mu_n)M,\tag{2.8}$$
where $M$ is a nonnegative constant and $\{\lambda_n\},\{\mu_n\}$ are sequences in $[0,+\infty)$ such that (i) $\sum_{n=0}^{\infty}\lambda_n=\infty$; (ii) $\sum_{n=0}^{\infty}\mu_n<\infty$. Then $\{a_n\}$ is bounded.
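For completeness, here is a short induction argument, under the additional (harmless) assumption that $\lambda_n\le 1$ for all $n$, which holds in the application in Section 3, where $\lambda_n\le(\delta-\gamma h)\alpha_n<1$. Claim:
$$a_n\le\max\{a_0,M\}+M\sum_{k=0}^{n-1}\mu_k\le\max\{a_0,M\}+M\sum_{k=0}^{\infty}\mu_k<\infty.$$
Indeed, if the claim holds for $n$, then
$$a_{n+1}\le(1-\lambda_n)a_n+\lambda_nM+\mu_nM\le(1-\lambda_n)\max\{a_0,M\}+\lambda_nM+M\sum_{k=0}^{n-1}\mu_k+\mu_nM\le\max\{a_0,M\}+M\sum_{k=0}^{n}\mu_k,$$
so $\{a_n\}$ is bounded.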

Notation. We use $\to$ for strong convergence and $\rightharpoonup$ for weak convergence.

3. A General Iteration Algorithm with Bounded Linear Operator

Let $H$ be a real Hilbert space, $A$ a bounded linear operator on $H$, and $T$ a nonexpansive mapping on $H$. Assume that the fixed point set $F(T)=\{x\in H: Tx=x\}$ of $T$ is nonempty. Since $F(T)$ is closed and convex, the nearest point projection from $H$ onto $F(T)$ is well defined.

Throughout the rest of this paper, we always assume that $A$ is strongly positive, that is, there exists a constant $\delta>0$ such that
$$\langle Ax,x\rangle\ge\delta\|x\|^2,\quad\forall x\in H.\tag{3.1}$$
(Note: $\delta>0$ is throughout reserved to be the constant such that (3.1) holds.)

Recall also that a contraction on $H$ is a self-mapping $f$ of $H$ such that
$$\|f(x)-f(y)\|\le h\|x-y\|,\quad\forall x,y\in H,\tag{3.2}$$
where $h\in(0,1)$ is a constant called the contractive coefficient of $f$.

For a given contraction $f$ with contractive coefficient $0<h<1$, and $t\in[0,1)$, $s\in(0,1)$ with $t\le s$, such that $0\le t\le s<\|A\|^{-1}$ and $0<\gamma<\delta/h$, consider a mapping $S_{t,s}$ on $H$ defined by
$$S_{t,s}x=(I-sA)Tx+t\gamma f(x)+(s-t)x,\quad x\in H.\tag{3.3}$$
Assume that
$$\frac{s-t}{s}\to 0.\tag{3.4}$$
It is not hard to see that $S_{t,s}$ is a contraction for sufficiently small $s$; indeed, by Lemma 2.5 we have
$$\|S_{t,s}x-S_{t,s}y\|\le t\gamma\|f(x)-f(y)\|+\|(I-sA)(Tx-Ty)\|+\|(s-t)(x-y)\|\le(t\gamma h+1-s\delta+s-t)\|x-y\|=(1+t(\gamma h-1)-s(\delta-1))\|x-y\|.\tag{3.5}$$
Hence, $S_{t,s}$ has a unique fixed point, denoted by $x_{t,s}$, which uniquely solves the fixed point equation
$$x_{t,s}=(I-sA)Tx_{t,s}+t\gamma f(x_{t,s})+(s-t)x_{t,s}.\tag{3.6}$$
Note that $x_{t,s}$ indeed depends on $f$ as well, but we will suppress this dependence of $x_{t,s}$ on $f$ for simplicity of notation throughout the rest of this paper. We will also always use $\gamma$ to mean a number in $(0,\delta/h)$.

The next proposition summarizes the basic properties of $x_{t,s}$ $(t\le s)$.

Proposition 3.1. Let $x_{t,s}$ be defined via (3.6). (i) $\{x_{t,s}\}$ is bounded for $t\in[0,\|A\|^{-1})$, $s\in(0,\|A\|^{-1})$. (ii) $\lim_{s\to 0}\|x_{t,s}-Tx_{t,s}\|=0$. (iii) $\{x_{t,s}\}$ defines a continuous surface for $(t,s)\in[0,\|A\|^{-1})\times(0,\|A\|^{-1})$, $t\le s$, into $H$.

Proof. Observe, for $s\in(0,\|A\|^{-1})$, that $\|I-sA\|\le 1-s\delta$ by Lemma 2.5.
To show (i), pick $p\in F(T)$. We then have
$$\begin{aligned}
\|x_{t,s}-p\|&=\|(I-sA)(Tx_{t,s}-p)+t(\gamma f(x_{t,s})-Ap)+(s-t)(x_{t,s}-Ap)\|\\
&\le(1-s\delta)\|x_{t,s}-p\|+t\|\gamma f(x_{t,s})-Ap\|+(s-t)\|x_{t,s}-Ap\|\\
&\le(1-s\delta)\|x_{t,s}-p\|+s\gamma h\|x_{t,s}-p\|+s\|\gamma f(p)-Ap\|+(s-t)\|x_{t,s}-Ap\|\\
&=[1-s(\delta-\gamma h)]\|x_{t,s}-p\|+s\|\gamma f(p)-Ap\|+(s-t)\|x_{t,s}-Ap\|.
\end{aligned}\tag{3.7}$$
It follows that
$$\|x_{t,s}-p\|\le\frac{\|\gamma f(p)-Ap\|}{\delta-\gamma h}+\frac{s-t}{s}\cdot\frac{\|x_{t,s}-Ap\|}{\delta-\gamma h}.\tag{3.8}$$
Since $\|x_{t,s}-Ap\|\le\|x_{t,s}-p\|+\|p-Ap\|$ and $(s-t)/s\to 0$ by (3.4), the last term can be absorbed into the left-hand side for $s$ small enough. Hence $\{x_{t,s}\}$ is bounded.
(ii) The boundedness of $\{x_{t,s}\}$ implies that of $\{f(x_{t,s})\}$ and $\{ATx_{t,s}\}$; observing that
$$\|x_{t,s}-Tx_{t,s}\|=\|t\gamma f(x_{t,s})-sATx_{t,s}+(s-t)x_{t,s}\|,\tag{3.9}$$
we have
$$\lim_{s\to 0}\|x_{t,s}-Tx_{t,s}\|=0.\tag{3.10}$$
To prove (iii), take $t,t_0\in[0,\|A\|^{-1})$, $s,s_0\in(0,\|A\|^{-1})$ with $s\ge t$, $s_0\ge t_0$, and calculate
$$\begin{aligned}
\|x_{t,s}-x_{t_0,s_0}\|&=\bigl\|(t-t_0)\gamma f(x_{t,s})+t_0\gamma\bigl(f(x_{t,s})-f(x_{t_0,s_0})\bigr)-(s-s_0)ATx_{t,s}+(I-s_0A)\bigl(Tx_{t,s}-Tx_{t_0,s_0}\bigr)\\
&\qquad+(s-t)\bigl(x_{t,s}-x_{t_0,s_0}\bigr)+(s-s_0+t_0-t)x_{t_0,s_0}\bigr\|\\
&\le|t-t_0|\gamma\|f(x_{t,s})\|+t_0\gamma h\|x_{t,s}-x_{t_0,s_0}\|+|s-s_0|\,\|ATx_{t,s}\|+(1-s_0\delta)\|x_{t,s}-x_{t_0,s_0}\|\\
&\qquad+(s-t)\|x_{t,s}-x_{t_0,s_0}\|+\bigl[|s-s_0|+|t-t_0|\bigr]\|x_{t_0,s_0}\|,
\end{aligned}\tag{3.11}$$
which implies that
$$\bigl(s_0\delta-t_0\gamma h+t-s\bigr)\|x_{t,s}-x_{t_0,s_0}\|\le|t-t_0|\gamma\|f(x_{t,s})\|+|s-s_0|\,\|ATx_{t,s}\|+\bigl[|s-s_0|+|t-t_0|\bigr]\|x_{t_0,s_0}\|\to 0\tag{3.12}$$
as $t\to t_0$, $s\to s_0$. Note that
$$\lim_{t\to t_0,\,s\to s_0}\bigl(s_0\delta-t_0\gamma h+t-s\bigr)=s_0(\delta-1)-t_0(\gamma h-1)>0;\tag{3.13}$$
it is then obvious that
$$\lim_{t\to t_0,\,s\to s_0}\|x_{t,s}-x_{t_0,s_0}\|=0.\tag{3.14}$$
This completes the proof of Proposition 3.1.

Our first main result below shows that $x_{t,s}$ converges strongly as $s\to 0$ to a fixed point of $T$ which solves a certain variational inequality.

Theorem 3.2. The net $x_{t,s}$ converges strongly as $s\to 0$ (with $t\le s$) to a fixed point $\tilde{x}$ of $T$ which solves the variational inequality
$$\langle(A-\gamma f)\tilde{x},\tilde{x}-z\rangle\le 0,\quad z\in F(T).\tag{3.15}$$
Equivalently, $P_{F(T)}(I-A+\gamma f)\tilde{x}=\tilde{x}$, where $P_{F(T)}(\cdot)$ is the nearest point projection from $H$ onto $F(T)$.

Proof. We first show the uniqueness of a solution of the variational inequality (3.15), which is indeed a consequence of the strong monotonicity of $A-\gamma f$. Suppose $\tilde{x}\in F(T)$ and $\hat{x}\in F(T)$ are both solutions to (3.15); then
$$\langle(A-\gamma f)\tilde{x},\tilde{x}-\hat{x}\rangle\le 0,\qquad\langle(A-\gamma f)\hat{x},\hat{x}-\tilde{x}\rangle\le 0.\tag{3.16}$$
Adding up the two inequalities in (3.16) gives
$$\langle(A-\gamma f)\tilde{x}-(A-\gamma f)\hat{x},\tilde{x}-\hat{x}\rangle\le 0.\tag{3.17}$$
The strong monotonicity of $A-\gamma f$ (Lemma 2.3) implies that $\tilde{x}=\hat{x}$, and the uniqueness is proved. Below we use $\tilde{x}\in F(T)$ to denote the unique solution of (3.15).
To prove that $x_{t,s}$ converges strongly to $\tilde{x}$, we write, for a given $z\in F(T)$,
$$x_{t,s}-z=t\bigl(\gamma f(x_{t,s})-Az\bigr)+(I-sA)\bigl(Tx_{t,s}-z\bigr)+(s-t)\bigl(x_{t,s}-Az\bigr)\tag{3.18}$$
to derive that
$$\begin{aligned}
\|x_{t,s}-z\|^2&=t\langle\gamma f(x_{t,s})-Az,x_{t,s}-z\rangle+\langle(I-sA)(Tx_{t,s}-z),x_{t,s}-z\rangle+(s-t)\langle x_{t,s}-Az,x_{t,s}-z\rangle\\
&\le(1-s\delta)\|x_{t,s}-z\|^2+t\langle\gamma f(x_{t,s})-Az,x_{t,s}-z\rangle+(s-t)\langle x_{t,s}-Az,x_{t,s}-z\rangle.
\end{aligned}\tag{3.19}$$
It follows that
$$\begin{aligned}
\|x_{t,s}-z\|^2&\le\frac{t}{s\delta}\langle\gamma f(x_{t,s})-Az,x_{t,s}-z\rangle+\frac{s-t}{s\delta}\langle x_{t,s}-Az,x_{t,s}-z\rangle\\
&=\frac{t}{s\delta}\bigl\{\gamma\langle f(x_{t,s})-f(z),x_{t,s}-z\rangle+\langle\gamma f(z)-Az,x_{t,s}-z\rangle\bigr\}+\frac{s-t}{s\delta}\langle x_{t,s}-Az,x_{t,s}-z\rangle\\
&\le\frac{t}{s\delta}\bigl\{\gamma h\|x_{t,s}-z\|^2+\langle\gamma f(z)-Az,x_{t,s}-z\rangle\bigr\}+\frac{s-t}{s\delta}\langle x_{t,s}-Az,x_{t,s}-z\rangle,
\end{aligned}\tag{3.20}$$
which leads to
$$\|x_{t,s}-z\|^2\le\frac{t}{s\delta-t\gamma h}\langle\gamma f(z)-Az,x_{t,s}-z\rangle+\frac{s-t}{s\delta-t\gamma h}\langle x_{t,s}-Az,x_{t,s}-z\rangle.\tag{3.21}$$
Observe that condition (3.4) implies
$$\frac{s-t}{s\delta-t\gamma h}\to 0\tag{3.22}$$
as $s\to 0$. Since $x_{t,s}$ is bounded as $s\to 0$, $s\ge t$, there exist real sequences $\{s_n\},\{t_n\}$ in $[0,1]$ such that $s_n\to 0$, $s_n\ge t_n$, and $\{x_{t_n,s_n}\}$ converges weakly to a point $x^*\in H$. Using Proposition 3.1 and Lemma 2.2, we see that $x^*\in F(T)$; therefore, by (3.21), $x_{t_n,s_n}\to x^*$. We next prove that $x^*$ solves the variational inequality (3.15). Since
$$x_{t,s}=(I-sA)Tx_{t,s}+t\gamma f(x_{t,s})+(s-t)x_{t,s},\tag{3.23}$$
we derive that
$$s(A-\gamma f)x_{t,s}=sAx_{t,s}-x_{t,s}+t\gamma f(x_{t,s})-s\gamma f(x_{t,s})+(I-sA)Tx_{t,s}+(s-t)x_{t,s}=(I-sA)\bigl(Tx_{t,s}-x_{t,s}\bigr)+(s-t)\bigl(x_{t,s}-\gamma f(x_{t,s})\bigr),\tag{3.24}$$
so that
$$(A-\gamma f)x_{t,s}=\frac{1}{s}(I-sA)\bigl(Tx_{t,s}-x_{t,s}\bigr)+\frac{s-t}{s}\bigl(x_{t,s}-\gamma f(x_{t,s})\bigr).\tag{3.25}$$
It follows that, for $z\in F(T)$,
$$\begin{aligned}
\langle(A-\gamma f)x_{t,s},x_{t,s}-z\rangle&=\frac{1}{s}\langle(I-sA)(Tx_{t,s}-x_{t,s}),x_{t,s}-z\rangle+\frac{s-t}{s}\langle x_{t,s}-\gamma f(x_{t,s}),x_{t,s}-z\rangle\\
&=-\frac{1}{s}\langle(I-T)x_{t,s}-(I-T)z,x_{t,s}-z\rangle+\langle A(I-T)x_{t,s},x_{t,s}-z\rangle+\frac{s-t}{s}\langle x_{t,s}-\gamma f(x_{t,s}),x_{t,s}-z\rangle\\
&\le\langle A(I-T)x_{t,s},x_{t,s}-z\rangle+\frac{s-t}{s}\langle x_{t,s}-\gamma f(x_{t,s}),x_{t,s}-z\rangle,
\end{aligned}\tag{3.26}$$
since $I-T$ is monotone (i.e., $\langle x-y,(I-T)x-(I-T)y\rangle\ge 0$ for $x,y\in H$), which is due to the nonexpansivity of $T$. Now replacing $t,s$ in (3.26) with $t_n,s_n$ and letting $n\to\infty$, and noticing that $(I-T)x_{t_n,s_n}\to(I-T)x^*=0$ for $x^*\in F(T)$, we obtain
$$\langle(A-\gamma f)x^*,x^*-z\rangle\le 0.\tag{3.27}$$
That is, $x^*\in F(T)$ is a solution of (3.15); hence $x^*=\tilde{x}$ by uniqueness. In summary, we have shown that each cluster point of $x_{t,s}$ (as $s\to 0$) equals $\tilde{x}$. Therefore, $x_{t,s}\to\tilde{x}$ as $s\to 0$.
The variational inequality (3.15) can be rewritten as
$$\langle(I-A+\gamma f)\tilde{x}-\tilde{x},\tilde{x}-z\rangle\ge 0,\quad z\in F(T).\tag{3.28}$$
This, by Lemma 2.4, is equivalent to the fixed point equation
$$P_{F(T)}(I-A+\gamma f)\tilde{x}=\tilde{x}.\tag{3.29}$$
This completes the proof.

Taking $t=s$ in Theorem 3.2, we get the following corollary.

Corollary 3.3 (see [7]). The net $x_t=x_{t,t}$ converges strongly as $t\to 0$ to a fixed point $\tilde{x}$ of $T$ which solves the variational inequality
$$\langle(A-\gamma f)\tilde{x},\tilde{x}-z\rangle\le 0,\quad z\in F(T).\tag{3.30}$$
Equivalently, $P_{F(T)}(I-A+\gamma f)\tilde{x}=\tilde{x}$, where $P_{F(T)}(\cdot)$ is the nearest point projection from $H$ onto $F(T)$.

Next we study a general iteration method as follows. The initial guess $x_0$ is selected in $H$ arbitrarily, and the $(n+1)$th iterate $x_{n+1}$ is recursively defined by
$$x_{n+1}=(I-\alpha_nA)Tx_n+\beta_n\gamma f(x_n)+(\alpha_n-\beta_n)x_n,\tag{3.31}$$
where $\{\alpha_n\}\subset(0,1)$ and $\{\beta_n\}\subset[0,1)$ with $\beta_n\le\alpha_n$ are sequences satisfying the following conditions:
(C1) $\alpha_n\to 0$;
(C2) $\sum_{n=0}^{\infty}\alpha_n=\infty$;
(C3) either $\sum_{n=0}^{\infty}|\alpha_{n+1}-\alpha_n|<\infty$ or $\lim_{n\to\infty}(\alpha_{n+1}/\alpha_n)=1$;
(C4) $\sum_{n=0}^{\infty}(\alpha_n-\beta_n)<\infty$.
Below is the second main result of this paper.

Theorem 3.4. Let $\{x_n\}$ be generated by algorithm (3.31) with the sequences $\{\alpha_n\},\{\beta_n\}$ of parameters satisfying conditions (C1)–(C4). Then $\{x_n\}$ converges strongly to the point $\tilde{x}$ obtained in Theorem 3.2.

Proof. Since $\alpha_n\to 0$ by condition (C1), we may assume, without loss of generality, that $\alpha_n<\|A\|^{-1}$ for all $n$.
We first show that $\{x_n\}$ is bounded. Indeed, pick any $p\in F(T)$ to obtain
$$\begin{aligned}
\|x_{n+1}-p\|&=\|(I-\alpha_nA)(Tx_n-p)+\beta_n(\gamma f(x_n)-Ap)+(\alpha_n-\beta_n)(x_n-Ap)\|\\
&\le\|I-\alpha_nA\|\,\|Tx_n-p\|+\beta_n\|\gamma f(x_n)-Ap\|+(\alpha_n-\beta_n)\|x_n-Ap\|\\
&\le(1-\alpha_n\delta)\|x_n-p\|+\beta_n\bigl[\gamma\|f(x_n)-f(p)\|+\|\gamma f(p)-Ap\|\bigr]+(\alpha_n-\beta_n)\|x_n-Ap\|\\
&\le(1-\delta\alpha_n)\|x_n-p\|+\alpha_n\gamma h\|x_n-p\|-\alpha_n\gamma h\|x_n-p\|+\beta_n\gamma h\|x_n-p\|\\
&\qquad+\beta_n\|\gamma f(p)-Ap\|+(\alpha_n-\beta_n)\|x_n-Ap\|\\
&\le\bigl[1-(\delta-\gamma h)\alpha_n\bigr]\|x_n-p\|-(\alpha_n-\beta_n)\gamma h\|x_n-p\|+\beta_n\|\gamma f(p)-Ap\|+(\alpha_n-\beta_n)\bigl(\|x_n-p\|+\|p-Ap\|\bigr)\\
&\le\bigl[1-(\delta-\gamma h)\alpha_n\bigr]\|x_n-p\|+(\alpha_n-\beta_n)(1-\gamma h)\|x_n-p\|+\beta_n\|\gamma f(p)-Ap\|+(\alpha_n-\beta_n)\|p-Ap\|\\
&\le\bigl[1-\bigl((\delta-\gamma h)\alpha_n-(\alpha_n-\beta_n)|1-\gamma h|\bigr)\bigr]\|x_n-p\|+\beta_n\|\gamma f(p)-Ap\|+(\alpha_n-\beta_n)\|p-Ap\|\\
&\le\bigl[1-\bigl((\delta-\gamma h)\alpha_n-(\alpha_n-\beta_n)|1-\gamma h|\bigr)\bigr]\|x_n-p\|+(\delta-\gamma h)\alpha_n\frac{M}{\delta-\gamma h},
\end{aligned}\tag{3.32}$$
where $M\ge\|\gamma f(p)-Ap\|+\|p-Ap\|$ is a constant. By Lemma 2.6 we see that $\{x_n\}$ is bounded.
As a result, noticing that
$$x_{n+1}-Tx_n=-\alpha_nATx_n+\beta_n\gamma f(x_n)+(\alpha_n-\beta_n)x_n\tag{3.33}$$
and $\alpha_n\to 0$, we obtain
$$x_{n+1}-Tx_n\to 0.\tag{3.34}$$
But the key is to prove that
$$x_{n+1}-x_n\to 0.\tag{3.35}$$
To see this, we calculate
$$\begin{aligned}
\|x_{n+1}-x_n\|&=\bigl\|(I-\alpha_nA)(Tx_n-Tx_{n-1})-(\alpha_n-\alpha_{n-1})ATx_{n-1}+\gamma\bigl[\alpha_n\bigl(f(x_n)-f(x_{n-1})\bigr)+(\alpha_n-\alpha_{n-1})f(x_{n-1})\bigr]\\
&\qquad+(\alpha_n-\beta_n)(x_n-x_{n-1})\bigr\|\\
&\le\bigl[1-(\delta-\gamma h)\alpha_n\bigr]\|x_n-x_{n-1}\|+|\alpha_n-\alpha_{n-1}|\bigl(\|ATx_{n-1}\|+\gamma\|f(x_{n-1})\|\bigr)+(\alpha_n-\beta_n)\|x_n-x_{n-1}\|\\
&=\bigl(1+\alpha_n-\beta_n-(\delta-\gamma h)\alpha_n\bigr)\|x_n-x_{n-1}\|+|\alpha_n-\alpha_{n-1}|\bigl(\|ATx_{n-1}\|+\gamma\|f(x_{n-1})\|\bigr)\\
&=\bigl[1-\bigl((\delta-\gamma h-1)\alpha_n+\beta_n\bigr)\bigr]\|x_n-x_{n-1}\|+|\alpha_n-\alpha_{n-1}|\bigl(\|ATx_{n-1}\|+\gamma\|f(x_{n-1})\|\bigr).
\end{aligned}\tag{3.36}$$
Since
$$\sum_{n=0}^{\infty}\bigl[(\delta-\gamma h-1)\alpha_n+\beta_n\bigr]=\sum_{n=0}^{\infty}\bigl[(\delta-\gamma h)\alpha_n-(\alpha_n-\beta_n)\bigr]=\infty\tag{3.37}$$
and condition (C3) holds, an application of Lemma 2.1 to (3.36) yields (3.35), which, combined with (3.34), in turn implies
$$x_n-Tx_n\to 0.\tag{3.38}$$
Next we show that
$$\limsup_{n\to\infty}\langle Tx_n-\tilde{x},\gamma f(\tilde{x})-A\tilde{x}\rangle\le 0,\tag{3.39}$$
where $\tilde{x}$ is the point obtained in Theorem 3.2.
To see this, we take a subsequence $\{x_{n_k}\}$ of $\{x_n\}$ such that
$$\limsup_{n\to\infty}\langle x_n-\tilde{x},\gamma f(\tilde{x})-A\tilde{x}\rangle=\lim_{k\to\infty}\langle x_{n_k}-\tilde{x},\gamma f(\tilde{x})-A\tilde{x}\rangle.\tag{3.40}$$
We may also assume that $x_{n_k}\rightharpoonup z$. Note that $z\in F(T)$ in virtue of Lemma 2.2 and (3.38). It follows from the variational inequality (3.15) that
$$\limsup_{n\to\infty}\langle x_n-\tilde{x},\gamma f(\tilde{x})-A\tilde{x}\rangle=\langle z-\tilde{x},\gamma f(\tilde{x})-A\tilde{x}\rangle\le 0.\tag{3.41}$$
So (3.39) holds, thanks to (3.38).
Finally, we prove $x_n\to\tilde{x}$. To this end, we calculate
$$\begin{aligned}
\|x_{n+1}-\tilde{x}\|^2&=\bigl\|(I-\alpha_nA)(Tx_n-\tilde{x})+\alpha_n(\gamma f(x_n)-A\tilde{x})+(\alpha_n-\beta_n)\bigl(x_n-\gamma f(x_n)\bigr)\bigr\|^2\\
&=\|(I-\alpha_nA)(Tx_n-\tilde{x})\|^2+\alpha_n^2\|\gamma f(x_n)-A\tilde{x}\|^2+(\alpha_n-\beta_n)^2\|x_n-\gamma f(x_n)\|^2\\
&\quad+2\alpha_n\langle(I-\alpha_nA)(Tx_n-\tilde{x}),\gamma f(x_n)-A\tilde{x}\rangle+2(\alpha_n-\beta_n)\langle(I-\alpha_nA)(Tx_n-\tilde{x}),x_n-\gamma f(x_n)\rangle\\
&\quad+2\alpha_n(\alpha_n-\beta_n)\langle\gamma f(x_n)-A\tilde{x},x_n-\gamma f(x_n)\rangle\\
&\le(1-\alpha_n\delta)^2\|x_n-\tilde{x}\|^2+\alpha_n^2\|\gamma f(x_n)-A\tilde{x}\|^2+(\alpha_n-\beta_n)^2\|x_n-\gamma f(x_n)\|^2\\
&\quad+2\alpha_n\langle Tx_n-\tilde{x},\gamma f(x_n)-A\tilde{x}\rangle-2\alpha_n^2\langle A(Tx_n-\tilde{x}),\gamma f(x_n)-A\tilde{x}\rangle\\
&\quad+2(\alpha_n-\beta_n)\langle Tx_n-\tilde{x},x_n-\gamma f(x_n)\rangle-2\alpha_n(\alpha_n-\beta_n)\langle A(Tx_n-\tilde{x}),x_n-\gamma f(x_n)\rangle\\
&\quad+2\alpha_n(\alpha_n-\beta_n)\langle\gamma f(x_n)-A\tilde{x},x_n-\gamma f(x_n)\rangle.
\end{aligned}\tag{3.42}$$
Using the estimate
$$2\alpha_n\langle Tx_n-\tilde{x},\gamma f(x_n)-A\tilde{x}\rangle\le 2\alpha_n\gamma h\|x_n-\tilde{x}\|^2+2\alpha_n\langle Tx_n-\tilde{x},\gamma f(\tilde{x})-A\tilde{x}\rangle$$
(which follows from the contraction property of $f$ and the nonexpansivity of $T$) together with $(1-\alpha_n\delta)^2+2\alpha_n\gamma h=1-2(\delta-\gamma h)\alpha_n+\alpha_n^2\delta^2$, we arrive at
$$\|x_{n+1}-\tilde{x}\|^2\le\bigl[1-2(\delta-\gamma h)\alpha_n\bigr]\|x_n-\tilde{x}\|^2+2\alpha_n\langle Tx_n-\tilde{x},\gamma f(\tilde{x})-A\tilde{x}\rangle+\alpha_nM_n,$$
where
$$\begin{aligned}
M_n&=\alpha_n\delta^2\|x_n-\tilde{x}\|^2+\alpha_n\|\gamma f(x_n)-A\tilde{x}\|^2+\frac{(\alpha_n-\beta_n)^2}{\alpha_n}\|x_n-\gamma f(x_n)\|^2-2\alpha_n\langle A(Tx_n-\tilde{x}),\gamma f(x_n)-A\tilde{x}\rangle\\
&\quad+2\frac{\alpha_n-\beta_n}{\alpha_n}\langle Tx_n-\tilde{x},x_n-\gamma f(x_n)\rangle-2(\alpha_n-\beta_n)\langle A(Tx_n-\tilde{x}),x_n-\gamma f(x_n)\rangle\\
&\quad+2(\alpha_n-\beta_n)\langle\gamma f(x_n)-A\tilde{x},x_n-\gamma f(x_n)\rangle.
\end{aligned}\tag{3.43}$$
That is,
$$\|x_{n+1}-\tilde{x}\|^2\le\bigl[1-2(\delta-\gamma h)\alpha_n\bigr]\|x_n-\tilde{x}\|^2+\alpha_n\bigl[2\langle Tx_n-\tilde{x},\gamma f(\tilde{x})-A\tilde{x}\rangle+M_n\bigr].\tag{3.44}$$
Since $\{x_n\}$ is bounded, by the conditions of Theorem 3.4 we get $\lim_{n\to\infty}M_n=0$ and $\sum_{n=0}^{\infty}(\delta-\gamma h)\alpha_n=\infty$; this, together with (3.39), implies that
$$\limsup_{n\to\infty}\bigl[2\langle Tx_n-\tilde{x},\gamma f(\tilde{x})-A\tilde{x}\rangle+M_n\bigr]\le 0.\tag{3.45}$$
Now applying Lemma 2.1 to (3.44) concludes that $x_n\to\tilde{x}$. This completes the proof of Theorem 3.4.

If we pick $\alpha_n=\beta_n$, we obtain the result of Marino and Xu [2].
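For readers who want to experiment with (3.31), the following Python sketch is an illustration under assumed concrete choices (not part of the paper): $H=\mathbb{R}^2$, $T$ the projection onto the closed unit ball, $f$ a contraction with coefficient $h=1/2$, $\alpha_n=1/(n+2)$ and $\beta_n=\alpha_n-1/(n+2)^2$, which satisfy (C1)–(C4) with $0\le\beta_n\le\alpha_n$.

```python
import numpy as np

# Assumed example data: A symmetric positive definite with delta = lambda_min(A),
# T = projection onto the unit ball (nonexpansive), f a contraction with coefficient h,
# and gamma chosen in (0, delta/h).
A = np.array([[1.0, 0.2],
              [0.2, 0.8]])
delta = np.linalg.eigvalsh(A).min()      # strong positivity constant in (3.1)
h = 0.5
gamma = 0.9 * delta / h
c = np.array([2.0, -1.0])

def T(x):
    nx = np.linalg.norm(x)
    return x if nx <= 1.0 else x / nx    # metric projection onto the unit ball

def f(x):
    return h * (x - c)                   # contraction with coefficient h

def algorithm_3_31(x0, n_iter=20000):
    """Iteration (3.31): x_{n+1} = (I - a_n A) T x_n + b_n gamma f(x_n) + (a_n - b_n) x_n."""
    x = x0.copy()
    for n in range(n_iter):
        a_n = 1.0 / (n + 2)              # (C1)-(C3)
        b_n = a_n - 1.0 / (n + 2) ** 2   # 0 <= b_n <= a_n, sum(a_n - b_n) < infinity (C4)
        Tx = T(x)
        x = Tx - a_n * (A @ Tx) + b_n * gamma * f(x) + (a_n - b_n) * x
    return x

x_tilde = algorithm_3_31(np.array([3.0, 3.0]))
print(x_tilde)  # approximates the unique solution of the variational inequality (3.15) by Theorem 3.4
```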

4. Approximate Iteration Algorithm and Error Estimate

In this section, we use the following approximate iteration algorithm:
$$y_{n+1}=(I-sA)Ty_n+t\gamma f(y_n)+(s-t)y_n,\tag{4.1}$$
for an arbitrary initial $y_0\in H$, to calculate the fixed point of the nonexpansive mapping and the solution of the variational inequality with bounded linear operator $A$, where $A$, $T$, $\gamma$, $s$, $t$ and the constants $\delta$, $h$ are as in Section 3.

Meanwhile, $\tilde{x}\in F(T)$ is the point obtained in Theorem 3.2, which is the unique solution of the variational inequality (3.15), and $x_n\to\tilde{x}$ as $n\to\infty$, $x_{t,s}\to\tilde{x}$ as $s\to 0$, where $\{x_n\}$ and $x_{t,s}$ are defined by (3.31) and (3.6), respectively.

The following lemma will be useful for establishing the convergence rate estimate.

Lemma 4.1 (Banach's contraction mapping principle). Let $H$ be a Banach space and $S$ a contraction from $H$ into itself, that is,
$$\|Sx-Sy\|\le\theta\|x-y\|,\quad\forall x,y\in H,\tag{4.2}$$
where $0<\theta<1$ is a constant. Then the Picard iterative sequence $x_{n+1}=Sx_n$, for an arbitrary initial $x_0\in H$, converges strongly to the unique fixed point $x^*$ of $S$, and
$$\|x_n-x^*\|\le\frac{\theta^n}{1-\theta}\|x_0-Sx_0\|.\tag{4.3}$$
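A small numerical sanity check of the a priori bound (4.3) may be helpful; the sketch below is purely illustrative, and the particular affine contraction $S$ is an assumption for the example.

```python
import numpy as np

# Assumed example: S is an affine contraction on R^2 with Lipschitz constant theta < 1.
theta = 0.6
M = theta * np.array([[0.6, -0.8],
                      [0.8,  0.6]])      # orthogonal rotation scaled by theta, so ||Mx - My|| = theta ||x - y||
v = np.array([1.0, 2.0])

def S(x):
    return M @ x + v

x0 = np.array([10.0, -5.0])
x_star = np.linalg.solve(np.eye(2) - M, v)   # exact fixed point of S

x = x0.copy()
for n in range(1, 31):
    x = S(x)
    apriori = theta ** n / (1 - theta) * np.linalg.norm(x0 - S(x0))
    # The Picard iterate always satisfies the a priori estimate (4.3).
    assert np.linalg.norm(x - x_star) <= apriori + 1e-12
```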

For the above $T$, $A$, $f$, $\gamma$, $s$, $t$, $\delta$, we define the mapping
$$S_{t,s}y=(I-sA)Ty+t\gamma f(y)+(s-t)y\tag{4.4}$$
from $H$ into itself. In fact, it is not hard to see that $S_{t,s}$ is a contraction for sufficiently small $s$; indeed, by Lemma 2.5 we have, for any $x,y\in H$, that
$$\|S_{t,s}x-S_{t,s}y\|\le t\gamma\|f(x)-f(y)\|+\|(I-sA)(Tx-Ty)\|+\|(s-t)(x-y)\|\le(t\gamma h+1-s\delta+s-t)\|x-y\|=(1+t(\gamma h-1)-s(\delta-1))\|x-y\|.\tag{4.5}$$
By Lemma 4.1, there exists a unique fixed point $x_{t,s}\in H$ of $S_{t,s}$, and the iterative sequence
$$y_{n+1}=S_{t,s}y_n=(I-sA)Ty_n+t\gamma f(y_n)+(s-t)y_n,\quad y_0\in H,\tag{4.6}$$
converges strongly to this fixed point $x_{t,s}$. Meanwhile, from (4.3) and (4.5) we obtain
$$\|y_n-x_{t,s}\|\le\frac{(1+t(\gamma h-1)-s(\delta-1))^n}{s(\delta-1)-t(\gamma h-1)}\|y_0-S_{t,s}y_0\|.\tag{4.7}$$
On the other hand, from (3.21) we have
$$\|x_{t,s}-\tilde{x}\|^2\le\frac{t}{s\delta-t\gamma h}\langle\gamma f(\tilde{x})-A\tilde{x},x_{t,s}-\tilde{x}\rangle+\frac{s-t}{s\delta-t\gamma h}\bigl(\|x_{t,s}-\tilde{x}\|^2+\langle\tilde{x}-A\tilde{x},x_{t,s}-\tilde{x}\rangle\bigr),\tag{4.8}$$
which leads to
$$\Bigl(1-\frac{s-t}{s\delta-t\gamma h}\Bigr)\|x_{t,s}-\tilde{x}\|^2\le\frac{t}{s\delta-t\gamma h}\langle\gamma f(\tilde{x})-A\tilde{x},x_{t,s}-\tilde{x}\rangle+\frac{s-t}{s\delta-t\gamma h}\langle\tilde{x}-A\tilde{x},x_{t,s}-\tilde{x}\rangle.\tag{4.9}$$
Therefore,
$$\Bigl(1-\frac{s-t}{s\delta-t\gamma h}\Bigr)\|x_{t,s}-\tilde{x}\|\le\frac{t}{s\delta-t\gamma h}\|\gamma f(\tilde{x})-A\tilde{x}\|+\frac{s-t}{s\delta-t\gamma h}\|\tilde{x}-A\tilde{x}\|,$$
that is,
$$\|x_{t,s}-\tilde{x}\|\le\frac{t}{s\delta-t\gamma h+t-s}\|\gamma f(\tilde{x})-A\tilde{x}\|+\frac{s-t}{s\delta-t\gamma h+t-s}\|\tilde{x}-A\tilde{x}\|.\tag{4.10}$$
Letting $D_1=\|\gamma f(\tilde{x})-A\tilde{x}\|$ and $D_2=\|\tilde{x}-A\tilde{x}\|$, it follows that
$$\|x_{t,s}-\tilde{x}\|\le\frac{t}{s\delta-t\gamma h+t-s}D_1+\frac{s-t}{s\delta-t\gamma h+t-s}D_2.\tag{4.11}$$
From inequality (4.11) together with (4.7), and letting $D_3=\|y_0-S_{t,s}y_0\|$, we get
$$\|y_n-\tilde{x}\|\le\frac{t}{s\delta-t\gamma h+t-s}D_1+\frac{s-t}{s\delta-t\gamma h+t-s}D_2+\frac{(1+t(\gamma h-1)-s(\delta-1))^n}{s(\delta-1)-t(\gamma h-1)}D_3.\tag{4.12}$$
Inequality (4.12) is the error estimate for the approximate fixed point $y_n$. We now give several special cases of inequality (4.12).

Error Estimate 1
We have
$$\limsup_{n\to\infty}\|y_n-\tilde{x}\|\le\frac{t}{s\delta-t\gamma h+t-s}D_1+\frac{s-t}{s\delta-t\gamma h+t-s}D_2.\tag{4.13}$$

Error Estimate 2
If $t=s$, then
$$\|y_n-\tilde{x}\|\le\frac{1}{\delta-\gamma h}D_1+\frac{(1-s(\delta-\gamma h))^n}{s(\delta-\gamma h)}D_3,\tag{4.14}$$
which can be used to estimate the error for the iterative scheme
$$y_{n+1}=(I-sA)Ty_n+s\gamma f(y_n),\quad y_0\in H.\tag{4.15}$$

Error Estimate 3
If $A=I$, then
$$\|y_n-\tilde{x}\|\le\frac{t}{s\delta-t\gamma h+t-s}D_1+\frac{(1+t(\gamma h-1)-s(\delta-1))^n}{s(\delta-1)-t(\gamma h-1)}D_3,\tag{4.16}$$
which can be used to estimate the error for the iterative scheme
$$y_{n+1}=(1-s)Ty_n+t\gamma f(y_n)+(s-t)y_n,\quad y_0\in H.\tag{4.17}$$
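The special case (4.14)–(4.15) with $t=s$ is easy to check numerically. The Python sketch below is illustrative only; the concrete $A$, $T$, $f$ and the step $s$ are assumptions for the example. It runs scheme (4.15) and evaluates the computable ingredients of the bound (4.14).

```python
import numpy as np

# Assumed data in the setting of Section 3: A symmetric positive definite, T nonexpansive, f a contraction.
A = np.array([[1.0, 0.2],
              [0.2, 0.8]])
delta = np.linalg.eigvalsh(A).min()
h = 0.5
gamma = 0.9 * delta / h
c = np.array([2.0, -1.0])
s = 0.1                                   # fixed step with s < 1/||A||

def T(x):
    nx = np.linalg.norm(x)
    return x if nx <= 1.0 else x / nx

def f(x):
    return h * (x - c)

def scheme_4_15(y0, n_iter):
    """y_{n+1} = (I - sA) T y_n + s gamma f(y_n), i.e. (4.1) with t = s."""
    y = y0.copy()
    for _ in range(n_iter):
        Ty = T(y)
        y = Ty - s * (A @ Ty) + s * gamma * f(y)
    return y

y0 = np.array([3.0, 3.0])
yn = scheme_4_15(y0, n_iter=200)

# Ingredients of the error estimate (4.14):
theta = 1.0 - s * (delta - gamma * h)                 # contraction factor of S_{s,s}
D3 = np.linalg.norm(y0 - scheme_4_15(y0, n_iter=1))   # ||y_0 - S_{s,s} y_0||
picard_term = theta ** 200 / (s * (delta - gamma * h)) * D3
print(yn, picard_term)  # by (4.14): ||y_200 - x_tilde|| <= D1 / (delta - gamma*h) + picard_term
```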

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grant no. 11071279.

References

  1. F. Deutsch and I. Yamada, "Minimizing certain convex functions over the intersection of the fixed point sets of nonexpansive mappings," Numerical Functional Analysis and Optimization, vol. 19, no. 1-2, pp. 33–56, 1998.
  2. G. Marino and H.-K. Xu, "A general iterative method for nonexpansive mappings in Hilbert spaces," Journal of Mathematical Analysis and Applications, vol. 318, no. 1, pp. 43–52, 2006.
  3. H.-K. Xu, "Iterative algorithms for nonlinear operators," Journal of the London Mathematical Society, vol. 66, no. 1, pp. 240–256, 2002.
  4. H.-K. Xu, "An iterative approach to quadratic optimization," Journal of Optimization Theory and Applications, vol. 116, no. 3, pp. 659–678, 2003.
  5. K. Goebel and W. A. Kirk, Topics in Metric Fixed Point Theory, vol. 28 of Cambridge Studies in Advanced Mathematics, Cambridge University Press, Cambridge, UK, 1990.
  6. A. Moudafi, "Viscosity approximation methods for fixed-points problems," Journal of Mathematical Analysis and Applications, vol. 241, no. 1, pp. 46–55, 2000.
  7. H.-K. Xu, "Viscosity approximation methods for nonexpansive mappings," Journal of Mathematical Analysis and Applications, vol. 298, no. 1, pp. 279–291, 2004.