Abstract

The robust stochastic stability of a class of uncertain neutral-type delayed neural networks driven by a Wiener process is investigated. By utilizing a Lyapunov-Krasovskii functional and inequality techniques, sufficient criteria are presented in terms of linear matrix inequalities (LMIs) to ensure the stability of the system. A numerical example is given to illustrate the applicability of the results.

1. Introduction

In the past few years, neural networks and their various generalizations have drawn much research attention owing to their promising potential applications in a variety of areas, such as robotics, aerospace, telecommunications, pattern recognition, image processing, associative memory, signal processing, and combinatorial optimization [1-3]. In such applications, it is of prime importance to ensure the asymptotic stability of the designed neural networks. Because of this, the stability of neural networks has been extensively investigated in the literature [4-14].

It is known that time delays and stochastic perturbations are commonly encountered in the implementation of neural networks and may result in instability or oscillation, so it is essential to investigate the stability of delayed stochastic neural networks [15, 16]. Moreover, uncertainties are unavoidable in the practical implementation of neural networks due to modeling errors and parameter fluctuations, which can also cause instability and poor performance [15, 17, 18]. Therefore, it is important to take such uncertainties into account when studying delayed stochastic neural networks.

On the other hand, because of the complicated dynamic properties of neural cells in the real world, it is natural and important to include information about the derivative of the past state in the system model. In practice, such phenomena appear in the study of automatic control, circuit analysis, chemical process simulation, population dynamics, and so forth. Recently, there has been increasing interest in the study of delayed neural networks of neutral type; see [6-15, 18-24]. In [6, 8], the authors developed the global asymptotic stability of neutral-type neural networks with delays by utilizing Lyapunov stability theory and the LMI technique. In [9, 10], the global exponential stability of neutral-type neural networks with distributed delays was studied. However, stochastic perturbations were not taken into account in those delayed neural networks [6-10].

In [23, 24], the authors discussed the robust stability of uncertain stochastic neural networks of neutral type with time-varying delays. However, distributed delays were not taken into account in those models. So far, only a few papers deal with the stochastic stability analysis of delayed neural networks of neutral type while also considering parameter uncertainties.

To the best of our knowledge, there are very few results on the stochastic stability analysis of uncertain neutral-type neural networks with both discrete and distributed delays driven by a Wiener process. This motivates the research in this paper.

In this paper, a class of uncertain neutral-type delayed neural networks driven by a Wiener process is considered. By constructing a suitable Lyapunov functional, some new stability criteria are given that guarantee the system to be stochastically asymptotically stable in the mean square and that are less conservative than some existing results. The structure of the addressed system is more general than those considered in related papers. The criteria can be checked easily with the MATLAB LMI Control Toolbox. Moreover, a numerical example is given to illustrate the effectiveness of the results and the improvement over some existing ones.

2. Preliminaries

Notations. $\mathbf{A}<0$ denotes that $\mathbf{A}$ is a negative definite matrix. The superscript "$T$" stands for the transpose of a matrix. $(\Omega,\mathcal{F},P)$ denotes a complete probability space, and $E(\cdot)$ stands for the mathematical expectation operator. $\|\cdot\|$ stands for the Euclidean norm. $\mathbf{I}$ is the identity matrix of appropriate dimension, and the symmetric terms in a symmetric matrix are denoted by $*$.

Consider the following class of uncertain neutral-type delayed neural networks driven by a Wiener process:
\[
\begin{aligned}
\mathrm{d}\bigl[\mathbf{x}(t)-\mathbf{C}\mathbf{x}(t-h(t))\bigr]
={}&\Bigl[-\mathbf{A}(t)\mathbf{x}(t)+\mathbf{B}(t)\mathbf{f}(\mathbf{x}(t-\tau(t)))
+\mathbf{D}(t)\int_{t-r(t)}^{t}\mathbf{f}(\mathbf{x}(s))\,\mathrm{d}s\Bigr]\mathrm{d}t\\
&+\bigl[\mathbf{H}_0(t)\mathbf{x}(t)+\mathbf{H}_1(t)\mathbf{x}(t-\tau(t))\bigr]\,\mathrm{d}\mathbf{w}(t),\qquad t\ge 0,\\
\mathbf{x}(t_0+s)={}&\varphi(s),\qquad s\in[-\rho,0],
\end{aligned}
\tag{2.1}
\]
where $\mathbf{x}=(x_1,x_2,\ldots,x_n)^T$ is the neuron state vector; $\mathbf{A}(t)=\mathbf{A}+\Delta\mathbf{A}(t)$, $\mathbf{B}(t)=\mathbf{B}+\Delta\mathbf{B}(t)$, $\mathbf{D}(t)=\mathbf{D}+\Delta\mathbf{D}(t)$, $\mathbf{H}_0(t)=\mathbf{H}_0+\Delta\mathbf{H}_0(t)$, $\mathbf{H}_1(t)=\mathbf{H}_1+\Delta\mathbf{H}_1(t)$; $\mathbf{A}=\mathrm{diag}(a_i)_{n\times n}$ is a positive diagonal matrix; $\mathbf{B},\mathbf{C},\mathbf{D}\in\mathbf{R}^{n\times n}$ are the connection weight matrices; $\mathbf{H}_0,\mathbf{H}_1\in\mathbf{R}^{n\times n}$ are known real constant matrices; and $\Delta\mathbf{A}(t),\Delta\mathbf{B}(t),\Delta\mathbf{D}(t),\Delta\mathbf{H}_0(t),\Delta\mathbf{H}_1(t)$ represent the time-varying parameter uncertainties. $\mathbf{f}(\mathbf{x})=(f_1(x_1),f_2(x_2),\ldots,f_n(x_n))^T$ is the neuron activation function with $\mathbf{f}(0)=0$. $\mathbf{w}(t)=(w_1(t),w_2(t),\ldots,w_n(t))^T$ is an $n$-dimensional Wiener process defined on a complete probability space $(\Omega,\mathcal{F},P)$. $r(t)$, $\tau(t)$, $h(t)$ are nonnegative, bounded, and differentiable time-varying delays satisfying
\[
0<r(t)\le r<\infty,\quad 0<\tau(t)\le\tau<\infty,\quad \dot{\tau}(t)\le\eta_1<\infty,\quad
0<h(t)\le h<\infty,\quad \dot{h}(t)\le\eta_2<\infty.
\tag{2.2}
\]
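As a complement to the model description (this illustration is not part of the paper), the behavior of a system of the form (2.1) can be explored numerically with an Euler-Maruyama scheme. The sketch below simulates a simplified scalar instance with constant delays, no parameter uncertainty, and arbitrary illustrative coefficients (`a`, `b`, `c`, `d_`, `h0`, `h1`, the delays, and the initial history are all assumptions, not values from the paper); the discrete, neutral, and distributed delay terms are handled with a history buffer.

```python
import numpy as np

# Hedged sketch (assumed parameters): Euler-Maruyama for a scalar instance of
#   d[x(t) - c*x(t-h)] = [-a*x(t) + b*f(x(t-tau)) + d_*I(t)] dt
#                        + [h0*x(t) + h1*x(t-tau)] dw(t),
# where I(t) = int_{t-r}^{t} f(x(s)) ds and f = tanh stands in for the
# bounded activation of (2.1). Constant delays replace r(t), tau(t), h(t).
rng = np.random.default_rng(0)
dt = 0.01
a, b, c, d_, h0, h1 = 3.0, 0.2, 0.2, 0.05, 0.1, 0.1
tau, h, r = 0.5, 0.3, 1.0
n_tau, n_h, n_r = round(tau / dt), round(h / dt), round(r / dt)
steps = 2000
x = np.ones(steps + n_r + 1)        # history buffer; phi(s) = 1 on [-rho, 0]
f = np.tanh
for k in range(n_r, n_r + steps):
    I = np.sum(f(x[k - n_r:k])) * dt                 # distributed-delay term
    drift = -a * x[k] + b * f(x[k - n_tau]) + d_ * I
    diff = h0 * x[k] + h1 * x[k - n_tau]             # diffusion coefficient
    dw = rng.normal(0.0, np.sqrt(dt))
    # advance the neutral variable y(t) = x(t) - c*x(t-h), then recover x
    y_next = (x[k] - c * x[k - n_h]) + drift * dt + diff * dw
    x[k + 1] = y_next + c * x[k + 1 - n_h]
print(x[-1])
```

With these stable parameters the trajectory decays from the constant initial history toward a small noise-driven neighborhood of the origin, consistent with the mean-square stability notion studied below.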

The admissible parameter uncertainties are assumed to be of the following form:
\[
\bigl[\Delta\mathbf{A}(t),\ \Delta\mathbf{B}(t),\ \Delta\mathbf{D}(t),\ \Delta\mathbf{H}_0(t),\ \Delta\mathbf{H}_1(t)\bigr]
=\mathbf{U}\mathbf{F}(t)\bigl[\mathbf{M}_1,\ \mathbf{M}_2,\ \mathbf{M}_3,\ \mathbf{M}_4,\ \mathbf{M}_5\bigr],
\tag{2.3}
\]
where $\mathbf{U},\mathbf{M}_i$, $i=1,\ldots,5$, are known real constant matrices and $\mathbf{F}(t)$ is a time-varying uncertain matrix satisfying
\[
\mathbf{F}^T(t)\mathbf{F}(t)\le\mathbf{I}.
\tag{2.4}
\]

Suppose that $\mathbf{f}(\cdot)$ is bounded and satisfies the following condition:
\[
\|\mathbf{f}(\mathbf{x})\|\le\|\mathbf{G}\mathbf{x}\|,
\tag{2.5}
\]
where $\mathbf{G}\in\mathbf{R}^{n\times n}$ is a known constant matrix.
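Condition (2.5) can be checked numerically for a concrete activation. The sketch below (an addition, not from the paper) uses $f(x)=\tanh(x)$ componentwise with $\mathbf{G}=\mathbf{I}$; since $\tanh$ is 1-Lipschitz with $\tanh(0)=0$, the bound $\|\mathbf{f}(\mathbf{x})\|\le\|\mathbf{G}\mathbf{x}\|$ holds exactly.

```python
import numpy as np

# Hedged check of condition (2.5) for the sample activation f = tanh and
# G = I (these choices are assumptions for illustration only).
rng = np.random.default_rng(1)
G = np.eye(3)
ok = True
for _ in range(1000):
    x = rng.normal(size=3) * 5.0
    # |tanh(t)| <= |t| componentwise implies the Euclidean-norm bound
    ok &= np.linalg.norm(np.tanh(x)) <= np.linalg.norm(G @ x) + 1e-12
print(ok)
```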

Assume that the initial value $\varphi:[-\rho,0]\to\mathbf{R}^n$ is $\mathcal{F}_0$-measurable and continuously differentiable. We introduce the following norm:
\[
\|\varphi\|_\rho^2=\max\Bigl\{\sup_{-\alpha\le s\le 0}E\|\varphi(s)\|^2,\ \sup_{-h\le s\le 0}E\|\dot{\varphi}(s)\|^2\Bigr\}<\infty,
\tag{2.6}
\]
where $\rho=\max\{\tau,h,r\}$ and $\alpha=\max\{\tau,r\}$.

Under the above assumptions, it is easy to verify that there exists a unique equilibrium point of system (2.1) (see [25]).

Definition 2.1. The equilibrium point of (2.1) is said to be globally robustly stochastically asymptotically stable in the mean square if the following condition holds:
\[
\lim_{t\to+\infty}E\|\mathbf{x}(t;t_0,\varphi)\|^2=0,\qquad t\ge t_0,
\tag{2.7}
\]
where $\mathbf{x}(t;t_0,\varphi)$ is any solution of model (2.1) with initial value $\varphi$.

Lemma 2.2 (Schur complement [26]). Given constant matrices $\Omega_1,\Omega_2,\Omega_3$ with appropriate dimensions, where $\Omega_1^T=\Omega_1$ and $\Omega_2^T=\Omega_2>0$, then
\[
\Omega_1+\Omega_3^T\Omega_2^{-1}\Omega_3<0
\tag{2.8}
\]
if and only if
\[
\begin{pmatrix}\Omega_1&\Omega_3^T\\ \Omega_3&-\Omega_2\end{pmatrix}<0
\quad\text{or}\quad
\begin{pmatrix}-\Omega_2&\Omega_3\\ \Omega_3^T&\Omega_1\end{pmatrix}<0.
\tag{2.9}
\]
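The Schur complement equivalence of Lemma 2.2 can be illustrated numerically (this check is an addition, not part of [26]): for random instances, negative definiteness of $\Omega_1+\Omega_3^T\Omega_2^{-1}\Omega_3$ agrees with negative definiteness of the augmented block matrix.

```python
import numpy as np

# Hedged numerical illustration of the Schur complement lemma on random data.
rng = np.random.default_rng(2)

def is_neg_def(M):
    # M is symmetric by construction, so eigvalsh applies
    return np.max(np.linalg.eigvalsh(M)) < 0

agree = True
for _ in range(200):
    n = 3
    A = rng.normal(size=(n, n))
    O1 = (A + A.T) - 4 * np.eye(n)        # symmetric (often negative definite)
    B = rng.normal(size=(n, n))
    O2 = B @ B.T + np.eye(n)              # symmetric positive definite
    O3 = rng.normal(size=(n, n))
    lhs = is_neg_def(O1 + O3.T @ np.linalg.inv(O2) @ O3)
    big = np.block([[O1, O3.T], [O3, -O2]])
    agree &= (lhs == is_neg_def(big))
print(agree)
```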

Lemma 2.3 (see [26]). Given matrices $\mathbf{D},\mathbf{E}$, and $\mathbf{F}$ with $\mathbf{F}^T\mathbf{F}\le\mathbf{I}$ and a scalar $\varepsilon>0$, then
\[
\mathbf{D}\mathbf{F}\mathbf{E}+\mathbf{E}^T\mathbf{F}^T\mathbf{D}^T
\le\varepsilon\mathbf{D}\mathbf{D}^T+\varepsilon^{-1}\mathbf{E}^T\mathbf{E}.
\tag{2.10}
\]
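Lemma 2.3 can likewise be spot-checked numerically (an addition, not from [26]): for random $\mathbf{D},\mathbf{E}$, any $\mathbf{F}$ with $\mathbf{F}^T\mathbf{F}\le\mathbf{I}$, and any $\varepsilon>0$, the gap matrix $\varepsilon\mathbf{D}\mathbf{D}^T+\varepsilon^{-1}\mathbf{E}^T\mathbf{E}-(\mathbf{D}\mathbf{F}\mathbf{E}+\mathbf{E}^T\mathbf{F}^T\mathbf{D}^T)$ should be positive semidefinite.

```python
import numpy as np

# Hedged numerical check of Lemma 2.3 on random matrices.
rng = np.random.default_rng(3)
ok = True
for _ in range(200):
    n = 3
    D = rng.normal(size=(n, n))
    E = rng.normal(size=(n, n))
    F = rng.normal(size=(n, n))
    F = F / max(1.0, np.linalg.norm(F, 2))   # enforce F^T F <= I (spectral norm <= 1)
    eps = float(rng.uniform(0.1, 5.0))
    gap = eps * D @ D.T + (1.0 / eps) * E.T @ E - (D @ F @ E + E.T @ F.T @ D.T)
    ok &= np.min(np.linalg.eigvalsh(gap)) >= -1e-9   # PSD up to rounding
print(ok)
```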

Lemma 2.4 (see [27]). For any constant matrix $\mathbf{M}\in\mathbf{R}^{n\times n}$ with $\mathbf{M}=\mathbf{M}^T>0$, a scalar $\gamma>0$, and a vector function $x:[0,\gamma]\to\mathbf{R}^n$ such that the integrals below are well defined,
\[
\Bigl(\int_0^\gamma x(s)\,\mathrm{d}s\Bigr)^T\mathbf{M}\Bigl(\int_0^\gamma x(s)\,\mathrm{d}s\Bigr)
\le\gamma\int_0^\gamma x^T(s)\mathbf{M}x(s)\,\mathrm{d}s.
\tag{2.11}
\]
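A discrete analogue of Lemma 2.4 (Jensen's inequality) can be checked by replacing the integrals with Riemann sums (this check is an addition, not from [27]); the sampled function and the positive definite $\mathbf{M}$ below are arbitrary choices for illustration.

```python
import numpy as np

# Hedged discrete check of Jensen's integral inequality (Lemma 2.4):
# with x(s) sampled on a grid of step dt, the Riemann-sum analogue
#   (sum x dt)^T M (sum x dt) <= gamma * sum x^T M x dt
# holds exactly for M > 0, since m*dt = gamma.
rng = np.random.default_rng(4)
n, gamma, m = 2, 1.5, 1500
dt = gamma / m
A = rng.normal(size=(n, n))
M = A @ A.T + np.eye(n)                       # symmetric positive definite
xs = np.cos(np.linspace(0.0, 3.0, m))[:, None] * rng.normal(size=(1, n))
ix = xs.sum(axis=0) * dt                      # approximates int_0^gamma x(s) ds
lhs = ix @ M @ ix
rhs = gamma * np.sum(xs @ M * xs) * dt        # gamma * int_0^gamma x^T M x ds
print(lhs <= rhs + 1e-9)
```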

3. Main Results

Theorem 3.1. System (2.1) is globally robustly stochastically asymptotically stable in the mean square if there exist symmetric positive definite matrices $\mathbf{P},\mathbf{Q},\mathbf{R},\mathbf{S},\mathbf{U}_1,\mathbf{U}_2$ and positive scalars $\delta,\varepsilon_1,\varepsilon_2>0$ such that the following LMI holds:
\[
\Lambda=\begin{pmatrix}
\Gamma_1 & \mathbf{A}^T\mathbf{P}\mathbf{C} & \mathbf{P}\mathbf{B}-\varepsilon_1\mathbf{M}_1^T\mathbf{M}_2 & \mathbf{P}\mathbf{D}-\varepsilon_1\mathbf{M}_1^T\mathbf{M}_3 & \varepsilon_2\mathbf{M}_4^T\mathbf{M}_5 & \mathbf{H}_0^T\mathbf{P} & \mathbf{P}\mathbf{U} & 0\\
* & \Gamma_2 & -\mathbf{C}^T\mathbf{P}\mathbf{B} & -\mathbf{C}^T\mathbf{P}\mathbf{D} & 0 & 0 & -\mathbf{C}^T\mathbf{P}\mathbf{U} & 0\\
* & * & \Gamma_3 & \varepsilon_1\mathbf{M}_2^T\mathbf{M}_3 & 0 & 0 & 0 & 0\\
* & * & * & \Gamma_4 & 0 & 0 & 0 & 0\\
* & * & * & * & \Gamma_5 & \mathbf{H}_1^T\mathbf{P} & 0 & 0\\
* & * & * & * & * & -\mathbf{P} & 0 & \mathbf{P}\mathbf{U}\\
* & * & * & * & * & * & -\varepsilon_1\mathbf{I} & 0\\
* & * & * & * & * & * & * & -\varepsilon_2\mathbf{I}
\end{pmatrix}<0,
\tag{3.1}
\]
where $\Gamma_1=-\mathbf{P}\mathbf{A}-\mathbf{A}^T\mathbf{P}+\mathbf{Q}+\mathbf{R}+r\mathbf{G}^T\mathbf{S}\mathbf{G}+\varepsilon_1\mathbf{M}_1^T\mathbf{M}_1+\varepsilon_2\mathbf{M}_4^T\mathbf{M}_4$, $\Gamma_2=-\mathbf{U}_1-(1-\eta_2)\mathbf{R}$, $\Gamma_3=-\delta\mathbf{I}+\varepsilon_1\mathbf{M}_2^T\mathbf{M}_2$, $\Gamma_4=-r^{-1}\mathbf{S}+\varepsilon_1\mathbf{M}_3^T\mathbf{M}_3$, $\Gamma_5=-\mathbf{U}_2-(1-\eta_1)\mathbf{Q}+\delta\mathbf{G}^T\mathbf{G}+\varepsilon_2\mathbf{M}_5^T\mathbf{M}_5$.

Proof. Using Lemma 2.2, the matrix inequality $\Lambda<0$ implies that
\[
\Pi+\varepsilon_1\mathbf{E}_1^T\mathbf{E}_1+\varepsilon_2\mathbf{E}_2^T\mathbf{E}_2
+\varepsilon_1^{-1}\mathbf{D}_1\mathbf{D}_1^T+\varepsilon_2^{-1}\mathbf{D}_2\mathbf{D}_2^T<0,
\tag{3.2}
\]
where
\[
\Pi=\begin{pmatrix}
\Phi_1 & \mathbf{A}^T\mathbf{P}\mathbf{C} & \mathbf{P}\mathbf{B} & \mathbf{P}\mathbf{D} & 0 & \mathbf{H}_0^T\mathbf{P}\\
* & \Gamma_2 & -\mathbf{C}^T\mathbf{P}\mathbf{B} & -\mathbf{C}^T\mathbf{P}\mathbf{D} & 0 & 0\\
* & * & -\delta\mathbf{I} & 0 & 0 & 0\\
* & * & * & -r^{-1}\mathbf{S} & 0 & 0\\
* & * & * & * & \Phi_2 & \mathbf{H}_1^T\mathbf{P}\\
* & * & * & * & * & -\mathbf{P}
\end{pmatrix},
\]
\[
\mathbf{D}_1=\begin{pmatrix}\mathbf{P}\mathbf{U}\\-\mathbf{C}^T\mathbf{P}\mathbf{U}\\0\\0\\0\\0\end{pmatrix},\quad
\mathbf{D}_2=\begin{pmatrix}0\\0\\0\\0\\0\\\mathbf{P}\mathbf{U}\end{pmatrix},\quad
\mathbf{E}_1=\bigl(-\mathbf{M}_1,\ 0,\ \mathbf{M}_2,\ \mathbf{M}_3,\ 0,\ 0\bigr),\quad
\mathbf{E}_2=\bigl(\mathbf{M}_4,\ 0,\ 0,\ 0,\ \mathbf{M}_5,\ 0\bigr),
\]
$\Phi_1=-\mathbf{P}\mathbf{A}-\mathbf{A}^T\mathbf{P}+\mathbf{Q}+\mathbf{R}+r\mathbf{G}^T\mathbf{S}\mathbf{G}$, and $\Phi_2=-\mathbf{U}_2-(1-\eta_1)\mathbf{Q}+\delta\mathbf{G}^T\mathbf{G}$.
From (2.3) and (2.4), using Lemma 2.3, we have
\[
\mathbf{D}_1\mathbf{F}(t)\mathbf{E}_1+\mathbf{E}_1^T\mathbf{F}^T(t)\mathbf{D}_1^T
+\mathbf{D}_2\mathbf{F}(t)\mathbf{E}_2+\mathbf{E}_2^T\mathbf{F}^T(t)\mathbf{D}_2^T
\le\varepsilon_1^{-1}\mathbf{D}_1\mathbf{D}_1^T+\varepsilon_1\mathbf{E}_1^T\mathbf{E}_1
+\varepsilon_2^{-1}\mathbf{D}_2\mathbf{D}_2^T+\varepsilon_2\mathbf{E}_2^T\mathbf{E}_2,
\tag{3.3}
\]
where the left-hand side collects the uncertain terms $-\mathbf{P}\Delta\mathbf{A}(t)-\Delta\mathbf{A}^T(t)\mathbf{P}$, $\Delta\mathbf{A}^T(t)\mathbf{P}\mathbf{C}$, $\mathbf{P}\Delta\mathbf{B}(t)$, $\mathbf{P}\Delta\mathbf{D}(t)$, $-\mathbf{C}^T\mathbf{P}\Delta\mathbf{B}(t)$, $-\mathbf{C}^T\mathbf{P}\Delta\mathbf{D}(t)$, $\Delta\mathbf{H}_0^T(t)\mathbf{P}$, and $\Delta\mathbf{H}_1^T(t)\mathbf{P}$ in their respective block positions. Together with (3.2), we get
\[
\begin{pmatrix}
\Psi & \mathbf{A}^T(t)\mathbf{P}\mathbf{C} & \mathbf{P}\mathbf{B}(t) & \mathbf{P}\mathbf{D}(t) & 0 & \mathbf{H}_0^T(t)\mathbf{P}\\
* & \Gamma_2 & -\mathbf{C}^T\mathbf{P}\mathbf{B}(t) & -\mathbf{C}^T\mathbf{P}\mathbf{D}(t) & 0 & 0\\
* & * & -\delta\mathbf{I} & 0 & 0 & 0\\
* & * & * & -r^{-1}\mathbf{S} & 0 & 0\\
* & * & * & * & \Phi_2 & \mathbf{H}_1^T(t)\mathbf{P}\\
* & * & * & * & * & -\mathbf{P}
\end{pmatrix}<0,
\tag{3.4}
\]
where $\Psi=-\mathbf{P}\mathbf{A}(t)-\mathbf{A}^T(t)\mathbf{P}+\mathbf{Q}+\mathbf{R}+r\mathbf{G}^T\mathbf{S}\mathbf{G}$.
Utilizing Lemma 2.2 again, we obtain
\[
\Sigma=\begin{pmatrix}
\Psi & \mathbf{A}^T(t)\mathbf{P}\mathbf{C} & \mathbf{P}\mathbf{B}(t) & \mathbf{P}\mathbf{D}(t) & 0\\
* & \Gamma_2 & -\mathbf{C}^T\mathbf{P}\mathbf{B}(t) & -\mathbf{C}^T\mathbf{P}\mathbf{D}(t) & 0\\
* & * & -\delta\mathbf{I} & 0 & 0\\
* & * & * & -r^{-1}\mathbf{S} & 0\\
* & * & * & * & \Phi_2
\end{pmatrix}
+\begin{pmatrix}\mathbf{H}_0^T(t)\\0\\0\\0\\\mathbf{H}_1^T(t)\end{pmatrix}
\mathbf{P}
\begin{pmatrix}\mathbf{H}_0^T(t)\\0\\0\\0\\\mathbf{H}_1^T(t)\end{pmatrix}^T<0.
\tag{3.5}
\]
Construct a positive definite Lyapunov-Krasovskii functional as follows:
\[
\begin{aligned}
V(t,\mathbf{x}(t))={}&\mathbf{y}^T(t)\mathbf{P}\mathbf{y}(t)
+\int_{t-\tau(t)}^{t}\mathbf{x}^T(s)\mathbf{Q}\mathbf{x}(s)\,\mathrm{d}s
+\int_{t-h(t)}^{t}\mathbf{x}^T(s)\mathbf{R}\mathbf{x}(s)\,\mathrm{d}s\\
&+\int_{-r(t)}^{0}\int_{t+\theta}^{t}\mathbf{f}^T(\mathbf{x}(s))\mathbf{S}\mathbf{f}(\mathbf{x}(s))\,\mathrm{d}s\,\mathrm{d}\theta
+\int_{t}^{T}\mathbf{x}^T(s-h(s))\mathbf{U}_1\mathbf{x}(s-h(s))\,\mathrm{d}s\\
&+\int_{t}^{T}\mathbf{x}^T(s-\tau(s))\mathbf{U}_2\mathbf{x}(s-\tau(s))\,\mathrm{d}s,
\end{aligned}
\tag{3.6}
\]
where $\mathbf{y}(t)=\mathbf{x}(t)-\mathbf{C}\mathbf{x}(t-h(t))$ and $T>t$ is a constant.
By Itô's differential formula, and using the delay-derivative bounds $\dot{\tau}(t)\le\eta_1$ and $\dot{h}(t)\le\eta_2$ from (2.2), we get
\[
\begin{aligned}
\mathrm{d}V(t,\mathbf{x}(t))\le{}&\Bigl\{2\mathbf{y}^T(t)\mathbf{P}\Bigl[-\mathbf{A}(t)\mathbf{x}(t)+\mathbf{B}(t)\mathbf{f}(\mathbf{x}(t-\tau(t)))+\mathbf{D}(t)\int_{t-r(t)}^{t}\mathbf{f}(\mathbf{x}(s))\,\mathrm{d}s\Bigr]\\
&+\mathbf{x}^T(t)\mathbf{Q}\mathbf{x}(t)-(1-\eta_1)\mathbf{x}^T(t-\tau(t))\mathbf{Q}\mathbf{x}(t-\tau(t))\\
&+\mathbf{x}^T(t)\mathbf{R}\mathbf{x}(t)-(1-\eta_2)\mathbf{x}^T(t-h(t))\mathbf{R}\mathbf{x}(t-h(t))\\
&+r\,\mathbf{f}^T(\mathbf{x}(t))\mathbf{S}\mathbf{f}(\mathbf{x}(t))-\int_{t-r(t)}^{t}\mathbf{f}^T(\mathbf{x}(s))\mathbf{S}\mathbf{f}(\mathbf{x}(s))\,\mathrm{d}s\\
&-\mathbf{x}^T(t-h(t))\mathbf{U}_1\mathbf{x}(t-h(t))-\mathbf{x}^T(t-\tau(t))\mathbf{U}_2\mathbf{x}(t-\tau(t))\\
&+\bigl[\mathbf{H}_0(t)\mathbf{x}(t)+\mathbf{H}_1(t)\mathbf{x}(t-\tau(t))\bigr]^T\mathbf{P}\bigl[\mathbf{H}_0(t)\mathbf{x}(t)+\mathbf{H}_1(t)\mathbf{x}(t-\tau(t))\bigr]\Bigr\}\,\mathrm{d}t\\
&+2\mathbf{y}^T(t)\mathbf{P}\bigl[\mathbf{H}_0(t)\mathbf{x}(t)+\mathbf{H}_1(t)\mathbf{x}(t-\tau(t))\bigr]\,\mathrm{d}\mathbf{w}(t).
\end{aligned}
\tag{3.7}
\]
From (2.5), for a scalar $\delta>0$, we have
\[
\delta\bigl[\mathbf{x}^T(t-\tau(t))\mathbf{G}^T\mathbf{G}\mathbf{x}(t-\tau(t))-\mathbf{f}^T(\mathbf{x}(t-\tau(t)))\mathbf{f}(\mathbf{x}(t-\tau(t)))\bigr]\ge 0.
\tag{3.8}
\]
Using Lemma 2.4, we have
\[
-\int_{t-r(t)}^{t}\mathbf{f}^T(\mathbf{x}(s))\mathbf{S}\mathbf{f}(\mathbf{x}(s))\,\mathrm{d}s
\le-\Bigl(\int_{t-r(t)}^{t}\mathbf{f}(\mathbf{x}(s))\,\mathrm{d}s\Bigr)^T r^{-1}\mathbf{S}\Bigl(\int_{t-r(t)}^{t}\mathbf{f}(\mathbf{x}(s))\,\mathrm{d}s\Bigr).
\tag{3.9}
\]
Substituting (3.8) and (3.9) into (3.7) and writing the resulting expanded inequality (3.10) compactly as a quadratic form in the vector $\xi(t)$, we obtain
\[
\mathrm{d}V(t,\mathbf{x}(t))\le\xi^T(t)\Sigma\,\xi(t)\,\mathrm{d}t
+2\mathbf{y}^T(t)\mathbf{P}\bigl[\mathbf{H}_0(t)\mathbf{x}(t)+\mathbf{H}_1(t)\mathbf{x}(t-\tau(t))\bigr]\,\mathrm{d}\mathbf{w}(t),
\tag{3.11}
\]
where $\xi^T(t)=\bigl(\mathbf{x}^T(t),\ \mathbf{x}^T(t-h(t)),\ \mathbf{f}^T(\mathbf{x}(t-\tau(t))),\ (\int_{t-r(t)}^{t}\mathbf{f}(\mathbf{x}(s))\,\mathrm{d}s)^T,\ \mathbf{x}^T(t-\tau(t))\bigr)$ and the matrix $\Sigma$ is given in (3.5).
Taking the mathematical expectation, we get
\[
\frac{\mathrm{d}\,E V(t,\mathbf{x}(t))}{\mathrm{d}t}\le E\bigl[\xi^T(t)\Sigma\xi(t)\bigr]\le\lambda_{\max}(\Sigma)\,E\|\mathbf{x}(t)\|^2.
\tag{3.12}
\]
From (3.5), we know $\Sigma<0$, that is, $\lambda_{\max}(\Sigma)<0$. By the Lyapunov-Krasovskii stability theorem, system (2.1) is globally robustly stochastically asymptotically stable in the mean square. The proof is completed.

Remark 3.2. To the best of our knowledge, few authors have considered the stochastically asymptotic stability of uncertain neutral-type neural networks driven by a Wiener process; see the recent papers [18, 22-24]. However, it is assumed in [18] that the system is a linear model and that all delays are constants. In [22], it is assumed that the time-varying delays satisfy $\dot{\tau}(t)\le\eta_1<1$ and $\dot{h}(t)\le\eta_2<1$; in this paper, we relax these conditions to $\dot{\tau}(t)\le\eta_1<\infty$ and $\dot{h}(t)\le\eta_2<\infty$. In [23, 24], the authors discussed the robust stability of uncertain stochastic neural networks of neutral type with time-varying delays, but distributed delays were not taken into account in their models. Hence, the results in this paper have a wider range of applicability.

Remark 3.3. Suppose that $\mathbf{C}=0$ and $\mathbf{D}(t)=0$ (i.e., without the neutral term and distributed delays); then system (2.1) reduces to the one investigated in [15].

Remark 3.4. In [17], the authors studied the global stability of uncertain stochastic neural networks with time-varying delay by the Lyapunov functional method and the LMI technique. However, the neutral term and distributed delays were not taken into account in their model. Therefore, the results developed in this paper are more general than those reported in [17].

Remark 3.5. It should be noted that the conditions are given as linear matrix inequalities (LMIs); therefore, their feasibility can be checked straightforwardly by using the MATLAB LMI Toolbox.

4. Numerical Example

Consider the following uncertain neutral-type delayed neural network:
\[
\begin{aligned}
\mathrm{d}\bigl[\mathbf{x}(t)-\mathbf{C}\mathbf{x}(t-h(t))\bigr]
={}&\Bigl[-\bigl(\mathbf{A}+\mathbf{U}\mathbf{F}(t)\mathbf{M}_1\bigr)\mathbf{x}(t)
+\bigl(\mathbf{B}+\mathbf{U}\mathbf{F}(t)\mathbf{M}_2\bigr)\mathbf{f}(\mathbf{x}(t-\tau(t)))\\
&+\bigl(\mathbf{D}+\mathbf{U}\mathbf{F}(t)\mathbf{M}_3\bigr)\int_{t-r(t)}^{t}\mathbf{f}(\mathbf{x}(s))\,\mathrm{d}s\Bigr]\mathrm{d}t\\
&+\bigl[\mathbf{U}\mathbf{F}(t)\mathbf{M}_4\,\mathbf{x}(t)+\mathbf{U}\mathbf{F}(t)\mathbf{M}_5\,\mathbf{x}(t-\tau(t))\bigr]\,\mathrm{d}\mathbf{w}(t),
\end{aligned}
\tag{4.1}
\]
where $n=2$, $f_i(x_i)=\sin x_i$, $i=1,2$, $\eta_1=0.7$, $\eta_2=0.5$, $0<r(t)\le r=3$, and $\mathbf{F}^T(t)\mathbf{F}(t)\le\mathbf{I}$.

The constant matrices are
\[
\mathbf{A}=\begin{pmatrix}3&0\\0&3\end{pmatrix},\quad
\mathbf{B}=\begin{pmatrix}0.2&0.16\\0.04&0.08\end{pmatrix},\quad
\mathbf{C}=\begin{pmatrix}0.2&0\\0&0.2\end{pmatrix},\quad
\mathbf{D}=\begin{pmatrix}0.04&0.03\\0.02&0.05\end{pmatrix},\quad
\mathbf{U}=\begin{pmatrix}0.1&0.5\\0.5&0.3\end{pmatrix},
\]
\[
\mathbf{M}_1=\begin{pmatrix}0.6&0\\0&0.6\end{pmatrix},\quad
\mathbf{M}_2=\mathbf{M}_3=\mathbf{M}_4=\begin{pmatrix}0.2&0\\0&0.2\end{pmatrix},\quad
\mathbf{M}_5=\begin{pmatrix}0.4&0\\0&0.4\end{pmatrix},\quad
\mathbf{G}=\begin{pmatrix}1&0\\0&1\end{pmatrix}.
\tag{4.2}
\]
By using the MATLAB LMI Control Toolbox, we obtain the following feasible solution: $\delta=2.0876$, $\varepsilon_1=5.0486$, $\varepsilon_2=8.0446$,
\[
\mathbf{P}=\begin{pmatrix}6.8465&0.7257\\0.7257&6.6012\end{pmatrix},\quad
\mathbf{Q}=\begin{pmatrix}9.9371&0.2792\\0.2792&10.4388\end{pmatrix},\quad
\mathbf{R}=\begin{pmatrix}8.0104&2.3936\\2.3936&5.4991\end{pmatrix},
\]
\[
\mathbf{S}=\begin{pmatrix}3.4984&0.8143\\0.8143&1.9588\end{pmatrix},\quad
\mathbf{U}_1=\begin{pmatrix}1.1111&0.3889\\0.3889&1.1060\end{pmatrix},\quad
\mathbf{U}_2=\begin{pmatrix}2.4193&0.6561\\0.6561&2.6270\end{pmatrix}.
\tag{4.3}
\]
That is, system (4.1) is globally robustly stochastically asymptotically stable in the mean square.
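As a small sanity check (an addition, not part of the paper), Theorem 3.1 requires the decision matrices in the feasible solution (4.3) to be symmetric positive definite; the sketch below verifies this numerically from their eigenvalues.

```python
import numpy as np

# Hedged check: the feasible solution (4.3) reported by the LMI toolbox
# should consist of symmetric positive definite matrices.
P  = np.array([[6.8465, 0.7257], [0.7257, 6.6012]])
Q  = np.array([[9.9371, 0.2792], [0.2792, 10.4388]])
R  = np.array([[8.0104, 2.3936], [2.3936, 5.4991]])
S  = np.array([[3.4984, 0.8143], [0.8143, 1.9588]])
U1 = np.array([[1.1111, 0.3889], [0.3889, 1.1060]])
U2 = np.array([[2.4193, 0.6561], [0.6561, 2.6270]])
all_pd = all(np.min(np.linalg.eigvalsh(M)) > 0 for M in (P, Q, R, S, U1, U2))
print(all_pd)
```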

5. Conclusion

In this paper, the stochastically asymptotic stability problem has been studied for a class of uncertain neutral-type delayed neural networks driven by a Wiener process by utilizing a Lyapunov-Krasovskii functional and the linear matrix inequality (LMI) approach. A numerical example has been given to illustrate the applicability of the results.

Acknowledgment

This work was fully supported by the National Natural Science Foundation of China (nos. 10771199 and 10871117).