Special Issue: Filtering and Control for Unreliable Communication 2015
Research Article | Open Access
Chunmei Wu, Junhao Hu, Yan Li, "Robustness Analysis of Hybrid Stochastic Neural Networks with Neutral Terms and Time-Varying Delays", Discrete Dynamics in Nature and Society, vol. 2015, Article ID 278571, 12 pages, 2015. https://doi.org/10.1155/2015/278571
Robustness Analysis of Hybrid Stochastic Neural Networks with Neutral Terms and Time-Varying Delays
We analyze the robustness of global exponential stability of hybrid stochastic neural networks subject to neutral terms and time-varying delays simultaneously. Given a globally exponentially stable hybrid stochastic neural network, we characterize upper bounds on the contraction coefficients of the neutral terms and on the time-varying delays by solving transcendental equations. Moreover, we prove that, for any globally exponentially stable hybrid stochastic neural network, if the contraction coefficients of the additive neutral terms and the time-varying delays are smaller than these upper bounds, then the perturbed neural network remains globally exponentially stable. Finally, a numerical simulation example is given to illustrate the presented criteria.
In the past decades, neural networks have been extensively applied in various areas. Stability is usually a prerequisite for successful applications of neural networks and depends mainly on their parametric configuration. Moreover, it is well known that noise and time delays are often sources of instability and may destroy the stability of a neural network if they exceed certain limits.
Robustness is the classical notion of disturbance rejection: a control system is robust if its performance indicators remain essentially unchanged under perturbations of its characteristics or parameters; equivalently, a robust system is insensitive to such perturbations. In practical problems, perturbations of system characteristics or parameters are often unavoidable. They arise mainly in two ways: first, imprecise measurement causes the design (nominal) values of characteristics or parameters to deviate from their actual values; second, environmental factors acting during system operation cause characteristics or parameters to drift slowly. Robustness has therefore become an important research topic in control theory. The study of robustness has mainly been limited to linear time-invariant control systems, covering stability, astatic behavior, and adaptive control. Robustness problems are closely related to the relative stability of control systems and to the invariance principle; the internal model principle, in particular, plays an important role in the study of robustness.
The robustness of global exponential stability of various neural networks has been widely investigated in recent years. For example, in , the robustness of global exponential stability for hybrid neural networks with noise and delay perturbations was investigated. In , the robustness of global exponential stability of recurrent neural networks with time delays and random disturbances was discussed. In , delay-dependent robust stability of cellular neural networks with discrete and distributed time-varying delays was considered. Shen and Wang  studied the robustness of global exponential stability of recurrent neural networks in the presence of time delays and random disturbances. The stability of such systems often also depends on neutral terms; in , Shen and Wang further analyzed the robustness of global exponential stability of nonlinear systems with time delays and neutral terms.
In the implementation or application of neural networks, abrupt changes of network parameters are not uncommon. In , Liberzon notes that the parameters of a neural network (e.g., connection weights and biases) may change abruptly due to unexpected failures or designed switching. In such cases, the neural network can be represented by a switching model, regarded as a set of parametric configurations that switch from one to another according to a given Markov chain. For instance, the stability of stochastic neural networks with Markovian jumping parameters was analyzed in . In , stability analysis for discrete-time Markovian jump neural networks with mixed time delays was studied. The stability of several hybrid stochastic neural networks was analyzed in [9–13]. In , the global asymptotic stability of cellular neural networks with delays was investigated. In [15, 16], the authors investigated the robustness of global exponential stability of stochastic systems (with Markovian switching) in the presence of time-varying delays or noises. In , the robust stability of switched Hopfield neural networks with time-varying delay under uncertainty was investigated. In [18, 19], the delay-dependent robust stability of uncertain neutral-type stochastic systems with Markovian jumping was studied by using linear matrix inequalities.
In this paper, we characterize the robustness of hybrid stochastic neural networks in the presence of neutral terms and time-varying delays. Upper bounds on the contraction coefficients of the neutral terms and on the time-varying delays are estimated by solving transcendental equations. Moreover, we prove that, for any globally exponentially stable hybrid stochastic neural network, if the additive contraction coefficients of the neutral terms and the time-varying delays are smaller than the upper bounds derived herein, then the perturbed neural network is guaranteed to remain globally exponentially stable.
The remainder of this paper is organized as follows. In Section 2, some preliminaries and necessary notations are given. Section 3 discusses the stability conditions of the neural networks with Markovian switching in the presence of neutral terms and time-varying delays simultaneously. In Section 4, a numerical example is given to substantiate the theoretical results. Finally, concluding remarks are made in Section 5.
2. Preliminaries and Assumptions
Throughout this paper, unless otherwise specified, let $(\Omega, \mathcal{F}, \{\mathcal{F}_t\}_{t\ge 0}, P)$ be a complete probability space with a filtration $\{\mathcal{F}_t\}_{t\ge 0}$ satisfying the usual conditions (i.e., it is right continuous and $\mathcal{F}_0$ contains all $P$-null sets). Let $\tau > 0$, and let $w(t)$ be an $m$-dimensional Brownian motion defined on the probability space. Let $|\cdot|$ be the Euclidean norm in $\mathbb{R}^n$. If $A$ is a vector or matrix, its transpose is denoted by $A^{T}$. If $A$ is a matrix, its trace norm is denoted by $|A| = \sqrt{\operatorname{trace}(A^{T}A)}$. Let $C([-\tau, 0]; \mathbb{R}^n)$ denote the family of continuous functions $\varphi$ from $[-\tau, 0]$ to $\mathbb{R}^n$ with the norm $\|\varphi\| = \sup_{-\tau \le s \le 0}|\varphi(s)|$. For $p > 0$, denote by $L^p_{\mathcal{F}_0}([-\tau, 0]; \mathbb{R}^n)$ the family of all $\mathcal{F}_0$-measurable, $C([-\tau, 0]; \mathbb{R}^n)$-valued random variables $\varphi$ such that $E\|\varphi\|^p < \infty$. Denote by $C^b_{\mathcal{F}_0}([-\tau, 0]; \mathbb{R}^n)$ the family of all $\mathcal{F}_0$-measurable, bounded, and $C([-\tau, 0]; \mathbb{R}^n)$-valued random variables.
Let $r(t)$, $t \ge 0$, be a right-continuous Markov chain on the probability space taking values in a finite state space $S = \{1, 2, \ldots, N\}$ with generator $\Gamma = (\gamma_{ij})_{N \times N}$ given by
$$P\{r(t+\Delta) = j \mid r(t) = i\} = \begin{cases} \gamma_{ij}\Delta + o(\Delta), & i \ne j, \\ 1 + \gamma_{ii}\Delta + o(\Delta), & i = j, \end{cases}$$
where $\Delta > 0$. Here, $\gamma_{ij} \ge 0$ is the transition rate from $i$ to $j$ if $i \ne j$, while $\gamma_{ii} = -\sum_{j \ne i} \gamma_{ij}$. It is known that almost every sample path of $r(t)$ is a right-continuous step function with a finite number of simple jumps in any finite subinterval of $[0, +\infty)$.
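As an aside, such a Markov chain can be simulated by the standard jump-chain construction: the chain holds in its current state for an exponential time whose rate is the negated diagonal entry of the generator, then jumps to another state with probability proportional to the corresponding off-diagonal rate. A minimal Python sketch (the generator values below are illustrative, not taken from this paper):

```python
import random

def simulate_ctmc(Q, i0, T, seed=0):
    """Simulate a right-continuous Markov chain with generator Q on [0, T].

    Q[i][j] (i != j) is the transition rate from state i to state j;
    Q[i][i] is minus the sum of the off-diagonal rates in row i.
    Returns the list of (jump_time, state) pairs, starting with (0.0, i0).
    """
    rng = random.Random(seed)
    t, i, path = 0.0, i0, [(0.0, i0)]
    while True:
        rate = -Q[i][i]                    # total exit rate of state i
        if rate <= 0.0:                    # absorbing state: no more jumps
            break
        t += rng.expovariate(rate)         # exponential holding time
        if t >= T:
            break
        # choose the next state j != i with probability Q[i][j] / rate
        u, acc = rng.random() * rate, 0.0
        for j, q in enumerate(Q[i]):
            if j == i:
                continue
            acc += q
            if u <= acc:
                i = j
                break
        path.append((t, i))
    return path

# Two-state chain with both switching rates equal to 1 (illustrative).
Q = [[-1.0, 1.0], [1.0, -1.0]]
path = simulate_ctmc(Q, 0, 10.0)
```

Each returned pair marks a jump time and the state entered, so the sample path is the right-continuous step function described above.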
In this paper, we consider hybrid stochastic neural networks in the presence of neutral terms and time-varying delays of the form where , , , , is the self-feedback connection weight matrix, is the connection weight matrix, is an -dimensional Brownian motion defined on the probability space , is a time-varying delay, which satisfies , , and . Assume that , , and satisfy the following assumptions.
Assumption 1. For all and , there exists a constant such that moreover, for any ,
Assumption 2. For all and , there exists a positive constant such that moreover, for any ,
Assumption 3. For all , there exists a constant such that moreover,
It is well known that, for any given initial value and , according to Assumptions 1–3, system (2) has a unique state when (see ). Besides, system (2) has a trivial state .
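For intuition, a trajectory of a hybrid stochastic delayed network of this general type can be simulated with an Euler–Maruyama scheme. The sketch below uses a hypothetical two-neuron system with tanh activations, a constant delay, linear multiplicative noise, and a two-state switching signal; none of these parameters come from system (2) itself:

```python
import math, random

def euler_maruyama_hybrid(T=5.0, dt=0.001, tau=0.2, seed=1):
    """Hypothetical sketch of a 2-neuron hybrid stochastic delayed network
    (the drift/diffusion here are illustrative, not the paper's system (2)):
        dx = [-A(r) x + B(r) tanh(x(t - tau))] dt + sigma * x dw,
    where r(t) is a 2-state switching signal approximated on the time grid.
    """
    rng = random.Random(seed)
    A = {0: [[2.0, 0.0], [0.0, 2.0]], 1: [[3.0, 0.0], [0.0, 3.0]]}
    B = {0: [[0.5, -0.3], [0.2, 0.4]], 1: [[0.3, 0.1], [-0.2, 0.5]]}
    sigma = 0.1
    n_steps = int(T / dt)
    d = int(tau / dt)                          # delay measured in grid steps
    hist = [[1.0, -1.0] for _ in range(d + 1)]  # constant initial segment
    r = 0
    for _ in range(n_steps):
        if rng.random() < 1.0 * dt:            # switch with rate 1 (illustrative)
            r = 1 - r
        x, xd = hist[-1], hist[-1 - d]          # current and delayed states
        fd = [math.tanh(v) for v in xd]
        new = []
        for i in range(2):
            drift = -sum(A[r][i][j] * x[j] for j in range(2)) \
                    + sum(B[r][i][j] * fd[j] for j in range(2))
            dw = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment
            new.append(x[i] + drift * dt + sigma * x[i] * dw)
        hist.append(new)
    return hist

traj = euler_maruyama_hybrid()
```

With the strong self-feedback chosen here, the simulated trajectory decays toward the trivial state, consistent with the exponential stability discussed below.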
In the case of no time delay and neutral terms, system (2) takes the following form: From , according to Assumptions 1 and 2, for any given initial value and , system (6) has a unique state , and is the trivial state of system (6). To analyze the stability of systems (6) and (2), we now define the th moment global exponential stability and the almost sure global exponential stability of systems (6) and (2).
Definition 4. Let ; if for any and , there exist positive constants and such that where is the state of system (6); then the state of system (6) is th moment globally exponentially stable.
The state of system (2) is th moment globally exponentially stable if, for any and , there exist and such that ; that is, the Lyapunov exponent , where is the state of system (2).
Definition 5. System (6) is said to be almost surely globally exponentially stable, if, for any and , there exist and such that , holds almost surely, where is the state of system (6).
If, for any and , there exist and such that , holds almost surely; then system (2) is almost surely globally exponentially stable, where is the state of system (2).
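To see numerically what the sample-path exponent in these definitions measures, consider the scalar linear SDE dx = a x dt + b x dw, whose explicit solution x(t) = x0 exp((a - b^2/2) t + b w(t)) has sample Lyapunov exponent a - b^2/2. A short sketch with illustrative values (this is not one of the systems studied in the paper):

```python
import math, random

def sample_lyapunov_exponent(a=-1.0, b=0.5, x0=1.0, T=200.0, dt=0.001, seed=2):
    """Estimate log|x(T)| / T for the scalar linear SDE dx = a*x dt + b*x dw
    using its explicit solution; the exact sample exponent is a - b^2/2.
    """
    rng = random.Random(seed)
    w = 0.0
    for _ in range(int(T / dt)):           # simulate the Brownian path w(T)
        w += rng.gauss(0.0, math.sqrt(dt))
    xT = x0 * math.exp((a - 0.5 * b * b) * T + b * w)
    return math.log(abs(xT)) / T

lam = sample_lyapunov_exponent()           # exact exponent: -1.0 - 0.125 = -1.125
```

Since log|x(T)|/T = (a - b^2/2) + b w(T)/T and w(T)/T vanishes as T grows, the estimate converges to the exact exponent, matching the negative Lyapunov exponent required by the definitions above.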
From the above definitions, the th moment global exponential stability and the almost sure global exponential stability of system (2) do not, in general, imply each other (see ). However, if Assumptions 1–3 hold, we can obtain the following lemma (see ).
Meanwhile, in order to obtain our result, we also need another lemma (see ).
Lemma 7. Let and be such that ; then . In particular, for , equality holds.
3. Robustness Analysis
In this section, we show that if system (6) is th moment globally exponentially stable, then system (2) remains th moment globally exponentially stable provided that the contraction coefficients of the neutral terms and the time-varying delays are sufficiently small.
Theorem 8. Let Assumptions 1–3 hold and suppose that system (6) is th moment globally exponentially stable. Then system (2) is th moment globally exponentially stable and almost surely globally exponentially stable if and , where is the unique positive solution of the transcendental equation and is the unique positive solution of the transcendental equation, where , is an adjustable parameter, , is a step size, and , ,
Proof. Fix , . For simplicity, we write and as and , respectively. From systems (2) and (6), for any , we have . When , by Assumptions 1 and 2, the Hölder inequality, and Lemma 7, we derive , where .
By system (2) and Assumptions 1–3, we get . In addition, for , . By reversing the order of integration, we get . Similarly, we can derive . Therefore, when , by substituting (18)–(20) into (17), we have , where . From (21) and (16), when , we get . Note that ; when , by Assumption 3, we have . By (23) and (24), we further have , where . When , by applying the Gronwall inequality, we get . Therefore, when , . From (12), if , we have ; that is, , where . If , then . Obviously, as .
Moreover, . Therefore, is strictly increasing with respect to . Consequently, by the intermediate value theorem for continuous functions, has a unique positive root .
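Because the function in question is continuous, strictly increasing, and changes sign on the positive axis, its unique positive root can be computed by bisection. The sketch below applies bisection to a hypothetical transcendental equation of the same general flavor; the coefficients are illustrative, not those appearing in Theorem 8:

```python
import math

def bisect_root(h, lo, hi, tol=1e-10):
    """Find the unique root of a continuous, strictly increasing function h
    on [lo, hi], assuming h(lo) < 0 < h(hi), by bisection."""
    assert h(lo) < 0.0 < h(hi)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if h(mid) < 0.0:
            lo = mid          # root lies to the right of mid
        else:
            hi = mid          # root lies to the left of (or at) mid
    return 0.5 * (lo + hi)

# Hypothetical transcendental equation 2*d*exp(3*d) = 1 (coefficients
# illustrative): the left-hand side is 0 at d = 0 and strictly increasing,
# so a unique positive root exists, as in the argument above.
root = bisect_root(lambda d: 2.0 * d * math.exp(3.0 * d) - 1.0, 0.0, 1.0)
```

Strict monotonicity guarantees the bracketing invariant holds at every step, so the method converges to the unique root at one bit of accuracy per iteration.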
When , let . We have ; that is, , where