Abstract

We study the impact of stochastic noise and connection weight matrix uncertainty on the global exponential stability of hybrid BAM neural networks with reaction diffusion terms. Given a globally exponentially stable hybrid BAM neural network with reaction diffusion terms, the question addressed here is how much stochastic noise and connection weight matrix uncertainty the network can tolerate while maintaining global exponential stability. The upper threshold of the stochastic noise and connection weight matrix uncertainty is characterized by transcendental equations. We find that the perturbed hybrid BAM neural networks with reaction diffusion terms preserve global exponential stability if the intensities of both the stochastic noise and the connection weight matrix uncertainty are smaller than the defined upper threshold. A numerical example is provided to illustrate the theoretical conclusions.

1. Introduction

Bidirectional associative memory (BAM) neural networks were first introduced by Kosko; in these networks, the neurons in one layer are fully interconnected to the neurons in the other layer, while there are no interconnections among the neurons within the same layer [1–3]. BAM neural networks have wide applications in pattern recognition, robotics, signal processing, associative memory, optimization, and automatic control engineering. For most successful applications of BAM neural networks, a stability analysis is usually a prerequisite. The exponential stability and periodic oscillatory solutions of BAM neural networks with delays were studied by Cao et al. [4, 5]. Moreover, in BAM neural networks, diffusion phenomena can hardly be avoided when electrons move in asymmetric electromagnetic fields. BAM neural networks with reaction diffusion terms, described by partial differential equations, have been investigated by many authors [6–11]. In addition, the parameters of a neural network may experience abrupt changes caused by phenomena such as component failure or repair, changes of subsystem interconnections, and environmental disturbances. Continuous-time Markov chains have been used to model these parameter jumps [12–14]. Neural networks with such Markovian switching are usually called hybrid neural networks. The almost sure exponential stability, moment exponential stability, and stabilization of hybrid neural networks have also been studied; see, for example, [15–17]. By making use of impulsive control, Zhu and Cao [18] considered the stability of hybrid neural networks with mixed delays.

For neural networks with stochastic noise, the system is usually described by stochastic differential equations. The stability of stochastic neural networks with delays or reaction diffusion terms has been extensively analyzed by using the Itô formula and linear matrix inequality (LMI) methods [18–22]. As is well known, stochastic noise is often a source of instability and may destabilize otherwise stable neural networks [23]. For stable hybrid BAM neural networks with reaction diffusion terms, it is therefore interesting to determine how much noise the stochastic neural networks can tolerate while maintaining global exponential stability.

Moreover, the connection weights of neurons depend on certain resistance and capacitance values, which inevitably involve uncertainty. Robust stability with respect to parameter matrix uncertainty in neural networks has been investigated by many authors [24, 25]. If the uncertainty in the connection weight matrices is too large, the neural networks may become unstable. Therefore, for stable hybrid BAM neural networks with reaction diffusion terms, it is also interesting to determine how much connection weight matrix uncertainty the neural networks can tolerate while maintaining global exponential stability.

In this paper, we study the impact of stochastic noise and connection weight matrix uncertainty on hybrid BAM neural networks with reaction diffusion terms. We give the upper threshold of the stochastic noise and connection weight matrix uncertainty, defined through transcendental equations. We find that the perturbed hybrid BAM neural networks with reaction diffusion terms preserve global exponential stability if the intensities of both the stochastic noise and the connection weight matrix uncertainty are smaller than the defined upper threshold.

The remainder of this paper is organized as follows. Some preliminaries are given in Section 2. Section 3 discusses the impact of stochastic noise on the global exponential stability of these neural networks. Section 4 discusses the joint impact of connection weight matrix uncertainty and stochastic noise on the global exponential stability of these neural networks. Finally, an example with numerical simulation is given in Section 5 to illustrate the effectiveness of the obtained results.

2. Preliminaries

Throughout this paper, unless otherwise specified, let be a complete probability space with a filtration satisfying the usual conditions (i.e., it is increasing and right continuous, and contains all -null sets). Let be a scalar Brownian motion (Wiener process) defined on this probability space. Let denote the transpose of . If is a matrix, its operator norm is denoted by , where is the Euclidean norm. Let , , be a right-continuous Markov chain on the probability space taking values in a finite state space with the generator given by where . Here, is the transition rate from to if , while . We assume that the Markov chain is independent of the Brownian motion . It is well known that almost every sample path of is a right-continuous step function with a finite number of simple jumps in any finite subinterval of .
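For concreteness, the generator of such a Markov chain is usually specified as follows (a standard formulation; the symbols $\Gamma = (\gamma_{ij})$ and $S = \{1, 2, \dots, N\}$ are placeholder notation of ours):
\[
P\{r(t+\Delta)=j \mid r(t)=i\} =
\begin{cases}
\gamma_{ij}\,\Delta + o(\Delta), & i \neq j,\\[2pt]
1 + \gamma_{ii}\,\Delta + o(\Delta), & i = j,
\end{cases}
\qquad
\gamma_{ii} = -\sum_{j \neq i} \gamma_{ij},
\]
where $\Delta > 0$ and $\gamma_{ij} \ge 0$ ($i \neq j$) is the transition rate from state $i$ to state $j$.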

In this paper, we consider the following hybrid BAM neural networks with reaction diffusion terms: where , , , , and the initial value . Here ; is a compact set with smooth boundary in , and .   =   and   =  , , are the states of the th neuron and the th neuron at time and in space , respectively. and denote the signal (activation) functions of the th neuron and the th neuron at time and in space , respectively. and denote the external inputs on the th neuron and the th neuron, respectively. and denote the rates with which the th neuron and the th neuron reset their potential to the resting state in isolation when disconnected from the network and external inputs, respectively. and denote the strength of the th neuron on the th neuron and of the th neuron on the th neuron, respectively. The smooth functions and correspond to the transmission diffusion operators along the th neuron and the th neuron, respectively.
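In placeholder notation of ours (a sketch of the standard hybrid BAM reaction-diffusion model described above; the symbols may differ from those used in (3)), the system has the form
\[
\begin{aligned}
\frac{\partial u_i(t,x)}{\partial t} &= \sum_{k=1}^{l} \frac{\partial}{\partial x_k}\!\left(D_{ik}\frac{\partial u_i(t,x)}{\partial x_k}\right) - a_i(r(t))\,u_i(t,x) + \sum_{j=1}^{m} b_{ji}(r(t))\,f_j\big(v_j(t,x)\big) + I_i,\\
\frac{\partial v_j(t,x)}{\partial t} &= \sum_{k=1}^{l} \frac{\partial}{\partial x_k}\!\left(D_{jk}^{*}\frac{\partial v_j(t,x)}{\partial x_k}\right) - c_j(r(t))\,v_j(t,x) + \sum_{i=1}^{n} d_{ij}(r(t))\,g_i\big(u_i(t,x)\big) + J_j,
\end{aligned}
\]
for $i = 1, \dots, n$, $j = 1, \dots, m$, and $x \in \Omega$, where $r(t)$ denotes the Markov chain introduced above.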

The initial conditions and boundary conditions are given by

The neuron activation functions and are globally Lipschitz continuous; that is, there exist constants and such that

Then, the neural networks (3) have a unique state and for any initial values (see [26, 27]).

In addition, we assume that the neural networks (3) have an equilibrium point , .

Let , , = , = , , and ; then (3) can be rewritten as

The initial conditions and boundary conditions are given by Hence, the origin is an equilibrium point of (6). The stability of the equilibrium point of (3) is equivalent to the stability of the origin of the state space of (6).

From (5), we make the following assumption on the activation functions and .

Assumption (H1). The neuron activation functions and are globally Lipschitz continuous; that is, there exist constants and such that
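In the usual form (with the Lipschitz constants $L_j^f > 0$ and $L_i^g > 0$ as placeholder notation of ours), this assumption reads
\[
|f_j(s_1) - f_j(s_2)| \le L_j^f\,|s_1 - s_2|, \qquad
|g_i(s_1) - g_i(s_2)| \le L_i^g\,|s_1 - s_2|,
\]
for all $s_1, s_2 \in \mathbb{R}$, $i = 1, \dots, n$, and $j = 1, \dots, m$.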

We consider the following function vector space:

For every pair in and every given , define the inner product for and by Obviously, this satisfies the inner product axioms, and the induced norm is given by

Definition 1. The neural networks (6) are said to be globally exponentially stable if, for any , there exist and such that
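A typical formulation of such an estimate (in placeholder notation of ours, with $z(t,x;\phi)$ the state of (6) and $\phi$ the initial function) is
\[
\|z(t,\cdot;\phi)\|^2 \le M\,\|\phi\|^2\, e^{-\lambda (t - t_0)}, \qquad t \ge t_0,
\]
where $\|\cdot\|$ is the norm defined above, $M \ge 1$, and $\lambda > 0$.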
For the purpose of simplicity, we rewrite (6) as follows:
The initial conditions and boundary conditions are given by where Here, denotes the Hadamard product of the matrices and , and and .

3. Noise Impact on Stability

In this section, we consider the noise-induced neural networks (6) described by the stochastic partial differential equations

The initial conditions and boundary conditions are given by where is the noise intensity.

We rewrite (16) as follows: For the globally exponentially stable neural networks (6), we will characterize how much stochastic noise the neural networks (16) can tolerate while maintaining global exponential stability.

Definition 2. The neural networks (16) are said to be almost surely globally exponentially stable if, for any , the Lyapunov exponent

Definition 3. The neural networks (16) are said to be mean square globally exponentially stable if, for any , the Lyapunov exponent where , is the state of the neural networks (16).
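In the standard form (with $z(t,x;\phi)$ denoting the state of (16) in placeholder notation of ours), Definitions 2 and 3 require, respectively,
\[
\limsup_{t \to \infty} \frac{1}{t}\,\log \|z(t,\cdot;\phi)\| < 0 \quad \text{a.s.}, \qquad
\limsup_{t \to \infty} \frac{1}{t}\,\log \mathbb{E}\,\|z(t,\cdot;\phi)\|^2 < 0.
\]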

From the above definitions, the almost sure global exponential stability of the neural networks (16) and the mean square global exponential stability of the neural networks (16) do not, in general, imply each other (see [26, 27]). Under Assumption (H1), however, the following implication holds.

Theorem 4. Under Assumption (H1), the mean square global exponential stability of neural networks (16) implies the almost sure global exponential stability of the neural networks (16).

Proof. For any , we denote the state , of (16) by . By Definition 3, there exist and such that Let . Construct the average Lyapunov functional Let ; by the Itô formula and Assumption (H1), for , By the boundary conditions and the Gauss formula, we get By Hölder's inequality, we have Substituting (24)–(27) into (23), we get where we have used .
From (28), we have where and .
On the other hand, by the Burkholder–Davis–Gundy inequality [27] and , we have Substituting the above inequality into (29), we get By induction and the mean square global exponential stability of the neural networks (16), Let ; by Chebyshev's inequality [27], it follows from (32) that By the Borel–Cantelli lemma [27], for almost all , the inequality holds for all but finitely many . Hence, for almost all (excluding a -null set), there exists an such that the above inequality holds whenever . Consequently, for almost all , if . Therefore,
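In compact form, the closing argument follows the standard pattern (in placeholder notation of ours, with $\gamma > 0$ the mean square decay rate and $C > 0$ a generic constant): from the interval estimate and Chebyshev's inequality,
\[
\mathbb{P}\Big\{\sup_{n \le t \le n+1} \|z(t,\cdot)\|^2 > e^{-\gamma n/2}\Big\}
\le e^{\gamma n/2}\,\mathbb{E}\Big[\sup_{n \le t \le n+1} \|z(t,\cdot)\|^2\Big]
\le C\,e^{-\gamma n/2},
\]
which is summable in $n$; by the Borel–Cantelli lemma, for almost all $\omega$ there exists $n_0(\omega)$ such that $\|z(t,\cdot)\|^2 \le e^{-\gamma n/2}$ for all $t \in [n, n+1]$ with $n \ge n_0$, and hence $\limsup_{t\to\infty} \tfrac{1}{t}\log\|z(t,\cdot)\| \le -\gamma/4 < 0$ almost surely.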

Theorem 5. Let Assumption (H1) hold and let the neural networks (6) be globally exponentially stable. Then, the neural networks (16) are mean square globally exponentially stable and also almost surely globally exponentially stable if there exist and , where is the unique positive solution of the transcendental equation where , , and so forth, and and .

Proof. For any , we denote the state of (16) as and the state of (6) as .
From (6) and (18) and the stochastic Fubini theorem, we have Construct the average Lyapunov functional where .
By applying the generalized Itô formula [27], we have By the boundary conditions and (24), we have By the boundary conditions and (25), we have By Hölder's inequality, we get
From (42)–(45) and Assumption (H1), we obtain that When , we have By the stochastic Fubini theorem, we have By (47), one gets When , by applying Gronwall's inequality, we have By the global exponential stability of (6), we have Moreover, From (37), when , we have Let By (52), we have For any positive integer , from the existence and uniqueness of the flow of (16) (see [28]), when , we have From (55) and (56), Hence, for any , there exists a positive integer such that , and we have where . The above inequality also holds for .
Therefore, the neural networks (16) are mean square globally exponentially stable, and by Theorem 4, the neural networks (16) are also almost surely globally exponentially stable.
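As a practical remark, once the constants entering equation (37) are available, the noise threshold can be obtained with a standard scalar root-finding routine. The following is a minimal sketch; the function F and all numerical constants below are placeholders with the same qualitative shape as (37) (monotone in the noise intensity with a single positive root), not the actual equation of Theorem 5, whose coefficients depend on the network data (the stability constants of (6), the Lipschitz constants, and the connection weight norms).

```python
# Minimal sketch: locating the noise-intensity threshold as the unique positive
# root of a transcendental equation F(sigma) = 0 (placeholder form, not eq. (37)).
import numpy as np
from scipy.optimize import brentq

M, lam = 2.0, 1.0   # placeholder stability constants of the unperturbed network (6)
K = 0.8             # placeholder constant collecting Lipschitz/weight bounds

def F(sigma):
    # Placeholder relation: baseline gain plus noise amplification must stay
    # below 1 for the contraction-type estimate to close; the root is the threshold.
    return K + M * np.exp(lam) * sigma**2 - 1.0

# Bracket the root and solve; brentq needs a sign change on [a, b].
sigma_bar = brentq(F, 0.0, 10.0)
print(f"noise-intensity threshold sigma_bar ~ {sigma_bar:.4f}")
```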

4. Connection Weight Matrices Uncertainty and Noise Impact on Stability

In this section, we first consider the parameter uncertainty intensity added to the self-feedback matrix of the neural networks (16). Then, the neural networks (16) become

The initial conditions and boundary conditions are given by where is the self-feedback matrix uncertainty intensity and is the noise intensity.

We rewrite (59) as follows:

For the globally exponentially stable neural networks (6), we will characterize how much self-feedback matrix uncertainty and stochastic noise the stochastic neural networks (59) can tolerate while maintaining global exponential stability.

Theorem 6. Let Assumption (H1) hold and let the neural networks (6) be globally exponentially stable. Then, the neural networks (59) are mean square globally exponentially stable and also almost surely globally exponentially stable if there exist , and lies in the interior of the closed curve described by the following transcendental equation: where , , and so forth, and and .

Proof. For any , we denote the state of (59) as and the state of (6) as .
From (6) and (61) and the stochastic Fubini theorem, we have Construct the average Lyapunov functional where .
By applying the generalized Itô formula [27], we have By Hölder's inequality, we get From (42), (43), and (67) and Assumption (H1), we obtain that When , we have By the stochastic Fubini theorem, we have By (69), one gets When , by applying Gronwall's inequality, we have By the global exponential stability of (6), we have Moreover, From (62), when lies in the interior of the closed curve described by the transcendental equation, we have Let By (74), we have Similarly to the proof of Theorem 5, we can prove that the neural networks (59) are mean square globally exponentially stable and also almost surely globally exponentially stable.

Next, we consider the parameter uncertainty intensity added to the connection weight matrix of the neural networks (16). Then, the neural networks (16) become

The initial conditions and boundary conditions are given by where is the connection weight matrix uncertainty intensity and is the noise intensity.

We rewrite (78) as follows: For the globally exponentially stable neural networks (6), we will characterize how much connection weight matrix uncertainty and stochastic noise the stochastic neural networks (78) can tolerate while maintaining global exponential stability.

Theorem 7. Let Assumption (H1) hold and let the neural networks (6) be globally exponentially stable. Then, the neural networks (78) are mean square globally exponentially stable and also almost surely globally exponentially stable if there exist , , and lies in the interior of the closed curve described by the following transcendental equation: where , , and so forth, and and .

The proof is similar to the proof of Theorem 6.

5. Illustrative Example

Example 1. Consider the hybrid BAM neural networks with reaction diffusion terms
The initial conditions and boundary conditions are given by where and , , and . According to Theorem 1 in [9] and Theorem 1 in [29], the neural networks (82) are globally exponentially stable with and .
In the presence of stochastic noise and self-feedback matrix uncertainty, the neural networks (82) become According to Theorem 6, let and . From (62), we have Then, we can obtain the closed curve for . Figure 1 depicts the stability region for in (85).
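The boundary curve can also be traced numerically. The following is a minimal sketch in the spirit of Figure 1; the implicit relation G and all constants below are placeholders standing in for the closed-curve condition of (62), not the actual condition, whose coefficients depend on the example data of (85).

```python
# Minimal sketch: visualizing a stability region in the (delta, sigma) plane as the
# set where a placeholder implicit relation G(delta, sigma) < 0 holds.
import numpy as np
import matplotlib.pyplot as plt

M, lam, K = 2.0, 1.0, 0.6   # placeholder constants

def G(delta, sigma):
    # Placeholder closed-curve condition: combined uncertainty/noise gain minus margin.
    return K + M * np.exp(lam) * (delta**2 + sigma**2) - 1.0

delta = np.linspace(0.0, 0.5, 300)
sigma = np.linspace(0.0, 0.5, 300)
D, S = np.meshgrid(delta, sigma)

plt.contourf(D, S, G(D, S), levels=[-10.0, 0.0], alpha=0.4)   # shaded region: G < 0
plt.contour(D, S, G(D, S), levels=[0.0], colors="k")          # boundary curve G = 0
plt.xlabel("self-feedback uncertainty intensity delta")
plt.ylabel("noise intensity sigma")
plt.title("Placeholder stability region (schematic analogue of Figure 1)")
plt.show()
```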

Figures 2, 3, 4, and 5 depict the surface curves of the neural networks (85) with . They show that the neural networks (85) are mean square globally exponentially stable and almost surely globally exponentially stable when the parameters lie in the interior of the curve in Figure 1.
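For readers who wish to reproduce simulations of this type, the following is a minimal sketch of a finite-difference/Euler-Maruyama discretization of a scalar hybrid BAM reaction-diffusion pair with Markovian switching. All coefficients, the two-state generator, the initial profiles, and the multiplicative-noise form are illustrative placeholders, not the data of the example system (85).

```python
# Minimal simulation sketch: one neuron per layer, 1-D domain [0, 1], zero-flux
# boundaries, explicit finite differences in space, Euler-Maruyama in time.
import numpy as np

rng = np.random.default_rng(0)

L, nx, T, dt = 1.0, 51, 5.0, 1e-4
x = np.linspace(0.0, L, nx); dx = x[1] - x[0]
nt = int(T / dt)

# Two-mode placeholder parameters: (a, c) self-feedback rates, (b, d) weights.
a = {1: 3.0, 2: 2.5}; c = {1: 3.0, 2: 2.8}
b = {1: 0.5, 2: 0.4}; d = {1: 0.6, 2: 0.5}
Diff = 0.1                     # diffusion coefficient (placeholder)
sigma = 0.2                    # noise intensity (placeholder)
Q = np.array([[-1.0, 1.0],     # placeholder generator of the two-state Markov chain
              [ 2.0, -2.0]])

f = np.tanh                    # globally Lipschitz activation functions
g = np.tanh

def laplacian(w):
    # Second-order finite difference with zero-flux (reflecting) boundaries.
    wp = np.concatenate(([w[1]], w, [w[-2]]))
    return (wp[2:] - 2.0 * w + wp[:-2]) / dx**2

u = np.cos(np.pi * x)          # placeholder initial profiles
v = np.sin(np.pi * x)
mode = 1

for _ in range(nt):
    # Markov switching: jump with probability ~ (exit rate) * dt.
    if rng.random() < -Q[mode - 1, mode - 1] * dt:
        mode = 2 if mode == 1 else 1
    dWu = rng.normal(0.0, np.sqrt(dt), nx)
    dWv = rng.normal(0.0, np.sqrt(dt), nx)
    du = (Diff * laplacian(u) - a[mode] * u + b[mode] * f(v)) * dt + sigma * u * dWu
    dv = (Diff * laplacian(v) - c[mode] * v + d[mode] * g(u)) * dt + sigma * v * dWv
    u, v = u + du, v + dv

print("final sup-norms:", np.abs(u).max(), np.abs(v).max())
```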

Figures 6, 7, 8, and 9 show the surface curves of the neural networks (85) with . They show that when the conditions in Theorem 6 do not hold, the neural networks (85) may become unstable.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was partially supported by the National Natural Science Foundation of China under Grants nos. 61304068, 61374085, and 61134012 and by the Fundamental Research Funds for the Central Universities under Grant no. 2013QC019.