
Xiaohui Xu, Jibin Yang, Yanhai Xu, "Mean Square Exponential Stability of Stochastic Complex-Valued Neural Networks with Mixed Delays", Complexity, vol. 2019, Article ID 3429326, 20 pages, 2019. https://doi.org/10.1155/2019/3429326

Mean Square Exponential Stability of Stochastic Complex-Valued Neural Networks with Mixed Delays

Academic Editor: Massimiliano Zanin
Received: 30 Jan 2019
Accepted: 04 Apr 2019
Published: 20 Jun 2019

Abstract

This paper investigates the mean square exponential stability problem for a class of complex-valued neural networks with stochastic disturbances and mixed delays, including both time-varying delays and continuously distributed delays. Under assumptions on the stochastic disturbance term that differ from those in the existing literature, some sufficient conditions are derived for assuring the mean square exponential stability of the equilibrium point of the system, based on the vector Lyapunov function method and the differential-integral theorem. The obtained results not only generalize the existing ones, but also reduce the conservatism of previous stability results on complex-valued neural networks with stochastic disturbances. Two numerical examples with simulation results are given to verify the feasibility of the proposed results.

1. Introduction

In recent years, the dynamical behavior analysis of various neural network models defined in the complex number domain has attracted more and more attention due to their extensive applications in many fields, such as filtering, speech synthesis, remote sensing, and signal processing, problems that cannot be handled comprehensively by their counterparts defined in the real number domain [1, 2].

It is well known that, from a practical point of view, delays arise from the processing of information in both biological and man-made neural networks [3, 4]. Besides, neural networks usually have a spatial extent due to the presence of a multitude of parallel pathways with a variety of axon sizes and lengths. There will be a distribution of conduction velocities along these pathways and a distribution of propagation delays, so signal propagation cannot be adequately modeled with discrete delays alone. Therefore, the more appropriate way is to incorporate continuously distributed delays [37]. In [8–11], the authors studied several kinds of complex-valued neural networks with continuously distributed delays, and some significant results were obtained for assuring the stability of the proposed systems.

In fact, most real models of neural networks are affected by many external and internal perturbations of great uncertainty, such as impulsive disturbances [5, 9–15], Markovian jumping parameters [16–19], and parameter uncertainties [20–22]. As Haykin [23] points out, in real nervous systems synaptic transmission is a noisy process brought on by random fluctuations in the release of neurotransmitters and other probabilistic causes. One approach to the mathematical incorporation of such effects is to use probabilistic threshold models. In the current literature, neural network models with external random perturbations are viewed as nonlinear dynamical systems driven by white noise (perturbations of Brownian motion) [3, 24]. Therefore, it is of practical significance to consider the effects of stochastic disturbances on the stability of neural networks with continuously distributed delays.

As far as we know, the existing stability results concerning various neural networks are mainly applicable to neural networks in the real number domain, such as [3, 7, 12–14, 16, 17, 24–31]. Recently, there have been some studies of complex-valued neural networks with stochastic disturbances [32–34]. In [32], the authors established a class of stochastic memristor-based complex-valued neural networks with time-varying delays. In [33], the robust state estimation problem was investigated for a class of complex-valued neural networks involving parameter uncertainties, constant time delays, and stochastic disturbances by resorting to sampled-data information from the available output measurements. However, infinitely distributed delays were not considered in the models of complex-valued neural networks studied in [32, 33]. In [34], passivity analysis was conducted for a class of stochastic memristor-based complex-valued recurrent neural networks with discrete time delays and continuously distributed delays by applying both the scalar Lyapunov function method and the LMI method. The stochastic weighting coefficients in the model of [34] were assumed to be constants in the real number domain, which is conservative and needs to be further generalized to the complex number domain. It is well known that the synchronization problem of chaotic neural networks can be translated into the stability problem of the corresponding error system between the driving and driven systems. In [35], the authors addressed the problem of finite-time synchronization for a class of complex-valued neural networks with stochastic disturbances and mixed delays. However, the stochastic terms in the driving and driven systems were only assumed to be bounded directly, which lacks generality. Besides, the results in [35] cannot be used to judge the stability of the corresponding systems.

Separating the model of complex-valued neural networks into its real and imaginary parts is a routine approach to studying the dynamical behavior of such systems, as in [2, 8, 11, 15, 32–35]. In [2, 8, 11, 15, 32], the complex-valued activation functions were assumed to have existing, continuous, and bounded partial derivatives with respect to the real and imaginary parts of the state variables. As pointed out in [33, 36–38], the assumptions concerning activation functions in [2, 8, 11, 15, 32] impose the additional restriction that the partial derivatives of the real and imaginary parts exist and are continuous.

To the best of the authors' knowledge, there is no research concerning the exponential stability in the mean square sense of complex-valued neural networks with mixed delays (including both time-varying delays and continuously distributed delays) and stochastic disturbances. Therefore, in this paper we establish some new conditions for assuring the stability of such systems. The main advantages and contributions can be listed as follows. (a) Stochastic neural networks with mixed delays in the complex number domain are proposed. (b) Assumptions on the stochastic disturbance term that differ from the existing ones are given in this paper, which are less conservative. (c) Both stochastic disturbances and interval parameter uncertainties are considered in the addressed systems. (d) Some sufficient conditions with simple matrix forms are obtained for ensuring the mean square exponential stability and the robust exponential stability in the mean square sense of the systems, which are easy to check in practice.

2. Notations and Model Descriptions

In this paper, we consider a class of complex-valued neural networks with stochastic disturbances and mixed delays as follows:

$dz_k(t)=\Big[-d_k z_k(t)+\sum_{j=1}^{n}a_{kj}f_j\big(z_j(t)\big)+\sum_{j=1}^{n}b_{kj}f_j\big(z_j(t-\tau(t))\big)+\sum_{j=1}^{n}c_{kj}\int_{-\infty}^{t}k_{kj}(t-s)f_j\big(z_j(s)\big)\,ds+I_k\Big]dt+\sigma_k\big(z_k(t),z_k(t-\tau(t))\big)\,d\omega_k(t),\quad k=1,2,\dots,n.$  (1)

In (1), $z_k(t)$ represents the $k$-th neuron state, $z(t)=(z_1(t),z_2(t),\dots,z_n(t))^{T}\in\mathbb{C}^{n}$, where $\mathbb{C}$ denotes the complex number set and $n$ denotes the neuron number. $A=(a_{kj})_{n\times n}$, $B=(b_{kj})_{n\times n}$, and $C=(c_{kj})_{n\times n}$ represent the connection weight matrices, respectively. $I=(I_1,I_2,\dots,I_n)^{T}$ is the external input vector, where $(\cdot)^{T}$ denotes the transpose of the vector. $f(\cdot)=(f_1(\cdot),f_2(\cdot),\dots,f_n(\cdot))^{T}$ represents the activation function. $D=\mathrm{diag}(d_1,d_2,\dots,d_n)\in\mathbb{R}^{n\times n}$ with $d_k>0$ denotes the neuron self-feedback coefficient matrix, where $\mathbb{R}$ denotes the real number set. $\sigma_k(\cdot)$ denotes the weighted function of the stochastic disturbances, and $\omega(t)=(\omega_1(t),\dots,\omega_n(t))^{T}$ is the Brownian motion defined on a complete probability space $(\Omega,\mathcal{F},P)$ with a natural filtration $\{\mathcal{F}_t\}_{t\ge 0}$ generated by $\{\omega(s):0\le s\le t\}$.
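As a purely illustrative sketch (not one of the paper's numerical examples), the following Euler–Maruyama simulation integrates a small network of the form (1). All parameter values, the activation function, and the noise term below are hypothetical, and the infinitely distributed delay is truncated to a finite history with an exponential kernel.

```python
import numpy as np

# Illustrative Euler-Maruyama simulation of a small network of the form (1).
# Everything below is hypothetical; the distributed delay is truncated to a
# finite history with an exponential kernel k(s) = r*exp(-r*s).
rng = np.random.default_rng(0)
n, dt, T = 2, 1e-3, 10.0
steps, hist = int(T / dt), int(5.0 / dt)

D = np.diag([2.0, 2.5])
A = 0.3 * (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
B = 0.2 * (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
C = 0.1 * (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
I = np.array([0.1 + 0.2j, -0.1 + 0.1j])
tau, r, s0 = 0.5, 2.0, 0.1                # delay, kernel rate, noise intensity

f = lambda z: np.tanh(z.real) + 1j * np.tanh(z.imag)   # assumed activation
kern = r * np.exp(-r * dt * np.arange(hist))            # kernel samples
d_tau = int(tau / dt)

Z = np.zeros((hist + steps, n), dtype=complex)
Z[:hist] = 0.5 + 0.5j                                   # constant initial history

for t in range(hist, hist + steps):
    z = Z[t - 1]
    past = Z[t - hist:t][::-1]                          # Z[t-1], Z[t-2], ...
    conv = (kern[:, None] * f(past)).sum(axis=0) * dt   # distributed-delay term
    drift = -D @ z + A @ f(z) + B @ f(Z[t - 1 - d_tau]) + C @ conv + I
    noise = s0 * (z - Z[t - 1 - d_tau]) * rng.standard_normal(n) * np.sqrt(dt)
    Z[t] = z + drift * dt + noise

print("final state:", Z[-1])
```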

It is assumed that $z_k(s)=\varphi_k(s)$, $s\in(-\infty,0]$, is the initial condition of system (1), where $\varphi_k(s)$ is a continuous function mapping from $(-\infty,0]$ to $\mathbb{C}$, $k=1,2,\dots,n$.

Let $z^{*}=(z_1^{*},z_2^{*},\dots,z_n^{*})^{T}$ be the equilibrium point of (1), where $z_k^{*}=x_k^{*}+\mathrm{i}y_k^{*}$ ($k=1,2,\dots,n$) and $\mathrm{i}$ denotes the imaginary unit; i.e., $\mathrm{i}=\sqrt{-1}$.

Let $z_k(t)=x_k(t)+\mathrm{i}y_k(t)$, where $x_k(t)$ and $y_k(t)$ represent the real part and the imaginary part of $z_k(t)$, respectively. Furthermore, the activation functions are assumed to be expressible as (2) by separating them into their real and imaginary parts:

$f_j(z_j)=f_j^{R}(x_j,y_j)+\mathrm{i}\,f_j^{I}(x_j,y_j),\quad j=1,2,\dots,n,$  (2)

where $f_j^{R}(\cdot,\cdot)$ and $f_j^{I}(\cdot,\cdot)$ represent the real part and the imaginary part of $f_j(z_j)$, respectively.

Next, some assumptions concerning (1) are given to obtain the stability results.

Assumption 1. Suppose that the time-varying delay $\tau(t)$ ($t\ge 0$) is a bounded function with $0\le\tau(t)\le\tau$ for some constant $\tau>0$.

Assumption 2. Suppose that the delay kernels $k_{kj}:[0,\infty)\to[0,\infty)$ ($k,j=1,2,\dots,n$) are piecewise continuous functions and satisfy $\int_{0}^{\infty}k_{kj}(s)e^{\beta s}\,ds=p_{kj}(\beta)$, where $p_{kj}(\beta)$ is continuous on $[0,\delta)$, $\delta>0$, and $p_{kj}(0)=1$.
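As a quick numerical illustration of Assumption 2, the sketch below checks the integral conditions for an assumed exponential kernel $k(s)=re^{-rs}$ (only one admissible choice, not the kernel used in the paper's examples).

```python
import numpy as np
from scipy import integrate

r = 2.0                               # hypothetical kernel rate
k = lambda s: r * np.exp(-r * s)      # exponential delay kernel

def p(beta):
    # p(beta) = integral_0^inf k(s)*exp(beta*s) ds, finite for beta < r
    val, _ = integrate.quad(lambda s: k(s) * np.exp(beta * s), 0, np.inf)
    return val

print(p(0.0))   # ~1.0, so the kernel integrates to one
print(p(0.5))   # finite and continuous in beta on [0, r)
```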

Let $|\cdot|$ denote the modulus of a complex number.

Assumption 3. It is assumed that $f_j(\cdot)$ with the form of (2) satisfies the mean value theorem for multivariable functions. That is, for any given $z_j=x_j+\mathrm{i}y_j$ and $\tilde z_j=\tilde x_j+\mathrm{i}\tilde y_j$ in $\mathbb{C}$, there exist positive constants $\lambda_j^{RR}$, $\lambda_j^{RI}$, $\lambda_j^{IR}$, and $\lambda_j^{II}$ such that

$|f_j^{R}(x_j,y_j)-f_j^{R}(\tilde x_j,\tilde y_j)|\le\lambda_j^{RR}|x_j-\tilde x_j|+\lambda_j^{RI}|y_j-\tilde y_j|,$
$|f_j^{I}(x_j,y_j)-f_j^{I}(\tilde x_j,\tilde y_j)|\le\lambda_j^{IR}|x_j-\tilde x_j|+\lambda_j^{II}|y_j-\tilde y_j|.$

Let , , , .
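The constants appearing in Assumption 3 can be estimated numerically for a concrete activation. The sketch below uses a hypothetical activation $f(z)=\tanh(\operatorname{Re}z)+\mathrm{i}\tanh(\operatorname{Im}z)$ and crude random sampling; it is only an illustration, not part of the paper's method.

```python
import numpy as np

# Estimate the four Lipschitz-type constants of Assumption 3 by sampling,
# for the assumed activation fR(x, y) = tanh(x), fI(x, y) = tanh(y).
rng = np.random.default_rng(0)
fR = lambda x, y: np.tanh(x)
fI = lambda x, y: np.tanh(y)

lam_RR = lam_RI = lam_IR = lam_II = 0.0
for _ in range(20_000):
    x1, y1, x2, y2 = rng.uniform(-5, 5, size=4)
    dx, dy = abs(x1 - x2) + 1e-12, abs(y1 - y2) + 1e-12
    lam_RR = max(lam_RR, abs(fR(x1, y1) - fR(x2, y1)) / dx)
    lam_RI = max(lam_RI, abs(fR(x1, y1) - fR(x1, y2)) / dy)
    lam_IR = max(lam_IR, abs(fI(x1, y1) - fI(x2, y1)) / dx)
    lam_II = max(lam_II, abs(fI(x1, y1) - fI(x1, y2)) / dy)

print(lam_RR, lam_RI, lam_IR, lam_II)   # approx. 1, 0, 0, 1 for this choice
```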

Remark 4. In [2, 8, 11, 15, 32], the activation functions were assumed to have partial derivatives of their real and imaginary parts that exist and are continuous. In fact, this is an additional restriction on activation functions in the complex number domain. In this paper, the mentioned restriction is no longer required.

Assumption 5. Assume that the weighted functions of the stochastic disturbances $\sigma_k(\cdot)$ ($k=1,2,\dots,n$) can be separated into their real parts and imaginary parts in the following form: $\sigma_k=\sigma_k^{R}+\mathrm{i}\,\sigma_k^{I}$, where $\sigma_k^{R}$ and $\sigma_k^{I}$ are real-valued functions. Suppose that there exist nonnegative constants $\rho_k^{RR}$, $\rho_k^{RI}$, $\rho_k^{IR}$, and $\rho_k^{II}$ such that inequalities (5) and (7) hold:

Let , , , .

Next, we will separate (1) into its real part and imaginary part, which yields the system composed of (9) and (10), where $z_k(t)=x_k(t)+\mathrm{i}y_k(t)$. Let $A^{R}$ and $A^{I}$, $B^{R}$ and $B^{I}$, and $C^{R}$ and $C^{I}$ be the real part and imaginary part of the matrices $A$, $B$, and $C$, respectively. Let $I^{R}$ and $I^{I}$ be the real part and imaginary part of the external input $I$.
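As a small illustration of this separation (with hypothetical matrices, not the paper's coefficients), the sketch below checks that a complex linear map $Az$ is reproduced by the real block matrix acting on the stacked real and imaginary parts of $z$.

```python
import numpy as np

# For z = x + i*y and A = A_R + i*A_I, the product A z has real part
# A_R x - A_I y and imaginary part A_I x + A_R y, i.e. the block matrix
# [[A_R, -A_I], [A_I, A_R]] acting on the stacked vector [x; y].
rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
z = rng.standard_normal(n) + 1j * rng.standard_normal(n)

A_R, A_I = A.real, A.imag
block = np.block([[A_R, -A_I], [A_I, A_R]])
xy = np.concatenate([z.real, z.imag])

w = A @ z
assert np.allclose(block @ xy, np.concatenate([w.real, w.imag]))
print("real block form reproduces the complex product")
```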

For a complex number vector $z=(z_1,z_2,\dots,z_n)^{T}$, let $|z|=(|z_1|,|z_2|,\dots,|z_n|)^{T}$ and $\|z\|=\big(\sum_{k=1}^{n}|z_k|^{2}\big)^{1/2}$.

Definition 6. The equilibrium point $z^{*}$ of (1) is said to be exponentially stable in the mean square sense if, for all initial conditions $\varphi$, there exist constants $\alpha>0$ and $M>0$ such that

$E\|z(t)-z^{*}\|^{2}\le M e^{-\alpha t}\sup_{s\in(-\infty,0]}\|\varphi(s)-z^{*}\|^{2},\quad t\ge 0,$

where $\|\cdot\|$ denotes the vector norm defined above.
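As an illustration of how Definition 6 can be checked empirically, the following sketch simulates a hypothetical scalar complex-valued test equation (not system (1)), estimates $E|z(t)-z^{*}|^{2}$ over many sample paths, and fits the exponential decay rate.

```python
import numpy as np

# Monte Carlo estimate of E|z(t) - z*|^2 for the scalar test SDE
# dz = -d*(z - z*) dt + s*(z - z*) dW  (all parameters hypothetical).
rng = np.random.default_rng(2)
d, s, z_star = 1.0 + 0.5j, 0.3, 0.2 + 0.1j
dt, T, paths = 1e-3, 5.0, 2000
steps = int(T / dt)

z = np.full(paths, 1.0 + 1.0j)               # common initial condition
mse = np.empty(steps)
for k in range(steps):
    dW = rng.standard_normal(paths) * np.sqrt(dt)
    z = z + (-d * (z - z_star)) * dt + s * (z - z_star) * dW
    mse[k] = np.mean(np.abs(z - z_star) ** 2)

# log-linear fit of the decay: E|z(t)-z*|^2 ~ M * exp(-alpha * t)
t = dt * np.arange(1, steps + 1)
alpha = -np.polyfit(t, np.log(mse), 1)[0]
print(f"estimated mean-square decay rate: {alpha:.3f}")
```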

Lemma 7 (see [8]). Let $W=(w_{kj})_{n\times n}$ be a real matrix with $w_{kj}\le 0$, $k\ne j$. The following statements are equivalent:
(i) $W$ is an M-matrix.
(ii) The real parts of all eigenvalues of $W$ are positive.
(iii) There exists a positive vector $\xi=(\xi_1,\xi_2,\dots,\xi_n)^{T}$ such that $W\xi>0$.
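A minimal numerical sketch of Lemma 7 for a hypothetical matrix $W$ with non-positive off-diagonal entries, checking conditions (ii) and (iii):

```python
import numpy as np
from scipy.optimize import linprog

W = np.array([[ 3.0, -1.0, -0.5],
              [-0.5,  2.0, -1.0],
              [-1.0, -0.5,  4.0]])

# (ii): all eigenvalues of W have positive real part
print(np.all(np.linalg.eigvals(W).real > 0))

# (iii): find xi > 0 with W @ xi > 0 via a small feasibility LP
#        (minimize 0 subject to W @ xi >= 1, xi >= 1; scaling is free)
n = W.shape[0]
res = linprog(c=np.zeros(n), A_ub=-W, b_ub=-np.ones(n),
              bounds=[(1.0, None)] * n, method="highs")
print(res.success, res.x if res.success else None)
```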

Lemma 8 (see [39]). Let $X_t$ be an Itô process given by $dX_t=u\,dt+v\,dB_t$. Let $g(t,x)$ be twice continuously differentiable on $[0,\infty)\times\mathbb{R}$; then $Y_t=g(t,X_t)$ is again an Itô process and

$dY_t=\frac{\partial g}{\partial t}(t,X_t)\,dt+\frac{\partial g}{\partial x}(t,X_t)\,dX_t+\frac{1}{2}\frac{\partial^{2}g}{\partial x^{2}}(t,X_t)\,(dX_t)^{2},$

where $(dX_t)^{2}$ is computed according to the rules $dt\cdot dt=dt\cdot dB_t=dB_t\cdot dt=0$ and $dB_t\cdot dB_t=dt$.

Remark 9. By substituting (13) into (14) and considering the computation rules in Lemma 8, we can obtain the following equation, which will be used in the proofs of the subsequent theorems.
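For instance (an illustrative application of Lemma 8, not the paper's equation), taking $g(t,x)=e^{\varepsilon t}x^{2}$ and $dX_t=u\,dt+v\,dB_t$ gives

$d\big(e^{\varepsilon t}X_t^{2}\big)=e^{\varepsilon t}\big(\varepsilon X_t^{2}+2X_t u+v^{2}\big)\,dt+2e^{\varepsilon t}X_t v\,dB_t,$

since $(dX_t)^{2}=v^{2}\,dt$ by the rules above; this is the type of computation that arises when an exponential weight multiplies a quadratic Lyapunov-type function.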

3. Main Results on Mean Square Exponential Stability

For simplification, let , , .

In what follows, we will study the mean square exponential stability of the equilibrium point of (1). For analysis convenience, we translate the coordinates of (1). Let $u_k(t)=x_k(t)-x_k^{*}$ and $v_k(t)=y_k(t)-y_k^{*}$, $k=1,2,\dots,n$.

By this translation, (1) can be changed into the system composed of (16) and (17).

Moreover, the initial conditions of (1) are of the forms , .

Obviously, if the zero solution of the system composed of (16) and (17) is exponentially stable in the mean square sense, then the equilibrium point of the system composed of (9) and (10) is also exponentially stable in the mean square sense.

Next, we will give some sufficient conditions for assuring the mean square exponential stability of the equilibrium point of (1). First of all, we let

Theorem 10. If Assumptions 1–5 are satisfied and the matrix defined above is an M-matrix, then the equilibrium point of (1) is exponentially stable in the mean square sense for any external input $I$.

Proof. Because the matrix is an M-matrix, according to Lemma 7 there exists a positive vector $\xi$ such that the following inequalities hold, where and , when ; and when :
Construct the following functions:
Because and are continuous with respect to , and , , there exist constants and , such that and , .
Obviously, there exists a positive constant satisfying such that and hold; that is,
Choose a candidate vector Lyapunov function as follows:
i.e.,