Abstract and Applied Analysis

Volume 2014, Article ID 560861, 12 pages

http://dx.doi.org/10.1155/2014/560861

## Robust Stability Analysis of Neutral-Type Hybrid Bidirectional Associative Memory Neural Networks with Time-Varying Delays

^{1}College of Automation, Chongqing University, Chongqing 400044, China
^{2}Department of Mathematics and Information Engineering, Chongqing University of Education, Chongqing 400065, China
^{3}School of Engineering, University of Guelph, Guelph, ON, Canada N1G 2W1

Received 5 March 2014; Accepted 2 May 2014; Published 27 May 2014

Academic Editor: Gani Stamov

Copyright © 2014 Wei Feng et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

The global asymptotic robust stability of the equilibrium point is considered for neutral-type hybrid bidirectional associative memory neural networks with time-varying delays and parameter uncertainties. The results obtained in this paper are delay-derivative-dependent and establish various relationships between the network parameters only. Therefore, they are applicable to a larger class of neural networks and can be easily verified when compared with previously reported literature results. Two numerical examples are presented to verify our results.

#### 1. Introduction

Stability analysis of neural networks is an issue of both theoretical and practical importance due to the fact that in some applications the designed neural network is required to have a unique and stable equilibrium point [1–3]. Time delays are unavoidably encountered in the implementation of neural networks, which may cause undesirable dynamic network behaviors such as oscillation and instability. On the other hand, in practice, the weight coefficients of the neurons depend on certain resistance and capacitance values which are subject to uncertainties. In the design of neural networks, it is important to ensure that the system is stable with respect to these uncertainties.

It is well known that a series of neural networks related to bidirectional associative memory (BAM) models have been proposed by Kosko [4, 5]. These models generalized the single-layer autoassociative Hebbian correlation to a two-layer pattern-matched heteroassociative circuit. This class of networks has been successfully applied to pattern recognition and artificial intelligence. A great number of results for BAM neural networks concerning the existence of equilibrium point and global asymptotic or robust stability have been derived [6–32].

Moreover, due to the complicated dynamic properties of neural cells in the real world, the existing neural network models in many cases cannot characterize the properties of a neural reaction process precisely. It is therefore natural and important for the system model to contain some information about the derivative of the past state, so as to further describe and model the dynamics of such complex neural reactions [33, 34]. However, the stability analysis of BAM neural networks of neutral type has been investigated by only a few researchers [18, 35–37].

However, the existing stability results [18, 36, 37] derived for BAM neural networks are applicable only when a purely delayed neural network model is considered. Recently, a more general class of BAM neural network models, called hybrid BAM neural networks, in which both instantaneous and delayed signaling occur, was considered, and some sufficient conditions for the robust stability of this class of BAM neural networks have been presented [23, 25, 38]. Up to now, however, there have been few results on the stability of neutral-type hybrid BAM neural networks with time-varying delays.

Motivated by the preceding discussion, in this paper we deal with the problem of global asymptotic robust stability for neutral-type hybrid bidirectional associative memory neural networks with time-varying delays and parameter uncertainties. By constructing a novel Lyapunov functional, novel delay-derivative-dependent criteria are derived. Finally, two examples are provided to demonstrate the effectiveness of the obtained results.

Throughout this paper, we will use the following notation: let $x = (x_{1}, x_{2}, \ldots, x_{n})^{T}$ be a column vector and let $A = (a_{ij})_{n \times n}$ be a real matrix. The three commonly used vector norms $\|x\|_{1}$, $\|x\|_{2}$, and $\|x\|_{\infty}$ are defined as
$$\|x\|_{1} = \sum_{i=1}^{n}|x_{i}|, \qquad \|x\|_{2} = \sqrt{\sum_{i=1}^{n}x_{i}^{2}}, \qquad \|x\|_{\infty} = \max_{1 \leq i \leq n}|x_{i}|.$$

The three commonly used matrix norms $\|A\|_{1}$, $\|A\|_{2}$, and $\|A\|_{\infty}$ are defined as follows:
$$\|A\|_{1} = \max_{1 \leq j \leq n}\sum_{i=1}^{n}|a_{ij}|, \qquad \|A\|_{2} = \sqrt{\lambda_{\max}(A^{T}A)}, \qquad \|A\|_{\infty} = \max_{1 \leq i \leq n}\sum_{j=1}^{n}|a_{ij}|.$$

For the vector $x$, $|x|$ will denote $|x| = (|x_{1}|, |x_{2}|, \ldots, |x_{n}|)^{T}$. For the matrix $A$, the matrix $|A|$ will denote $|A| = (|a_{ij}|)_{n \times n}$, and $\lambda_{\min}(A)$ and $\lambda_{\max}(A)$ will denote the minimum and maximum eigenvalues of $A$, respectively. If $A$ and $B$ are two real symmetric matrices, then $A \geq B$ will imply that $x^{T}(A - B)x \geq 0$, $\forall x \in \mathbb{R}^{n}$.
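These standard norms can be computed directly. The following sketch (our own illustration; NumPy is assumed and is not part of the paper) evaluates each definition on a small vector and matrix:

```python
import numpy as np

# A sample vector and matrix for illustrating the standard norms.
v = np.array([3.0, -4.0])
A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

# Vector norms: ||v||_1 = sum |v_i|, ||v||_2 = Euclidean norm, ||v||_inf = max |v_i|.
v1 = np.sum(np.abs(v))           # -> 7.0
v2 = np.sqrt(np.sum(v ** 2))     # -> 5.0
vinf = np.max(np.abs(v))         # -> 4.0

# Matrix norms: ||A||_1 = maximum absolute column sum,
# ||A||_2 = sqrt(lambda_max(A^T A)), ||A||_inf = maximum absolute row sum.
A1 = np.max(np.sum(np.abs(A), axis=0))             # -> 6.0
A2 = np.sqrt(np.max(np.linalg.eigvalsh(A.T @ A)))  # spectral norm
Ainf = np.max(np.sum(np.abs(A), axis=1))           # -> 7.0
```

The eigenvalue form of $\|A\|_{2}$ agrees with NumPy's built-in spectral norm `np.linalg.norm(A, 2)`.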

#### 2. Problem Formulation

Dynamical behavior of a neutral-type hybrid BAM neural network with time-varying delays is described by the following set of differential equations: in which and are the neuron state vectors, and denote the neuron charging time constants and passive decay rates, respectively, , , , and are the connection weights at the time , and represent the activation functions of the neurons and the propagational signal functions, respectively, and , , denote the external inputs. and are positive constants which correspond to the finite speed of axonal signal transmission.

It will be assumed that , , , , , , , and in system (3) are uncertain but bounded and belong to the following intervals:

(H1) and are differentiable functions that satisfy for all and prescribed scalars , , , and .

The activation functions satisfy the following properties.

(H2) There exist some positive constants , , and , , such that for all .

(H3) There exist positive constants , , and , , such that and for all . Note that this assumption implies that the activation functions are bounded.
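As a concrete illustration (our own, not from the paper), $f(x) = \tanh(x)$ satisfies an (H2)-type global Lipschitz condition with constant 1 and an (H3)-type boundedness condition with bound 1; a quick numerical check:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-10.0, 10.0, 1000)
y = rng.uniform(-10.0, 10.0, 1000)

# (H2)-type slope condition: |tanh(x) - tanh(y)| <= 1 * |x - y|,
# since sup_x |d/dx tanh(x)| = sup_x (1 - tanh(x)^2) = 1.
lipschitz_ok = np.all(np.abs(np.tanh(x) - np.tanh(y)) <= np.abs(x - y) + 1e-12)

# (H3)-type boundedness: |tanh(x)| <= 1 for all x.
bounded_ok = np.all(np.abs(np.tanh(x)) <= 1.0)
```

Sigmoidal activations of this kind are the usual choice satisfying both assumptions simultaneously.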

Assume that and are the equilibrium points of the system. In order to simplify our analysis, we transform the equilibrium points to the origin by the relationship

Then, the transformed system is as follows: where , = , , , , . The functions , and are of the form

It can be verified that the functions and satisfy the assumptions on and ; that is, , , and , implies that and , respectively. We also note that and , .

By assumption (H2) and the above equations, we can have

#### 3. Preliminaries

In this paper, we will assume that the norms of the matrices , , , and are bounded. Based on this property, we can directly observe the following facts.

*Fact 1.* If , , , and satisfy the parameter ranges defined by (4) and have bounded norms, then there exist some positive constants , , , and

Lemma 1 (Faydasicok and Arik [39]). *For , the following inequality holds:
**
where and .*

Lemma 2 (Cao et al. [40]). *For , the following inequality holds:
**
where and .*

Lemma 3 (Ensari and Arik [41]). *For , the following inequality holds:
**
where and .*

Lemma 4 (Singh [42]). *For , the following inequality holds:
**
where with .*
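Lemmas 1–4 each bound the spectral norm $\|A\|_{2}$ of an interval matrix $A \in [\underline{A}, \overline{A}]$ in terms of the midpoint matrix $A^{*} = (\overline{A} + \underline{A})/2$ and radius matrix $A_{*} = (\overline{A} - \underline{A})/2$. As an illustration of this family of results, the simple estimate $\|A\|_{2} \leq \|A^{*}\|_{2} + \|A_{*}\|_{2}$ can be checked numerically (a sketch of our own; the exact bounds in Lemmas 1–4 differ from lemma to lemma):

```python
import numpy as np

rng = np.random.default_rng(1)

# Entrywise interval bounds A_lower <= A <= A_upper, as in (4).
A_lower = np.array([[-1.0, 0.5], [0.0, -2.0]])
A_upper = np.array([[ 2.0, 1.5], [1.0,  1.0]])

A_star = (A_upper + A_lower) / 2.0  # midpoint matrix A*
A_dev  = (A_upper - A_lower) / 2.0  # radius matrix A_* (entrywise nonnegative)

# Simple interval bound: ||A||_2 <= ||A*||_2 + ||A_*||_2 for every A in the interval,
# since A = A* + Delta with |Delta| <= A_* entrywise, hence ||Delta||_2 <= ||A_*||_2.
bound = np.linalg.norm(A_star, 2) + np.linalg.norm(A_dev, 2)

# Sample matrices from the interval and verify the bound is never exceeded.
ok = True
for _ in range(200):
    A = rng.uniform(A_lower, A_upper)
    ok &= np.linalg.norm(A, 2) <= bound + 1e-9
```

The tighter bounds of Lemmas 1–4 refine this estimate; all of them are computable from $A^{*}$ and $A_{*}$ alone, which is what makes the criteria of this paper checkable without LMI solvers.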

Lemma 5. *For any two vectors $x, y \in \mathbb{R}^{n}$, the following inequality holds:
$$2x^{T}y \leq \varepsilon x^{T}x + \frac{1}{\varepsilon}y^{T}y,$$
where $\varepsilon$ is any positive constant.*
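Lemma 5 is the standard Young-type bound $2x^{T}y \leq \varepsilon x^{T}x + (1/\varepsilon)y^{T}y$, which follows from $0 \leq \|\sqrt{\varepsilon}\,x - y/\sqrt{\varepsilon}\|_{2}^{2}$. A small numerical sanity check of this inequality (assuming this standard form; illustrative only):

```python
import numpy as np

rng = np.random.default_rng(2)

# Young-type bound: 2 x^T y <= eps * x^T x + (1/eps) * y^T y for any eps > 0,
# obtained by expanding 0 <= || sqrt(eps) * x - y / sqrt(eps) ||^2.
ok = True
for _ in range(500):
    x = rng.standard_normal(5)
    y = rng.standard_normal(5)
    eps = rng.uniform(0.1, 10.0)
    ok &= 2.0 * (x @ y) <= eps * (x @ x) + (y @ y) / eps + 1e-12
```

Bounds of this type are used below to decouple the cross terms that arise when differentiating the Lyapunov functional.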

#### 4. Global Robust Stability Results

Note that the equilibrium point of system (3) is globally asymptotically stable, if the origin of system (8) is a globally asymptotically stable equilibrium point. Therefore, in order to prove the global asymptotic stability of the equilibrium point of system (3), it will be sufficient to prove the global asymptotic stability of the origin of system (8). We can now proceed with the following result.

Theorem 6. *For given scalars and , let the activation functions satisfy assumptions (H2) and (H3) and let the network parameters satisfy (4). Then, the origin of neural network model (8) is globally asymptotically stable, if there exist positive diagonal matrices and , positive definite matrices , and , and four positive scalars , , , and , such that
**
where
*

*Proof. *Define the following positive definite Lyapunov functional:

The derivative of along the trajectories of the system is obtained as follows:

We can write the following inequalities:

Combining (22)–(31) into (21) and considering
we have

Let , , , and ; then can be written in the form

By Fact 1, , , , , , and , one obtains
or equivalently

Clearly, and , imply that . On the other hand, as , , meaning that the Lyapunov functional used for the stability analysis is radially unbounded. Then, by standard Lyapunov theory, it is concluded that the origin of system (8), or equivalently the equilibrium point of system (3), is globally asymptotically stable. This completes the proof of Theorem 6.

*Remark 7. *The stability results presented in [18, 36, 37] considered a purely delayed neural network model and are expressed in linear matrix inequality (LMI) form. The LMI approach to the stability problem of neutral-type neural networks involves some difficulty in determining the constraint conditions on the network parameters, as it requires testing the positive definiteness of high-dimensional matrices. In contrast, Theorem 6 considers hybrid BAM neural networks and establishes various relationships between the network parameters only. Therefore, the results of this paper are applicable to a larger class of neural networks and can be easily verified when compared with the previously reported literature results.

*Choosing , , , and in the conditions of Theorem 6 as , , , and , we can express some special cases of Theorem 6 as follows.*

*Corollary 8. For given scalars and , let the activation functions satisfy assumptions (H2) and (H3) and let the network parameters satisfy (4). Then, the origin of neural network model (8) is globally asymptotically stable, if there exist eight positive scalars , , , , , , , and , such that
and the other parameters are defined in Theorem 6.*

*By setting , the stability criterion for hybrid BAM neural network with constant time delays is established from Theorem 6.*

*Corollary 9. Let the activation functions satisfy assumptions (H2) and (H3) and let the network parameters satisfy (4). Then, the origin of neural network model (8) is globally asymptotically stable, if there exist eight positive scalars , , , , , , , and , such that
and the other parameters are defined in Theorem 6.*

*Assume that there are no neutral terms and the system of BAM neural networks is described as
*

*Define the following positive definite Lyapunov functional:
*

*Following a similar line to the proof of Theorem 6, Corollary 10 is derived as follows.*

*Corollary 10. For given scalars , and , let the activation functions satisfy assumptions (H2) and (H3) and let the network parameters satisfy (4). Then, the origin of neural network model (8) is globally asymptotically stable, if there exist four positive scalars , , , and , such that
and the other parameters are defined in Theorem 6.*

#### 5. Comparative Numerical Examples

We will now give the following examples to demonstrate the applicability and advantages of our results.

*Example 1. *Assume that the network parameters of neural system (8) are given as follows:
where is a real number. We can conclude that the matrices , , , , , , , and are in the forms

Then we obtain

Since , we obtain .

*For the sufficiently small values of , , *