
Abstract and Applied Analysis

Volume 2013 (2013), Article ID 576721, 13 pages

http://dx.doi.org/10.1155/2013/576721

## Global Exponential Stability Criteria for Bidirectional Associative Memory Neural Networks with Time-Varying Delays

J. Thipcha and P. Niamsup

^{1}Department of Mathematics, Chiang Mai University, Chiang Mai 50200, Thailand

^{2}Centre of Excellence in Mathematics CHE, Si Ayutthaya Road, Bangkok 10400, Thailand

Received 6 February 2013; Accepted 29 April 2013

Academic Editor: Yuming Chen

Copyright © 2013 J. Thipcha and P. Niamsup. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

The global exponential stability of bidirectional associative memory (BAM) neural networks with time-varying delays is studied. In our study, the lower and upper bounds of the activation functions are allowed to be positive, negative, or zero. By constructing a new and improved Lyapunov-Krasovskii functional and introducing free-weighting matrices, a new and improved delay-dependent exponential stability criterion for BAM neural networks with time-varying delays is derived in the form of a linear matrix inequality (LMI). Numerical examples are given to demonstrate that the derived condition is less conservative than some existing results in the literature.

#### 1. Introduction

A class of neural networks related to bidirectional associative memory (BAM) was introduced by Kosko [1]. This model generalizes the single-layer autoassociative Hebbian correlator to a two-layer pattern-matched heteroassociative circuit. It is an important model with the ability of information memory and information association, which is crucial for applications such as pattern recognition, optimization, and automatic control engineering [2–10]. In [1, 11], Kosko investigated the global stability of BAM models and obtained a severe constraint: the connection weight matrix must be symmetric. Since it is impossible to maintain an absolutely symmetric connection weight matrix in practice, asymmetric connections have become a focus of this field. Some of these applications require a well-defined computable solution for all possible initial states. From a mathematical point of view, this means that the equilibrium point of the designed cellular neural networks (CNNs) is globally asymptotically stable (GAS) or globally exponentially stable (GES). Moreover, in biological and artificial neural networks, time delays arise in the process of information transmission; for example, in the electronic implementation of analogue neural networks, time delays occur in the communication and response of neurons owing to the finite switching speed of amplifiers. It is known that such delays can create oscillatory or unstable phenomena. Therefore, the study of the stability and convergent dynamics of BAM neural networks with delays has attracted considerable interest in recent years; see, for example, [5, 7, 9, 10, 12–23] and the references cited therein. In [14, 15, 18, 20–22, 24–27], several sufficient conditions on the global exponential stability of BAM neural networks with time-varying delays have been derived.
It is worth pointing out that the criteria given in [14, 15, 18, 20–22, 24–27] require the following hypotheses: the time-varying delays are continuously differentiable, the derivative of the time-varying delays is smaller than one, and the activation functions are bounded and monotonically nondecreasing. The common approach for studying stability of BAM neural networks is Lyapunov stability theory. With a properly designed Lyapunov-Krasovskii functional, together with free-weighting matrices, one may derive stability criteria in terms of linear matrix inequalities (LMIs), which can be solved efficiently by several available algorithms.
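For a flavor of how such LMI-based certificates are checked in practice, consider the simplest case: a linear system x′ = Ax is globally exponentially stable if and only if the LMI AᵀP + PA < 0, P > 0 is feasible. A minimal NumPy sketch (with a made-up stable matrix A, not any system from this paper) solves the corresponding Lyapunov equation and verifies the certificate:

```python
import numpy as np

# Hypothetical stable system matrix (illustration only, not from the paper).
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])
n = A.shape[0]

# Solve the Lyapunov equation A^T P + P A = -I by vectorization:
# (I (x) A^T + A^T (x) I) vec(P) = vec(-I).
K = np.kron(np.eye(n), A.T) + np.kron(A.T, np.eye(n))
P = np.linalg.solve(K, -np.eye(n).reshape(-1)).reshape(n, n)
P = 0.5 * (P + P.T)  # symmetrize against round-off

# P > 0 together with A^T P + P A = -I < 0 certifies exponential stability.
assert np.all(np.linalg.eigvalsh(P) > 0)
assert np.linalg.norm(A.T @ P + P @ A + np.eye(n)) < 1e-10
```

Dedicated solvers (such as the MATLAB LMI Toolbox used in the numerical examples, or interior-point SDP solvers) handle general matrix-variable LMIs of the kind appearing in delay-dependent criteria; the vectorization trick above only works for equality-constrained Lyapunov equations.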

Based on the above discussion, we study the problem of global exponential stability of BAM neural networks with time-varying delays and generalized activation functions. The main contributions of our work are that the system contains both memoryless and delayed activation functions, and that the lower and upper bounds of the activation functions are allowed to be positive, negative, or zero, which is more general than the systems considered in [14, 15, 18, 21, 22, 24–27]. By constructing a new and improved Lyapunov-Krasovskii functional which contains integral terms of the activation functions, introducing appropriate free-weighting matrices, and using an improved integral inequality, less conservative results are obtained. Finally, two numerical examples are presented to show that our result is less conservative than some existing ones.

*Notations*. Throughout the paper, $\mathbb{R}$ denotes the set of all real numbers. $*$ denotes the elements below the main diagonal of a symmetric block matrix. $\operatorname{diag}\{\cdots\}$ denotes a diagonal matrix. For symmetric matrices $X$ and $Y$, the notation $X > Y$ (resp., $X \ge Y$) means that the matrix $X - Y$ is positive definite (resp., nonnegative). $\lambda_{\min}(\cdot)$ and $\lambda_{\max}(\cdot)$ denote the smallest and largest eigenvalues of a given square matrix, respectively.

#### 2. Model Description and Preliminaries

Consider the following BAM neural network with time-varying delays:
$$
\begin{aligned}
\dot{x}_i(t) &= -a_i x_i(t) + \sum_{j=1}^{m} b_{ji}\,\tilde{f}_j(y_j(t)) + \sum_{j=1}^{m} \bar{b}_{ji}\,\tilde{f}_j(y_j(t-\sigma(t))) + I_i, \quad i = 1,\dots,n, \\
\dot{y}_j(t) &= -c_j y_j(t) + \sum_{i=1}^{n} d_{ij}\,\tilde{g}_i(x_i(t)) + \sum_{i=1}^{n} \bar{d}_{ij}\,\tilde{g}_i(x_i(t-\tau(t))) + J_j, \quad j = 1,\dots,m,
\end{aligned} \tag{1}
$$
where $x_i(t)$ and $y_j(t)$ are the states of the $i$th neuron from the neural field $F_X$ and the $j$th neuron from the neural field $F_Y$ at time $t$, respectively; $a_i$ and $c_j$ denote the neuron charging time constants and passive decay rates, respectively; $b_{ji}$ and $d_{ij}$ are the synaptic connection strengths; $\bar{b}_{ji}$ and $\bar{d}_{ij}$ are the delayed synaptic connection strengths; $\tilde{f}_j$ and $\tilde{g}_i$ denote the activation functions of the $j$th neurons from the neural field $F_Y$ and the $i$th neurons from the neural field $F_X$, respectively; $I_i$ and $J_j$ denote the external inputs; and $\tau(t)$ and $\sigma(t)$ represent the time-varying differentiable delay functions, which satisfy
$$0 \le \tau(t) \le \tau, \quad \dot{\tau}(t) \le \tau_d, \qquad 0 \le \sigma(t) \le \sigma, \quad \dot{\sigma}(t) \le \sigma_d,$$
where $\tau$, $\sigma$, $\tau_d$, and $\sigma_d$ are positive scalars.
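The qualitative behavior of such a delayed BAM model can be illustrated with a small simulation. The sketch below uses hypothetical parameters chosen for illustration (not the paper's example systems), tanh activations, zero external inputs, constant delays, and forward Euler integration with history buffers:

```python
import numpy as np

# Forward-Euler simulation of a small delayed BAM network with
# hypothetical parameters (illustration only, not the paper's example):
# u'(t) = -A u(t) + B f(v(t)) + B1 f(v(t - tau)),
# v'(t) = -C v(t) + D g(u(t)) + D1 g(u(t - tau)),  with f = g = tanh.
A = np.diag([1.0, 1.2]); C = np.diag([1.1, 1.0])
B  = 0.1 * np.array([[1.0, -0.5], [0.3, 0.8]])
B1 = 0.1 * np.array([[0.4, 0.2], [-0.6, 0.5]])
D  = 0.1 * np.array([[0.7, 0.1], [-0.2, 0.9]])
D1 = 0.1 * np.array([[0.3, -0.4], [0.5, 0.2]])
tau = 0.5                         # constant delay for simplicity
h, T = 0.01, 20.0
steps, d = int(T / h), int(tau / h)

# History buffers hold the trajectory over [-tau, 0].
us = [np.array([0.8, -0.6])] * (d + 1)
vs = [np.array([-0.5, 0.7])] * (d + 1)
for _ in range(steps):
    u, v = us[-1], vs[-1]
    ud, vd = us[-1 - d], vs[-1 - d]   # delayed states
    us.append(u + h * (-A @ u + B @ np.tanh(v) + B1 @ np.tanh(vd)))
    vs.append(v + h * (-C @ v + D @ np.tanh(u) + D1 @ np.tanh(ud)))

# With weak coupling the origin attracts exponentially.
assert np.linalg.norm(us[-1]) + np.linalg.norm(vs[-1]) < 1e-3
```

Because the coupling matrices here are small relative to the decay rates, trajectories converge to the origin regardless of the delay; the stability criteria of the paper give verifiable sufficient conditions for this behavior without simulation.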

The initial conditions associated with (1) are assumed to be
$$x_i(s) = \phi_i(s), \quad y_j(s) = \psi_j(s), \quad s \in [-\rho, 0], \quad \rho = \max\{\tau, \sigma\},$$
where $\phi_i(\cdot)$ and $\psi_j(\cdot)$ are continuous functions.

Throughout this paper, we make the following assumptions on the activation functions $\tilde{f}_j$, $\tilde{g}_i$:

(A1) $\tilde{f}_j$ and $\tilde{g}_i$ are bounded on $\mathbb{R}$.

(A2) For any $a, b \in \mathbb{R}$, $a \neq b$, there exist four constant matrices $F^{-}$, $F^{+}$, $G^{-}$, and $G^{+}$ satisfying
$$f_j^{-} \le \frac{\tilde{f}_j(a) - \tilde{f}_j(b)}{a - b} \le f_j^{+}, \qquad g_i^{-} \le \frac{\tilde{g}_i(a) - \tilde{g}_i(b)}{a - b} \le g_i^{+},$$
where $F^{-} = \operatorname{diag}\{f_1^{-}, \dots, f_m^{-}\}$, $F^{+} = \operatorname{diag}\{f_1^{+}, \dots, f_m^{+}\}$, $G^{-} = \operatorname{diag}\{g_1^{-}, \dots, g_n^{-}\}$, and $G^{+} = \operatorname{diag}\{g_1^{+}, \dots, g_n^{+}\}$.

It is clear that under (A1) and (A2), the system (1) has at least one equilibrium; see [20]. In order to simplify our proof, we shift the equilibrium point $(x^{*}, y^{*})$ of system (1) to the origin. Let $u_i(t) = x_i(t) - x_i^{*}$, $v_j(t) = y_j(t) - y_j^{*}$, $f_j(v_j) = \tilde{f}_j(v_j + y_j^{*}) - \tilde{f}_j(y_j^{*})$, and $g_i(u_i) = \tilde{g}_i(u_i + x_i^{*}) - \tilde{g}_i(x_i^{*})$. Then the system (1) can be transformed to
$$
\begin{aligned}
\dot{u}_i(t) &= -a_i u_i(t) + \sum_{j=1}^{m} b_{ji} f_j(v_j(t)) + \sum_{j=1}^{m} \bar{b}_{ji} f_j(v_j(t - \sigma(t))), \\
\dot{v}_j(t) &= -c_j v_j(t) + \sum_{i=1}^{n} d_{ij} g_i(u_i(t)) + \sum_{i=1}^{n} \bar{d}_{ij} g_i(u_i(t - \tau(t))).
\end{aligned} \tag{7}
$$
The activation functions $f_j$ and $g_i$ satisfy the following properties:

(H1) $f_j$ and $g_i$ are bounded on $\mathbb{R}$.

(H2) For any $a, b \in \mathbb{R}$, $a \neq b$, there exist constant matrices $F^{-}$, $F^{+}$, $G^{-}$, and $G^{+}$ satisfying
$$f_j^{-} \le \frac{f_j(a) - f_j(b)}{a - b} \le f_j^{+}, \qquad g_i^{-} \le \frac{g_i(a) - g_i(b)}{a - b} \le g_i^{+}.$$

(H3) $f_j(0) = 0$ and $g_i(0) = 0$, $j = 1, \dots, m$, $i = 1, \dots, n$.
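Sector-type conditions of this kind are easy to check numerically for concrete activations. For example, tanh (a common choice, used here purely as an illustration) satisfies such a condition with lower bound 0 and upper bound 1:

```python
import numpy as np

# Difference quotients of tanh lie in the sector [0, 1]:
# 0 <= (tanh(a) - tanh(b)) / (a - b) <= 1 for all a != b.
rng = np.random.default_rng(0)
a = rng.uniform(-5.0, 5.0, 1000)
b = rng.uniform(-5.0, 5.0, 1000)
mask = np.abs(a - b) > 1e-3          # avoid numerically tiny denominators
q = (np.tanh(a[mask]) - np.tanh(b[mask])) / (a[mask] - b[mask])
assert np.all(q > 0.0)
assert np.all(q <= 1.0 + 1e-9)       # tolerance for floating-point round-off
```

The point of the paper's weaker hypothesis is that these lower and upper bounds need not be nonnegative: activations whose difference quotients dip below zero are also admissible.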

Rewrite the system (7) in the vector form
$$
\begin{aligned}
\dot{u}(t) &= -A u(t) + B f(v(t)) + \bar{B} f(v(t - \sigma(t))), \\
\dot{v}(t) &= -C v(t) + D g(u(t)) + \bar{D} g(u(t - \tau(t))).
\end{aligned}
$$
The initial conditions associated with (7) are assumed to be $u(s) = \phi(s)$, $v(s) = \psi(s)$, $s \in [-\rho, 0]$, where $u(t) = (u_1(t), \dots, u_n(t))^{T}$, $v(t) = (v_1(t), \dots, v_m(t))^{T}$, $A = \operatorname{diag}\{a_1, \dots, a_n\}$, $C = \operatorname{diag}\{c_1, \dots, c_m\}$, $B = (b_{ji})$, $\bar{B} = (\bar{b}_{ji})$, $D = (d_{ij})$, $\bar{D} = (\bar{d}_{ij})$, $f(v) = (f_1(v_1), \dots, f_m(v_m))^{T}$, and $g(u) = (g_1(u_1), \dots, g_n(u_n))^{T}$.

*Definition 1 (see [14]). *The trivial solution of system (7) is said to be globally exponentially stable if there exist constants $k > 0$ and $\gamma \ge 1$ such that
$$\|u(t)\| + \|v(t)\| \le \gamma e^{-kt} \left( \|\phi\|_{\rho} + \|\psi\|_{\rho} \right), \quad t \ge 0,$$
where one denotes $\|\phi\|_{\rho} = \sup_{-\rho \le s \le 0} \|\phi(s)\|$ and $\|\psi\|_{\rho} = \sup_{-\rho \le s \le 0} \|\psi(s)\|$.
Lemma 2 (see [28]). *If there exist a symmetric positive-definite matrix and arbitrary matrices , and such that
**
then,
*

Lemma 3 (see [25]). *For any real vectors $a$, $b$ and any symmetric positive-definite matrix $Q$ with appropriate dimensions, it follows that
$$2 a^{T} b \le a^{T} Q a + b^{T} Q^{-1} b.$$*

Lemma 4 (see [25]). *Suppose that (H2) holds; then
*

#### 3. Main Result

In this section, we present a theorem stating conditions that guarantee the global exponential stability of the system (7), obtained by employing Lyapunov stability theory and the linear matrix inequality approach.

Theorem 5. *Under the assumptions (H1)–(H3), for given four diagonal matrices , , , and and positive constants , and , the system (7) is globally exponentially stable with the convergent rate , if there exist positive matrices , , , positive diagonal matrices , , and positive-definite matrices
**
such that the following LMI holds:
**
where
*

*Proof. *Choose the Lyapunov-Krasovskii function candidate for the system (7) to be
where
The derivative of along the trajectories of system (7) is given by
By (H2), we have
Substituting (21) into (20), we obtain
By (H2), we have
By (24), we conclude that
By Lemma 2, we obtain
Substituting (27) into (26), we have
From (22), (25), and (28) we obtain
where is defined as in (16), and . Since the matrix given in Theorem 5 is negative definite, we have , for all , which implies that . From the definition of in (20), we obtain
where , , and .

It follows from Lemma 3 that
Substituting (31) into (30), we obtain the bound of as follows:
Thus,
where
On the other hand, we have
Therefore,
where . Therefore, the system (7) is globally exponentially stable with the convergent rate . This completes the proof.

*Remark 6. *In hypothesis (H2), lower bounds and upper bounds , , , , of activation functions are allowed to be either positive, negative, or zero. Clearly, hypothesis (H2) in our paper is more general than those given in [14, 15, 18, 21, 22, 24–27]. Hence, our result is less conservative than some existing results given in the literature.

#### 4. Numerical Examples

*Example 1*. Consider the BAM neural networks in (7) with , ,

In this example, the activation function and time delay are given as follows: , , , . It follows that , , , and . The assumption (H2) is satisfied with , , . Let . By using the LMI Toolbox in MATLAB, the LMI (16) of Theorem 5 is feasible with and a set of solutions of (16) is given by Thus, the system (7) is -exponentially stable and the value . The solution of the closed-loop system satisfies

By applying Theorem 5 and solving the LMI (16) with the MATLAB LMI Toolbox, we obtain the maximum convergence rate for which global exponential stability is guaranteed. Table 1 compares the maximum allowable convergence rate obtained by Theorem 5 with those obtained by other methods in previous existing results. From Table 1, it is seen that the proposed global exponential stability criterion is less conservative than those obtained in [24, 25, 29].
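Independently of which solver produced the candidate matrices, a returned LMI solution can be double-checked by assembling the block matrix and confirming it is negative definite. A sketch with illustrative 2×2 blocks (made-up numbers, not the solution values of this example):

```python
import numpy as np

def is_negative_definite(M, tol=1e-9):
    """Check M < 0 via the largest eigenvalue of its symmetric part."""
    M = 0.5 * (M + M.T)
    return np.max(np.linalg.eigvalsh(M)) < -tol

# Toy block LMI  [[-P, R], [R^T, -S]] < 0  with illustrative numbers;
# real criteria assemble many more blocks, but the check is the same.
P = np.array([[3.0, 0.2], [0.2, 2.5]])
S = np.array([[2.0, -0.1], [-0.1, 1.8]])
R = 0.1 * np.ones((2, 2))
Omega = np.block([[-P, R], [R.T, -S]])
assert is_negative_definite(Omega)
```

Such a post-check guards against solver tolerances returning a marginally infeasible point as "feasible".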

*Example 2*. Consider the BAM neural networks in (7) with