Research Article | Open Access

# Uniform Stability Analysis of Fractional-Order BAM Neural Networks with Delays in the Leakage Terms

**Academic Editor:** Xiao He

#### Abstract

A class of fractional-order BAM neural networks with delays in the leakage terms is considered. By using inequality techniques and analysis methods, several delay-dependent sufficient conditions are established to ensure the uniform stability of such networks. Moreover, sufficient conditions guaranteeing the existence, uniqueness, and stability of the equilibrium point are also obtained. In addition, three simulation examples are given to demonstrate the effectiveness of the obtained results.

#### 1. Introduction

The bidirectional associative memory (BAM) neural network models, first proposed and studied by Kosko [1], have been widely applied in various engineering and scientific fields such as pattern recognition, signal and image processing, artificial intelligence, and combinatorial optimization [2]. In such applications, it is of prime importance to ensure that the designed neural networks are stable [3].

In hardware implementations of a neural network using analog electronic circuits, time delays are inevitable and occur in the signal transmission among the neurons [4]; such delays affect the stability of the neural system and may lead to complex dynamic behaviors such as oscillation, divergence, chaos, and instability, or to other poor performances of the neural networks [5]. Therefore, the study of stability for delayed neural networks is of both theoretical and practical importance. In the past few decades, a considerable number of sufficient conditions on the existence, uniqueness, and stability of the equilibrium point for delayed BAM neural networks have been reported under various assumptions; see, for example, [2–17] and the references therein.

In recent years, as the theory and applications of fractional differential equations have gradually developed [18–20], efforts have been made to study the complex dynamics of fractional-order neural networks. In [21], the authors first introduced a new class of cellular neural networks with fractional order. The peculiarity of the new cellular neural network model consisted in replacing the traditional first-order cell with a noninteger-order one. The introduction of fractional-order cells, with a suitable choice of the coupling parameters, led to the onset of chaos in a two-cell system of total order less than three. A theoretical approach, based on the interaction between equilibrium points and limit cycles, was used to discover chaotic motions in fractional cellular neural networks. In [22], the authors investigated the existence of chaos by using harmonic balance theory. A circuit realization of the proposed fractional two-cell chaotic cellular neural networks was reported, and the corresponding strange attractor was also shown. In [23], the authors presented an algorithm for the numerical solution of fractional differential equations and investigated chaos control and synchronization in a fractional neuron network system. In [24], the authors proposed a fractional-order Hopfield neural network and investigated its stability by using an energy function. In [25], a new type of stability and synchronization, -exponential stability and -synchronization, was investigated for a class of fractional-order neural networks. Several criteria were derived for this kind of stability of the addressed networks by handling a new fractional-order differential inequality. In [26], chaos and hyperchaos for fractional-order cellular neural networks were investigated by means of numerical simulations. The existence of chaotic and hyperchaotic attractors was verified with the related Lyapunov exponent spectrum, bifurcation diagram, and phase portraits.
In [27], the authors investigated stability, multistability, bifurcations, and chaos for fractional-order Hopfield neural networks. In [28], the synchronization problem was studied for a class of fractional-order chaotic neural networks. By using the Mittag-Leffler function, the M-matrix, and linear feedback control, a sufficient condition was developed to ensure the synchronization of such neural models with Caputo fractional derivatives. In [29], a class of fractional-order neural networks with delay was considered, and a sufficient condition was established for the uniform stability of such networks; moreover, the existence, uniqueness, and stability of the equilibrium point were also proved. In [30], the authors introduced memristor-based fractional-order neural networks and established conditions on the global Mittag-Leffler stability and synchronization by using the Lyapunov method. In [31], the authors investigated the finite-time stability of Caputo fractional neural networks with distributed delay and established a delay-dependent stability criterion by using the theory of fractional calculus and a generalized Gronwall-Bellman inequality approach. In [32], global projective synchronization for fractional-order neural networks was investigated; based on some preparation and analysis techniques, several criteria were obtained to realize projective synchronization of fractional-order neural networks by combining open-loop control and adaptive control.

Recently, some authors have considered the uniform stability of delayed neural networks; see, for example, [33–36] and the references therein. In [33], the local uniform stability of competitive neural networks with different time scales under vanishing perturbations was investigated, and several stability conditions were established based on Gershgorin's theorem. In [34], the authors considered the uniform asymptotic stability and global asymptotic stability of the equilibrium point for time-delay Hopfield neural networks. Several criteria were derived by using the Lyapunov functional method and the linear matrix inequality approach for estimating the upper bound of the derivative of the Lyapunov functional. In [35], the authors first showed the uniform stability and the existence and uniqueness of the equilibrium point of fractional-order complex-valued neural networks with time delays. In [36], the authors discussed the existence and global uniform asymptotic stability of almost periodic solutions for cellular neural networks; by utilizing the theory of almost periodic differential equations and the Lyapunov functional method, some sufficient conditions were obtained to ensure the existence and global uniform asymptotic stability. To the best of our knowledge, however, there are few results on the uniform stability analysis of fractional-order BAM neural networks.

Motivated by the above discussions, the objective of this paper is to study the uniform stability of fractional-order BAM neural networks with delays in the leakage terms. In order to demonstrate the stability of the proposed model, a novel norm, which can be found in [29, 35], will be introduced, and several delay-dependent sufficient conditions ensuring the uniform stability of the model will be established. When it comes to the proof of the existence, uniqueness, and stability of the equilibrium point of the proposed model, we will utilize the common norm for convenience.

The rest of the paper is structured as follows. In Section 2, we present the proposed model and recall some necessary definitions and lemmas. In Section 3, a sufficient criterion ensuring the uniform stability of such neural networks is presented, and the existence and uniqueness of the equilibrium of the model are also demonstrated. Three numerical examples are presented in Section 4 to demonstrate the effectiveness of our theoretical results. Finally, the paper is concluded in Section 5.

#### 2. Model Description and Preliminaries

In this paper, we consider the following fractional-order BAM neural networks with delays in the leakage terms:
$$D^{\alpha}x_i(t) = -a_i x_i(t-\sigma) + \sum_{j=1}^{m} c_{ij} f_j\bigl(y_j(t)\bigr) + \sum_{j=1}^{m} d_{ij} g_j\bigl(y_j(t-\tau)\bigr) + I_i,$$
$$D^{\alpha}y_j(t) = -b_j y_j(t-\sigma) + \sum_{i=1}^{n} p_{ji} h_i\bigl(x_i(t)\bigr) + \sum_{i=1}^{n} q_{ji} k_i\bigl(x_i(t-\tau)\bigr) + J_j, \tag{1}$$
or in the vector form
$$D^{\alpha}x(t) = -Ax(t-\sigma) + Cf\bigl(y(t)\bigr) + Dg\bigl(y(t-\tau)\bigr) + I,$$
$$D^{\alpha}y(t) = -By(t-\sigma) + Ph\bigl(x(t)\bigr) + Qk\bigl(x(t-\tau)\bigr) + J, \tag{2}$$
where $D^{\alpha}$ denotes the Caputo fractional derivative of order $\alpha$, $0<\alpha<1$; $t\ge 0$, $i=1,2,\ldots,n$, and $j=1,2,\ldots,m$; $x_i(t)$ and $y_j(t)$ are the states of the $i$th neuron from the neural field $F_X$ and the $j$th neuron from the neural field $F_Y$ at time $t$, respectively; $f_j$ and $g_j$ denote the activation functions of the $j$th neuron from the neural field $F_Y$, and $h_i$ and $k_i$ denote the activation functions of the $i$th neuron from the neural field $F_X$; $I_i$ and $J_j$ are constants, which denote the external inputs on the $i$th neuron from $F_X$ and the $j$th neuron from $F_Y$, respectively; the positive constants $a_i$ and $b_j$ denote the rates with which the $i$th neuron from the neural field $F_X$ and the $j$th neuron from the neural field $F_Y$ will reset their potential to the resting state in isolation when disconnected from the networks and external inputs, respectively; the constants $c_{ij}$, $d_{ij}$, $p_{ji}$, and $q_{ji}$ denote the connection strengths; the nonnegative constants $\sigma$ and $\tau$ denote the leakage delay and the transmission delay, respectively; $A=\operatorname{diag}(a_1,\ldots,a_n)$ and $B=\operatorname{diag}(b_1,\ldots,b_m)$ are diagonal matrices; $C=(c_{ij})_{n\times m}$, $D=(d_{ij})_{n\times m}$, $P=(p_{ji})_{m\times n}$, and $Q=(q_{ji})_{m\times n}$ are the connection weight matrices; and $I=(I_1,\ldots,I_n)^{T}$ and $J=(J_1,\ldots,J_m)^{T}$ are the external inputs.

The initial conditions associated with system (1) are of the form
$$x_i(s)=\phi_i(s),\qquad y_j(s)=\psi_j(s),\qquad s\in[-\theta,0], \tag{3}$$
where $\theta=\max\{\sigma,\tau\}$, and it is usually assumed that $\phi_i(s)$ and $\psi_j(s)$ are continuous, that is, $\phi_i\in C([-\theta,0],\mathbb{R})$ and $\psi_j\in C([-\theta,0],\mathbb{R})$, with the norms given by $\|\phi\|=\sup_{s\in[-\theta,0]}\sum_{i=1}^{n}|\phi_i(s)|$ and $\|\psi\|=\sup_{s\in[-\theta,0]}\sum_{j=1}^{m}|\psi_j(s)|$.
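As a numerical illustration of how a system of this form can be simulated, the sketch below integrates a scalar (one neuron per field) fractional-order BAM pair with a leakage delay in the self-decay terms and a transmission delay. Everything here is an assumption of the sketch, not taken from the paper: the tanh activations, all parameter values, the constant histories, and the explicit Grunwald-Letnikov discretization of the Caputo derivative.

```python
# Minimal sketch: scalar fractional-order BAM pair with leakage delay sigma and
# transmission delay tau, integrated with an explicit Grunwald-Letnikov scheme
# for the Caputo derivative (0 < alpha < 1).  All parameters are assumed values.
import numpy as np

alpha, h, T = 0.9, 0.01, 20.0              # fractional order, step size, horizon
a, b = 1.0, 1.0                            # leakage (self-decay) rates
c, d, p, q = 0.1, 0.1, 0.1, 0.1            # connection strengths (assumed)
I, J = 0.5, -0.5                           # external inputs (assumed)
sigma, tau = 0.1, 0.2                      # leakage delay, transmission delay
N = round(T / h)
ks, kt = round(sigma / h), round(tau / h)  # delays measured in grid steps

# Grunwald-Letnikov coefficients w_k = (-1)^k * binom(alpha, k), via recursion.
w = np.empty(N + 1)
w[0] = 1.0
for k in range(1, N + 1):
    w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)

x = np.empty(N + 1)
y = np.empty(N + 1)
x[0], y[0] = 0.3, -0.2                     # constant histories (assumed)

def lag(z, j, kdelay):
    """State kdelay grid steps before index j; constant history before t = 0."""
    return z[j - kdelay] if j - kdelay >= 0 else z[0]

for j in range(1, N + 1):
    # Right-hand sides evaluated at the previous grid point (explicit step).
    fx = (-a * lag(x, j - 1, ks) + c * np.tanh(y[j - 1])
          + d * np.tanh(lag(y, j - 1, kt)) + I)
    fy = (-b * lag(y, j - 1, ks) + p * np.tanh(x[j - 1])
          + q * np.tanh(lag(x, j - 1, kt)) + J)
    # Caputo via GL: x_j = x_0 + h^alpha * fx - sum_{k>=1} w_k * (x_{j-k} - x_0).
    mem_x = np.dot(w[1:j + 1], x[j - 1::-1] - x[0])
    mem_y = np.dot(w[1:j + 1], y[j - 1::-1] - y[0])
    x[j] = x[0] + h**alpha * fx - mem_x
    y[j] = y[0] + h**alpha * fy - mem_y

print(x[-1], y[-1])   # both trajectories settle near an equilibrium
```

With these small connection strengths the trajectories approach an equilibrium; the same sketch can be rerun with larger weights or delays to observe how stability is lost.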

Throughout this paper, we make the following assumption.

(H) The activation functions $f_j$, $g_j$, $h_i$, and $k_i$ are Lipschitz continuous; that is, there exist positive constants $F_j$, $G_j$, $H_i$, and $K_i$ such that
$$|f_j(u)-f_j(v)|\le F_j|u-v|,\qquad |g_j(u)-g_j(v)|\le G_j|u-v|,$$
$$|h_i(u)-h_i(v)|\le H_i|u-v|,\qquad |k_i(u)-k_i(v)|\le K_i|u-v|$$
for any $u,v\in\mathbb{R}$, $i=1,2,\ldots,n$, and $j=1,2,\ldots,m$.
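For instance, the commonly used activation tanh satisfies assumption (H) with Lipschitz constant 1, since its derivative is bounded by tanh'(0) = 1. The choice of tanh is an assumption of this sketch, which simply spot-checks the bound numerically:

```python
# Sanity check (illustrative): |tanh(u) - tanh(v)| <= 1 * |u - v| on random pairs.
import math
import random

random.seed(0)
L = 1.0   # candidate Lipschitz constant for tanh
ok = all(
    abs(math.tanh(u) - math.tanh(v)) <= L * abs(u - v) + 1e-12
    for u, v in ((random.uniform(-5, 5), random.uniform(-5, 5))
                 for _ in range(10_000))
)
print(ok)   # → True
```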

To prove our results, the following definitions and lemma are necessary.

*Definition 1 (see [18]).* The Riemann-Liouville fractional integral with fractional order $\alpha>0$ of a function $f(t)$ is defined as follows:
$$I^{\alpha}f(t)=\frac{1}{\Gamma(\alpha)}\int_{0}^{t}(t-s)^{\alpha-1}f(s)\,ds,$$
where $\Gamma(\cdot)$ is the Gamma function and $\Gamma(\alpha)=\int_{0}^{\infty}t^{\alpha-1}e^{-t}\,dt$.
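The definition can be checked numerically: for $f(t)=1$ the Riemann-Liouville integral of order $\alpha$ evaluates in closed form to $t^{\alpha}/\Gamma(\alpha+1)$. The sketch below uses a generic midpoint quadrature, a choice of this sketch rather than a scheme from the paper:

```python
# Midpoint-rule check of the Riemann-Liouville integral against the closed form
# I^alpha 1 = t^alpha / Gamma(alpha + 1).
import math

def rl_integral(f, t, alpha, n=100_000):
    """Midpoint approximation of (1/Gamma(alpha)) * int_0^t (t-s)^(alpha-1) f(s) ds."""
    h = t / n
    s = sum(f((k + 0.5) * h) * (t - (k + 0.5) * h) ** (alpha - 1) for k in range(n))
    return h * s / math.gamma(alpha)

alpha, t = 0.5, 1.0
approx = rl_integral(lambda u: 1.0, t, alpha)
exact = t ** alpha / math.gamma(alpha + 1)
print(approx, exact)   # the two values agree closely
```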

*Definition 2 (see [18]).* The Caputo fractional derivative of fractional order $\alpha>0$ of a function $f(t)$ is defined as follows:
$$D^{\alpha}f(t)=\frac{1}{\Gamma(n-\alpha)}\int_{0}^{t}(t-s)^{n-\alpha-1}f^{(n)}(s)\,ds,$$
where $n$ is the first integer greater than $\alpha$; that is, $n-1<\alpha\le n$.

Particularly, when $0<\alpha<1$,
$$D^{\alpha}f(t)=\frac{1}{\Gamma(1-\alpha)}\int_{0}^{t}(t-s)^{-\alpha}f'(s)\,ds.$$
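This particular case can also be verified numerically. For $f(t)=t$ the Caputo derivative of order $\alpha\in(0,1)$ is $t^{1-\alpha}/\Gamma(2-\alpha)$, and the widely used L1 quadrature (a choice of this sketch, not a scheme from the paper) reproduces it:

```python
# L1-scheme check of the Caputo derivative for 0 < alpha < 1 against the
# closed form D^alpha t = t^(1-alpha) / Gamma(2-alpha).
import math

def caputo_l1(f, t, alpha, n=1000):
    """L1 approximation of the Caputo derivative of order alpha at time t."""
    h = t / n
    # b_k = (k+1)^(1-alpha) - k^(1-alpha) are the L1 weights.
    b = [(k + 1) ** (1 - alpha) - k ** (1 - alpha) for k in range(n)]
    s = sum(b[k] * (f(t - k * h) - f(t - (k + 1) * h)) for k in range(n))
    return s / (math.gamma(2 - alpha) * h ** alpha)

alpha, t = 0.5, 2.0
approx = caputo_l1(lambda u: u, t, alpha)
exact = t ** (1 - alpha) / math.gamma(2 - alpha)
print(approx, exact)   # the two values agree
```

For $f(t)=t$ the L1 sum telescopes, so the approximation matches the closed form up to rounding error.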

*Definition 3.* The solution of system (1) is said to be uniformly stable if, for any $\varepsilon>0$, there exists $\delta=\delta(\varepsilon)>0$ such that, for any two solutions $z(t)$, $\bar{z}(t)$ of system (1) with initial functions $\varphi$, $\bar{\varphi}$, respectively, it holds that
$$\|z(t)-\bar{z}(t)\|\le\varepsilon$$
for all $t\ge 0$, whenever
$$\|\varphi-\bar{\varphi}\|\le\delta,$$
where $\|\cdot\|$ denotes the norm adopted from [29, 35].

Lemma 4 (see [20]). *Let $n$ be a positive integer such that $n-1<\alpha\le n$; if $f(t)\in C^{n}[0,+\infty)$, then
$$I^{\alpha}D^{\alpha}f(t)=f(t)-\sum_{k=0}^{n-1}\frac{t^{k}}{k!}f^{(k)}(0).$$
In particular, if $0<\alpha\le 1$ and $f(t)\in C^{1}[0,+\infty)$, then
$$I^{\alpha}D^{\alpha}f(t)=f(t)-f(0).$$*

*Remark 5.* It is noted that, when the leakage delay $\sigma=0$, system (1) becomes the following fractional-order BAM neural networks with delay:
$$D^{\alpha}x_i(t) = -a_i x_i(t) + \sum_{j=1}^{m} c_{ij} f_j\bigl(y_j(t)\bigr) + \sum_{j=1}^{m} d_{ij} g_j\bigl(y_j(t-\tau)\bigr) + I_i,$$
$$D^{\alpha}y_j(t) = -b_j y_j(t) + \sum_{i=1}^{n} p_{ji} h_i\bigl(x_i(t)\bigr) + \sum_{i=1}^{n} q_{ji} k_i\bigl(x_i(t-\tau)\bigr) + J_j, \tag{13}$$
with initial conditions
$$x_i(s)=\phi_i(s),\qquad y_j(s)=\psi_j(s),\qquad s\in[-\tau,0],$$
for $i=1,2,\ldots,n$, $j=1,2,\ldots,m$.

#### 3. Main Results

Theorem 6. *Under assumption (H), the system (1) is uniformly stable, if , , and hold, where
*

*Proof. *Assume that and are any two solutions of system (1) with the initial conditions (3); then
From Lemma 4, we can obtain
Further, we have that
for , .

It follows from assumption (H) and inequality (18) that

From (20), we can get
which implies
where , .

Similarly, it follows from assumption (H) and inequality (19) that
where , .

Substituting (23) into (22), we can obtain
By using condition , (24) implies
And, substituting (22) into (23), we can get
By using condition , (26) implies
If we take
where
From (25), we can obtain
If we take
where
from (27), we can get

From (30) and (33), we conclude that, for any , there exists a constant , , or such that , , when , , which means that the solution of system (1) is uniformly stable. The proof is completed.

Theorem 7. *Under assumption (H) and the conditions of Theorem 6, the system (1) has a unique equilibrium point, which is uniformly stable if and hold, where
*

*Proof. *Let , , and construct a mapping defined by
where
for , .

Now, we will show that is a contraction mapping on . In fact, for any two different points and , we have
By using conditions and , (37) implies
which implies that is a contraction mapping on . Hence, there exists a unique fixed point such that ; that is,
for , . That is,
for , , which implies that is an equilibrium point of system (1). Moreover, it follows from Theorem 6 that is uniformly stable. The proof is completed.
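The contraction-mapping argument above also suggests a simple way to compute the equilibrium in practice: iterate the fixed-point map. The sketch below does this for a scalar BAM pair with tanh activations and small, made-up connection weights (all values are assumptions of this sketch); the small weights make the map a contraction, so the iteration converges to the unique equilibrium.

```python
# Fixed-point iteration for the equilibrium of a scalar BAM pair (illustrative):
#   a x* = c f(y*) + d g(y*) + I,   b y* = p h(x*) + q k(x*) + J.
import math

a, b = 1.0, 1.0
c, d, p, q = 0.1, 0.1, 0.1, 0.1   # small weights make the map a contraction
I, J = 0.5, -0.5
f = g = h = k = math.tanh          # assumed activations

x, y = 0.0, 0.0
for _ in range(100):
    # Simultaneous (Jacobi-style) update of the fixed-point map.
    x, y = (c * f(y) + d * g(y) + I) / a, (p * h(x) + q * k(x) + J) / b

# Residuals of the equilibrium equations vanish at the fixed point.
print(abs(a * x - c * f(y) - d * g(y) - I),
      abs(b * y - p * h(x) - q * k(x) - J))
```

Because the map contracts distances by roughly a factor of 0.2 per sweep here, a hundred iterations drive the residuals to machine precision.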

Corollary 8. *Under assumption (H), the system (13) is uniformly stable, if , , and hold, where
*

*Proof. *The proof is similar to that of Theorem 6 and is therefore omitted.

Corollary 9. *Under assumption (H) and the conditions of Corollary 8, the system (13) has a unique equilibrium point, which is uniformly stable if and hold, where
*

*Proof. *The proof is similar to that of Theorem 7 and is therefore omitted.

*Remark 10. *In [25], the authors investigated -stability and -synchronization of fractional-order neural networks without delays. In [28], the authors introduced a class of fractional-order chaotic neural networks without delays and discussed the synchronization of such networks. In [29], the authors took a constant delay into account and carried out a dynamic analysis of a class of fractional-order neural networks with constant delay. In [35], the authors investigated the uniform stability of fractional-order complex-valued neural networks with constant delay. Different from these previous works, here we have addressed the stability analysis of fractional-order BAM neural networks with delays in the leakage terms.

*Remark 11. *In [29, 35], several delay-independent stability conditions were given for fractional-order neural networks with constant delay. In this paper, a delay-dependent stability condition is provided. It is known that delay-dependent conditions are usually less conservative than delay-independent ones, especially when the delay is small [10]. In addition, the positive constants () in the model of [29] were required to satisfy . However, the results obtained in this paper show that, when the leakage delay , the positive constants () and () can still be admissible, and the simulation examples in the next section verify the validity of our results.

#### 4. Examples

*Example 1. *Consider the following fractional-order BAM neural networks with delays in the leakage terms:
where , , , , ,

By calculation, , , , , , and , which satisfy ; according to Theorem 6, when we select appropriate initial values, the system (43) can achieve uniform stability. Furthermore, we have = and = ; by utilizing Theorem 7, we obtain that the system (43) has a unique equilibrium point which is uniformly stable.

In order to check the validity of Theorems 6 and 7, the following five cases are given: case 1 with the initial values , case 2 with the initial values = , case 3 with the initial values , case 4 with the initial values = , and case 5 with the initial values