## Fractional Calculus and its Applications in Applied Mathematics and other Sciences


# Robust Stability Analysis of Fractional-Order Hopfield Neural Networks with Parameter Uncertainties

**Academic Editor:** Abdon Atangana

#### Abstract

The issue of robust stability for fractional-order Hopfield neural networks with parameter uncertainties is investigated in this paper. For such a neural system, the existence, uniqueness, and global Mittag-Leffler stability of its equilibrium point are analyzed by employing suitable Lyapunov functionals. Based on the fractional-order Lyapunov direct method, sufficient conditions are proposed for the robust stability of the studied networks. Moreover, robust synchronization and quasi-synchronization between networks of this class are discussed. Furthermore, some numerical examples are given to show the effectiveness of the obtained theoretical results.

#### 1. Introduction

During the last decades, neural networks have received increasing attention in various fields, such as image and signal processing, associative memory, pattern recognition, optimization, control, and modelling. The Hopfield neural model is one of the most popular neural models in the literature. Thus, for integer-order Hopfield neural networks, it is important to study their dynamical properties; in particular, stability analysis is a prime issue for the practical design and application of neural networks. In recent years, some sufficient conditions for the global asymptotic and exponential stability of integer-order Hopfield neural networks have been proposed [1–4]. However, the above studies did not consider the influence of parameter uncertainties, which are unavoidable due to measurement errors, parameter fluctuation, external disturbance, and so forth. For example, in the electronic implementation of neural networks, some essential parameters such as the release rate of neurons, the connection weights between the neurons, and the transmission delays might be subject to some deviations owing to the tolerances of electronic components. Therefore, it is necessary to ensure that the system is stable with respect to these uncertainties in the design and applications of neural networks. In other words, the designed neural network must be robust against such uncertainties. Nowadays, many researchers have studied the existence, uniqueness, and global robust asymptotic stability of the equilibrium point in integer-order neural networks with parameter uncertainties and given some robust stability conditions [5–12]. To establish the robust stability criteria, Lyapunov functionals and linear matrix inequalities (LMIs) are two effective tools, utilized in [5–8] and [9–12], respectively.

It is well known that chaos synchronization has attracted considerable attention due to its great potential applications in secure communication, biological science, engineering, and so on. It is worth noting that integer-order neural networks can exhibit complicated dynamics and even chaotic behavior if the parameters are appropriately chosen [13, 14]. Thus, synchronization of integer-order chaotic neural networks has been a much-discussed topic, and robust synchronization of integer-order chaotic neural networks with parameter uncertainties has also attracted considerable interest [15, 16]. Similar to the robust stability of integer-order neural networks, Lyapunov functionals and LMIs are two commonly used tools to analyze robust synchronization between two integer-order chaotic neural networks with parameter uncertainties. However, when the parameter uncertainties of the two integer-order chaotic neural networks are different, parameter mismatches appear unavoidably. In this case, the zero equilibrium point of the error system may not exist, and it is impossible to achieve complete synchronization. However, many researchers found that the synchronization error converges to a small region around zero; this phenomenon, called quasi-synchronization or weak synchronization, has gained much research attention [17–19].

All the above studies investigated only integer-order neural networks. In fact, as a generalization of integer-order differentiation and integration to arbitrary noninteger order, fractional-order calculus appears in both theoretical and applied aspects of numerous branches of science and engineering. In particular, fractional derivatives describe some systems more accurately than integer-order models, such as viscoelastic materials, electrochemical processes, long lines, dielectric polarisations, coloured noise, and cardiac behaviour [20–23]. Therefore, scholars have shown growing interest in building fractional-order Hopfield neural models and studying their characteristics, which show higher nonlinearity and more degrees of freedom than integer-order models [24, 25]. For the stability analysis of fractional-order neural networks, the linear stability theory of fractional-order systems is the common method [26]. Moreover, in [26], Kaslik and Sivasundaram discussed the bifurcations and chaos of nonlinear fractional-order Hopfield neural networks, which is very meaningful for synchronization between fractional-order Hopfield neural networks. Sabatier et al. [27] proposed LMI stability conditions for fractional-order systems, which provide an approach to solving robust stability and synchronization issues of fractional-order systems with parameter uncertainties [28–31]. Liao et al. [28] studied robust stability for fractional-order linear time-invariant (FO-LTI) interval systems with parameter uncertainties. Wong et al. [31] investigated robust synchronization of fractional-order complex dynamical networks with parameter uncertainties. However, no research has discussed robust stability and synchronization for fractional-order Hopfield neural networks with parameter uncertainties.
Besides, the above results are all in the form of LMIs, and no scholar has employed the Lyapunov functional method, which requires less calculation than LMIs. Therefore, we try to deal with these problems and employ Lyapunov functionals to analyze the robust stability and synchronization of fractional-order Hopfield neural networks with parameter uncertainties.

It must be pointed out that the Lyapunov stability theorem of integer-order systems could not be used for fractional-order systems until the fractional-order Lyapunov direct method was presented in [32]. The (generalized) Mittag-Leffler stability of a fractional-order system can be proved if the conditions of the fractional-order Lyapunov direct method are satisfied. So, based on the fractional-order Lyapunov direct method, robust stability and synchronization for fractional-order systems can be analyzed by employing suitable Lyapunov functionals. Although, for a multidimensional nonlinear fractional-order system, it is difficult to design a Lyapunov functional satisfying the conditions of the fractional-order Lyapunov direct method, one is chosen and its suitability is proved in this paper.

Motivated by the above discussion, robust stability and synchronization for fractional-order Hopfield neural networks with parameter uncertainties are studied. First, the global robust stability of such a neural system is analyzed via a Lyapunov functional, which requires less calculation than the LMI approaches of earlier studies [28–31]. Besides, the Lyapunov functional is also employed to realize robust synchronization between two such neural systems with the same parameter uncertainties. Moreover, quasi-synchronization for this class of neural networks with different parameter uncertainties is investigated by a special Lyapunov functional, which is much simpler than the Laplace transform method used in [33]. In addition, numerical simulations are presented to show their good agreement with the theoretical results.

The rest of the paper is organized as follows. In Section 2, some preliminaries are given, including the Caputo fractional-order derivative, the Mittag-Leffler function, and the Mittag-Leffler stability of fractional-order systems. Then, robust stability for fractional-order Hopfield neural networks with parameter uncertainties is proposed in Section 3. Robust synchronization and quasi-synchronization for fractional-order Hopfield neural networks with parameter uncertainties are presented in Section 4. Some examples are given in Section 5 to verify the theoretical results. Finally, the paper is concluded in Section 6.

#### 2. Preliminaries

##### 2.1. Caputo Fractional-Order Derivative

As an important tool in nonlinear science, fractional-order calculus has three common definitions: the Grünwald–Letnikov, Riemann–Liouville, and Caputo definitions [34]. In particular, the Caputo fractional-order derivative uses the same initial conditions as integer-order derivatives, which are well understood in physical situations and more applicable to real-world problems. Thus, we use the Caputo fractional-order derivative in this paper.

*Definition 1 (Caputo fractional-order derivative). *The Caputo fractional-order derivative of order $\alpha$ for a function $f(t)$ is defined as
$$ {}_{t_0}D_t^{\alpha}f(t)=\frac{1}{\Gamma(m-\alpha)}\int_{t_0}^{t}\frac{f^{(m)}(\tau)}{(t-\tau)^{\alpha-m+1}}\,d\tau, $$
where $\Gamma(\cdot)$ denotes the Gamma function and $m$ is a positive integer such that $m-1<\alpha<m$.
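For intuition, the Caputo derivative of order $0<\alpha<1$ can be approximated numerically with the standard L1 finite-difference scheme. The sketch below (function and parameter names are our own, not from the paper) is exact for linear $f$, which gives a convenient check against the known result ${}_{0}D_t^{\alpha}t=t^{1-\alpha}/\Gamma(2-\alpha)$.

```python
import math

def caputo_l1(f, T, alpha, N):
    """Approximate the Caputo derivative of order alpha (0 < alpha < 1)
    of f at t = T using the L1 finite-difference scheme with N steps."""
    h = T / N
    total = 0.0
    for k in range(N):
        # L1 weight b_k = (k+1)^(1-alpha) - k^(1-alpha)
        b = (k + 1) ** (1 - alpha) - k ** (1 - alpha)
        # first-order differences of f, newest increment first
        total += b * (f((N - k) * h) - f((N - k - 1) * h))
    return total * h ** (-alpha) / math.gamma(2 - alpha)

# Caputo derivative of f(t) = t at t = 1 with alpha = 0.5;
# the exact value is t^(1-alpha)/Gamma(2-alpha) = 1/Gamma(1.5)
approx = caputo_l1(lambda t: t, T=1.0, alpha=0.5, N=100)
```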

The Laplace transform of the Caputo fractional-order derivative is
$$ \mathcal{L}\{{}_{0}D_t^{\alpha}f(t)\}=s^{\alpha}F(s)-\sum_{k=0}^{m-1}s^{\alpha-k-1}f^{(k)}(0), $$
where $\mathcal{L}$ denotes the Laplace transform and $s$ is the variable in the Laplace domain.

*Property 1. *When $C$ is any constant, ${}_{t_0}D_t^{\alpha}C=0$ holds.

*Property 2. *For constants $\mu$ and $\nu$, the linearity of the Caputo fractional-order derivative is described by
$$ {}_{t_0}D_t^{\alpha}\big(\mu f(t)+\nu g(t)\big)=\mu\,{}_{t_0}D_t^{\alpha}f(t)+\nu\,{}_{t_0}D_t^{\alpha}g(t). $$

##### 2.2. Mittag-Leffler Function

The Mittag-Leffler function frequently appears in the solutions of fractional-order differential equations, playing a role similar to that of the exponential function in the solutions of integer-order differential equations.

*Definition 2 (see [34]). *The Mittag-Leffler function with two parameters is defined as
$$ E_{\alpha,\beta}(z)=\sum_{k=0}^{\infty}\frac{z^{k}}{\Gamma(k\alpha+\beta)}, $$
where $\alpha>0$, $\beta>0$, and $z\in\mathbb{C}$. When $\beta=1$, its one-parameter form is shown as
$$ E_{\alpha}(z)=\sum_{k=0}^{\infty}\frac{z^{k}}{\Gamma(k\alpha+1)}. $$
In particular, $E_{1}(z)=e^{z}$. The Laplace transform of the Mittag-Leffler function with two parameters is
$$ \mathcal{L}\{t^{\beta-1}E_{\alpha,\beta}(-\lambda t^{\alpha})\}=\frac{s^{\alpha-\beta}}{s^{\alpha}+\lambda}, $$
where $t\ge 0$, $s$ is the variable in the Laplace domain, $\operatorname{Re}(s)$ is the real part of $s$, and $\operatorname{Re}(s)>|\lambda|^{1/\alpha}$.
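Since the two-parameter series converges for all finite $z$, $E_{\alpha,\beta}(z)$ can be evaluated by direct truncated summation for moderate $|z|$. The helper below is our own minimal sketch; it can be sanity-checked against the identities $E_{1,1}(z)=e^{z}$ and $E_{2,1}(-z^{2})=\cos z$.

```python
import math

def mittag_leffler(z, alpha, beta=1.0, tol=1e-15):
    """Two-parameter Mittag-Leffler function E_{alpha,beta}(z)
    by direct series summation (adequate for moderate |z|)."""
    s, k = 0.0, 0
    while alpha * k + beta < 170.0:       # stay inside math.gamma's range
        term = z ** k / math.gamma(alpha * k + beta)
        s += term
        if k > 0 and abs(term) < tol:     # series has converged
            break
        k += 1
    return s

# mittag_leffler(1.0, 1.0) recovers e; mittag_leffler(-1.0, 2.0) recovers cos(1)
```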

##### 2.3. Mittag-Leffler Stability of Fractional-Order System

Consider the following $n$-dimensional Caputo fractional-order system:
$$ {}_{t_0}D_t^{\alpha}x(t)=f(t,x(t)),\qquad x(t_0)=x_{t_0}, \tag{7} $$
where $\alpha\in(0,1)$, $x(t)\in\mathbb{R}^{n}$, $t_0$ is the initial time, and $f(t,x)$ is piecewise continuous in $t$ on $[t_0,\infty)$ and satisfies the locally Lipschitz condition in $x$.

*Definition 3. *The constant $x^{*}$ is an equilibrium point of the Caputo fractional-order dynamic system (7) if and only if $f(t,x^{*})=0$.

*Remark 4. *Based on Properties 1 and 2, any equilibrium point can be translated to the origin via a change of variables. When the equilibrium point in (7) is $x^{*}\neq 0$, system (7) with the change of variable $y(t)=x(t)-x^{*}$ can be rewritten as
$$ {}_{t_0}D_t^{\alpha}y(t)=g(t,y(t)), $$
where $g(t,y)=f(t,y+x^{*})$, and the new system has an equilibrium point at the origin for the new variable $y(t)$. Thus, without loss of generality, all definitions and theorems are given for the case when the equilibrium point is the origin; that is, $x^{*}=0$.

Lemma 5 (existence and uniqueness theorem in [35]). *There exists a unique solution of system (7) if system (7) has an equilibrium point at the origin and $f(t,x)$ satisfies the locally Lipschitz condition in $x$.*

*Definition 6 (Mittag-Leffler stability [32]). *If $x^{*}=0$ is an equilibrium point of system (7), the solution of (7) is said to be Mittag-Leffler stable if
$$ \|x(t)\|\le\big[m(x(t_0))E_{\alpha}(-\lambda(t-t_0)^{\alpha})\big]^{b}, $$
where $t_0$ is the initial time, $\alpha\in(0,1)$, $\lambda>0$, $b>0$, $m(0)=0$, $m(x)\ge 0$, $\|\cdot\|$ denotes an arbitrary norm, and $m(x)$ satisfies the locally Lipschitz condition on $x\in\mathbb{B}\subset\mathbb{R}^{n}$ with Lipschitz constant $m_{0}$.

*Definition 7 (generalized Mittag-Leffler stability [32]). *If $x^{*}=0$ is an equilibrium point of system (7), the solution of (7) is said to be generalized Mittag-Leffler stable if
$$ \|x(t)\|\le\big[m(x(t_0))(t-t_0)^{-\gamma}E_{\alpha,1-\gamma}(-\lambda(t-t_0)^{\alpha})\big]^{b}, $$
where $\alpha\in(0,1)$, $-\alpha<\gamma\le 1-\alpha$, $\lambda\ge 0$, $b>0$, $m(0)=0$, $m(x)\ge 0$, and $m(x)$ satisfies the locally Lipschitz condition on $x\in\mathbb{B}\subset\mathbb{R}^{n}$ with Lipschitz constant $m_{0}$.

*Remark 8. *Both Mittag-Leffler stability and generalized Mittag-Leffler stability imply asymptotic stability; that is, $\|x(t)\|\to 0$ as $t\to\infty$.

*Remark 9. *When the equilibrium point $x^{*}\neq 0$, the solution of system (7) is said to be Mittag-Leffler stable (or generalized Mittag-Leffler stable) if
$$ \|x(t)-x^{*}\|\le\big[m(x(t_0)-x^{*})E_{\alpha}(-\lambda(t-t_0)^{\alpha})\big]^{b} $$
and $m$ satisfies $m(0)=0$, $m(x)\ge 0$.

In order to analyze Mittag-Leffler stability of system (7), the fractional-order Lyapunov direct method is introduced as follows.

Lemma 10 (fractional-order Lyapunov direct method [32]). *For $\beta\in(0,1)$, the fractional-order system (7) is Mittag-Leffler stable at the equilibrium point $x^{*}=0$ if there exists a continuously differentiable function $V(t,x(t))$ that satisfies*
$$ \alpha_{1}\|x\|^{a}\le V(t,x(t))\le\alpha_{2}\|x\|^{ab}, \tag{12} $$
$$ {}_{t_0}D_t^{\beta}V(t,x(t))\le-\alpha_{3}\|x\|^{ab}, \tag{13} $$
*where $V(t,x):[t_0,\infty)\times\mathbb{D}\to\mathbb{R}$ satisfies the locally Lipschitz condition in $x$; $\mathbb{D}\subset\mathbb{R}^{n}$ is a domain containing the origin; $t\ge t_0$; and $\alpha_{1}$, $\alpha_{2}$, $\alpha_{3}$, $a$, and $b$ are arbitrary positive constants. If the assumptions hold globally on $\mathbb{R}^{n}$, then $x^{*}=0$ is globally Mittag-Leffler stable.*

*Remark 11. *According to the proof of the fractional-order Lyapunov direct method in [32], the condition (13) in Lemma 10 can be weakened: if the inequality (13) holds almost everywhere, the result of Lemma 10 still holds.

Lemma 12. *If $x(t)\in\mathbb{R}$ denotes a continuously differentiable function, the following inequality holds almost everywhere:*
$$ {}_{t_0}D_t^{\alpha}|x(t)|\le\operatorname{sgn}(x(t))\,{}_{t_0}D_t^{\alpha}x(t). $$

*Proof. *Without loss of generality, the trajectory of can be simply described as the solid line in Figure 1. It is differentiable except at the points and , which are the solutions of . The trajectory is divided into three parts by the points and , and and are the extreme points. Point is the initial datum . The dashed line is the trajectory of , which may coincide with it on any part. denotes the coordinate of the point in Figure 1.

In Part , if , we obtain ; if , then

According to , we get

Using the same method, Parts and have similar results. So we obtain
except at the zero points ( and ).

#### 3. Robust Stability for Fractional-Order Hopfield Neural Networks with Parameter Uncertainties

Consider the following $n$-dimensional Caputo fractional-order Hopfield neural networks:
$$ {}_{t_0}D_t^{\alpha}x_{i}(t)=-(c_{i}+\Delta c_{i}(t))x_{i}(t)+\sum_{j=1}^{n}(a_{ij}+\Delta a_{ij}(t))f_{j}(x_{j}(t))+I_{i},\quad i=1,\ldots,n, \tag{18} $$
where $\alpha\in(0,1)$, $t\ge t_0$, $x(t)=(x_{1}(t),\ldots,x_{n}(t))^{T}\in\mathbb{R}^{n}$, $C=\operatorname{diag}(c_{1},\ldots,c_{n})$, and $A=(a_{ij})_{n\times n}$. For $i,j=1,\ldots,n$, $x_{i}(t)$ is the state of the $i$th unit at time $t$, $f_{j}(\cdot)$ denotes the activation function of the $j$th neuron, $c_{i}>0$ denotes the charging rate for the $i$th neuron, and $a_{ij}$ is the constant connection weight of the $j$th neuron on the $i$th neuron. $\Delta C(t)=\operatorname{diag}(\Delta c_{1}(t),\ldots,\Delta c_{n}(t))$ and $\Delta A(t)=(\Delta a_{ij}(t))_{n\times n}$ are matrices with time-varying parametric uncertainties. $I=(I_{1},\ldots,I_{n})^{T}$ is the constant external input vector.
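To make the model concrete, the right-hand side of an uncertain Hopfield system of this form can be sketched in code. The argument names below (charging rates `C`, weights `A`, perturbations `dC`, `dA`, input `I`, and `tanh` as the activation) are illustrative choices, not fixed by the paper.

```python
import math

def hopfield_rhs(x, C, A, dC, dA, I, f=math.tanh):
    """Right-hand side of the uncertain fractional Hopfield model:
    D^alpha x_i = -(c_i + dc_i) x_i + sum_j (a_ij + da_ij) f(x_j) + I_i."""
    n = len(x)
    return [-(C[i] + dC[i]) * x[i]
            + sum((A[i][j] + dA[i][j]) * f(x[j]) for j in range(n))
            + I[i]
            for i in range(n)]
```

Plugging this into any fractional-order solver (such as the predictor-corrector scheme used in Section 5) simulates one realization of the uncertain network.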

In the rest of this paper, $\|\cdot\|$ denotes the 1-norm of the corresponding vector or matrix. When $x=(x_{1},\ldots,x_{n})^{T}$ is a vector, $\|x\|=\sum_{i=1}^{n}|x_{i}|$. If $A=(a_{ij})_{n\times n}$ is a matrix, $\|A\|=\max_{1\le j\le n}\sum_{i=1}^{n}|a_{ij}|$. For two $m$-dimensional vectors $u$ and $v$, $u\le v$ means $u_{i}\le v_{i}$ for $i=1,\ldots,m$.
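These norm conventions translate directly into code; a minimal sketch (helper names are our own):

```python
def vec_norm1(x):
    """1-norm of a vector: sum of absolute entries."""
    return sum(abs(v) for v in x)

def mat_norm1(A):
    """Induced 1-norm of a matrix: maximum absolute column sum."""
    rows, cols = len(A), len(A[0])
    return max(sum(abs(A[i][j]) for i in range(rows)) for j in range(cols))

# vec_norm1([1, -2, 3]) -> 6;  mat_norm1([[1, -2], [3, 4]]) -> 6
```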

Based on the above formulations, the concept of robust stability for fractional-order Hopfield neural networks with parameter uncertainties is introduced by the following definition.

*Definition 13. *The fractional-order Hopfield neural networks (18) are globally asymptotically robust stable if the unique equilibrium point of the neural system is globally asymptotically stable under the time-varying parameter uncertainties.

In order to obtain the globally asymptotically robust stability of system (18), three assumptions are given as follows.

$(A_{1})$ The matrices with time-varying parametric uncertainties $\Delta C(t)$ and $\Delta A(t)$ are bounded: there exist constants $M_{C},M_{A}>0$ such that $\|\Delta C(t)\|\le M_{C}$ and $\|\Delta A(t)\|\le M_{A}$.

$(A_{2})$ The activation functions $f_{j}$ are continuous and satisfy the Lipschitz condition on $\mathbb{R}$ with Lipschitz constant $l_{j}>0$; that is,
$$ |f_{j}(u)-f_{j}(v)|\le l_{j}|u-v| $$
for all $u,v\in\mathbb{R}$ and $j=1,\ldots,n$, which ensures the existence and uniqueness of the solutions of system (18) based on Lemma 5.

$(A_{3})$ There exist positive constants and such that
where , , and .

Theorem 14. *Under assumptions $(A_{1})$–$(A_{3})$, system (18) is globally asymptotically robust stable.*

*Proof. *Firstly, we need to prove the existence and uniqueness of equilibrium point of system (18). Define a mapping , where and
According to , for any two vectors , we obtain
Then, we have
From , for any , we get
So from inequalities (23) and (24), we gain
which means that the mapping is a contraction mapping on $\mathbb{R}^{n}$. Thus, there exists a unique fixed point $x^{*}$ such that ; that is,
Denote for , then
Thus, we have
where is the unique equilibrium of system (18).
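The contraction-mapping argument also suggests a practical way to approximate the equilibrium: iterate the fixed-point map $x_{i}\mapsto(\sum_{j}a_{ij}f_{j}(x_{j})+I_{i})/c_{i}$ of the nominal system. The sketch below assumes diagonal charging rates and a `tanh` activation (our own illustrative choices); convergence presumes the contraction condition of the theorem holds.

```python
import math

def equilibrium(C, A, I, f=math.tanh, iters=200):
    """Fixed-point (Picard) iteration for the nominal equilibrium:
    x_i = (sum_j a_ij * f(x_j) + I_i) / c_i."""
    n = len(C)
    x = [0.0] * n
    for _ in range(iters):
        x = [(sum(A[i][j] * f(x[j]) for j in range(n)) + I[i]) / C[i]
             for i in range(n)]
    return x
```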

Secondly, let us prove that system (18) is globally asymptotically robust stable. Let $x(t)$ and $y(t)$ be any two solutions of system (18) with different initial values. We can obtain the error system with error $e(t)=y(t)-x(t)$:

Construct a Lyapunov functional as
It is obvious that the Lyapunov functional (30) satisfies a condition of the form of inequality (12). Then, we prove that the Lyapunov functional (30) also satisfies a condition of the form of inequality (13) almost everywhere.

According to Definition 1, is continuously differentiable. Due to Assumptions $(A_{1})$–$(A_{3})$, Lemma 12, and (24), the following inequality holds almost everywhere:
Thus, based on Lemma 10, the error system (29) is globally Mittag-Leffler stable; that is,
Due to the fact that is the unique equilibrium of system (18), we have
for any solution of system (18).

Therefore, the unique equilibrium point of system (18) is globally Mittag-Leffler stable and the fractional-order Hopfield neural networks (18) with parameter uncertainties are globally asymptotically robust stable.

*Remark 15. *In Theorem 14, the robust stability of system (18) is deduced from its Mittag-Leffler stability. Based on the fractional-order Lyapunov direct method, the global Mittag-Leffler stability of system (18) can be guaranteed by employing Lyapunov functional techniques. This method, which is very convenient to implement in practice, can be applied to almost all neural networks with uniformly Lipschitz activation functions. Moreover, it should be noted that the convergence rate is affected by the parameter $\lambda$ in Mittag-Leffler stability: in Theorem 14, the solution converges to the unique equilibrium point faster when $\lambda$ is larger.

#### 4. Synchronization for Fractional-Order Hopfield Neural Networks with Parameter Uncertainties

In this section, synchronization for fractional-order Hopfield neural networks with parameter uncertainties is investigated. For the same or different parameter uncertainties, robust synchronization and quasi-synchronization for fractional-order Hopfield neural networks are studied by employing suitably chosen Lyapunov functionals.

##### 4.1. Robust Synchronization for Fractional-Order Hopfield Neural Networks with Same Parameter Uncertainties

Consider two chaotic fractional-order Hopfield neural networks with the same parameter uncertainties:
$$ {}_{t_0}D_t^{\alpha}x(t)=-(C+\Delta C(t))x(t)+(A+\Delta A(t))f(x(t))+I, \tag{34} $$
$$ {}_{t_0}D_t^{\alpha}y(t)=-(C+\Delta C(t))y(t)+(A+\Delta A(t))f(y(t))+I+u(t), \tag{35} $$
where $x(t)$ and $y(t)$ are the state vectors of the drive system (34) and the response system (35), respectively. $\alpha$, $C$, $A$, $\Delta C(t)$, $\Delta A(t)$, $f$, and $I$ are defined the same as the ones in system (18). $u(t)$ is the control law and $u(t)=-ke(t)$ with error $e(t)=y(t)-x(t)$, where $k$ is a positive constant.

*Definition 16. *Systems (34) and (35) with parameter uncertainties are said to realize robust synchronization if the error vector $e(t)=y(t)-x(t)$ converges to zero; that is, $\lim_{t\to\infty}\|e(t)\|=0$.

Theorem 17. *For systems (34) and (35), the robust synchronization can be realized, if and hold and there exist positive constants and such that
**
where , , and .*

*Proof. *For systems (34) and (35), the error system can be described as
Based on Theorem 14, the unique equilibrium point of system (38) is globally Mittag-Leffler stable; that is,
Thus, the robust synchronization between systems (34) and (35) is realized.

##### 4.2. Quasi-Synchronization for Fractional-Order Hopfield Neural Networks with Different Parameter Uncertainties

Consider two chaotic fractional-order Hopfield neural networks with different parameter uncertainties:
$$ {}_{t_0}D_t^{\alpha}x(t)=-(C+\Delta C_{1}(t))x(t)+(A+\Delta A_{1}(t))f(x(t))+I, \tag{40} $$
$$ {}_{t_0}D_t^{\alpha}y(t)=-(C+\Delta C_{2}(t))y(t)+(A+\Delta A_{2}(t))f(y(t))+I+u(t), \tag{41} $$
where $x(t)$ and $y(t)$ are the state vectors of the drive system (40) and the response system (41), respectively. $\alpha$, $C$, $A$, $f$, and $I$ are defined the same as the ones in system (18). $\Delta C_{1}(t)$, $\Delta A_{1}(t)$, $\Delta C_{2}(t)$, and $\Delta A_{2}(t)$ are different matrices with time-varying parametric uncertainties. $u(t)$ is the control law and $u(t)=-ke(t)$, where $k$ is a positive constant.

For systems (40) and (41), the error system can be written as

Owing to the differences in parameter uncertainties between systems (40) and (41), the origin is not an equilibrium point of the error system (42). So robust synchronization between systems (40) and (41) cannot be realized under the given control law $u(t)$. However, quasi-synchronization between systems (40) and (41) can be investigated under some conditions.

*Definition 18. *The quasi-synchronization between systems (40) and (41) is realized with error bound $\varepsilon>0$ if there exists a $T>0$ such that, for all $t\ge T$ and all initial values,
$$ \|e(t)\|\le\varepsilon. $$

Then, the following assumptions and Lemma 19 are introduced to realize the quasi-synchronization between systems (40) and (41).

$(A_{4})$ The state vector of the drive system (40) is bounded: there exist constants such that . And the function vector is bounded; that is, , for any which satisfies .

$(A_{5})$ The matrices with time-varying parametric uncertainties , , , and are bounded: there exist constants such that , , , and .

$(A_{6})$ There exists a positive constant such that
where and .

Lemma 19 (fractional-order comparison principle [36]). *If ${}_{t_0}D_t^{\alpha}x(t)\ge{}_{t_0}D_t^{\alpha}y(t)$ with $x(t_0)=y(t_0)$ and $\alpha\in(0,1)$, then $x(t)\ge y(t)$.*

Theorem 20. *If and $(A_{4})$–$(A_{6})$ hold, the quasi-synchronization between systems (40) and (41) can be realized with error bound , where is an arbitrarily small constant and .*

*Proof. *For the error system (42), choose a Lyapunov functional as
According to , , , and Lemma 12, the following inequality holds almost everywhere:
From (24) and , we obtain
Then, consider the following one-dimensional fractional-order system:
According to Theorem 14, the unique equilibrium point of system (48) is globally Mittag-Leffler stable; that is,
Due to (47), (48), and Lemma 19, we gain with . Thus, there exists a such that for all ,
and the quasi-synchronization between systems (40) and (41) can be realized with error bound .

*Remark 21. *If the control parameter $k$ gets larger, the error bound becomes smaller. Hence the synchronization error bound can be made as small as desired by choosing suitable control parameters, which is very important for chaos synchronization and nonlinear system control.

#### 5. Numerical Simulations

The effectiveness of the obtained theoretical results is demonstrated by the following examples. The predictor-corrector scheme is used for the approximate numerical solutions of the fractional-order neural networks.
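The predictor-corrector method referred to here is typically the fractional Adams–Bashforth–Moulton scheme; a minimal scalar version is sketched below (function name and step count are our own assumptions). For $\alpha=1$ it reduces to the classical trapezoidal predictor-corrector, so it can be checked on ${}_{0}D_t^{\alpha}x=-x$, $x(0)=1$, whose solution is $E_{\alpha}(-t^{\alpha})$, i.e. $e^{-t}$ when $\alpha=1$.

```python
import math

def fde_abm(g, x0, alpha, T, N):
    """Adams-Bashforth-Moulton predictor-corrector for the scalar
    Caputo fractional ODE D^alpha x(t) = g(t, x(t)), x(0) = x0."""
    h = T / N
    x = [x0]
    gv = [g(0.0, x0)]               # history of g(t_j, x_j)
    c1 = h ** alpha / math.gamma(alpha + 1)
    c2 = h ** alpha / math.gamma(alpha + 2)
    for n in range(N):
        t1 = (n + 1) * h
        # predictor: fractional rectangle rule over the full history
        p = x0 + c1 * sum(((n + 1 - j) ** alpha - (n - j) ** alpha) * gv[j]
                          for j in range(n + 1))
        # corrector: fractional trapezoidal rule with the predicted endpoint
        s = (n ** (alpha + 1) - (n - alpha) * (n + 1) ** alpha) * gv[0]
        for j in range(1, n + 1):
            m = n - j
            s += ((m + 2) ** (alpha + 1) + m ** (alpha + 1)
                  - 2 * (m + 1) ** (alpha + 1)) * gv[j]
        xn = x0 + c2 * (s + g(t1, p))
        x.append(xn)
        gv.append(g(t1, xn))
    return x
```

The full-history sums make each run $O(N^{2})$, which reflects the nonlocal memory of the Caputo derivative.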

##### 5.1. An Example of Robust Stability for Fractional-Order Hopfield Neural Networks with Parameter Uncertainties

For system (18), consider the following neural networks of neurons with hub structure [26]. Let , , , and
Then, system (18) can be written as
Obviously, we know that and in system (52). When and , Figure 2 shows the solution of system (52) with different and initial value . Then, we choose , , and , so that $(A_{1})$–$(A_{3})$ hold with the existing . According to Theorem 14, the unique equilibrium point of system (52) is globally robust stable. With the same and initial value as in Figure 2, Figure 3 shows that the solution of system (52) converges to the equilibrium point , which verifies the effectiveness of Theorem 14.


##### 5.2. An Example of Synchronization for Fractional-Order Hopfield Neural Networks with Parameter Uncertainties

In systems (40) and (41), consider the following neural networks of neurons [37]. Let , , , and

*Case 1. *When and
systems (40) and (41) have the same parameter uncertainties. So the drive-response systems can be written as
If the response system (56) is not controlled by the control law, that is, $u(t)=0$, the error states are shown in Figure 4 with different and initial values and . and are obtained in systems (55) and (56). , , and the control parameter are chosen such that , , and (37) are satisfied with the existing . According to Theorem 17, the robust synchronization between systems (55) and (56) can be realized. With the same and initial values as in Figure 4, the error states are shown converging to 0 in Figure 5.


*Remark 22. *In [37], the authors pointed out that the convergence rate becomes larger as the fractional order increases. From Figures 3 and 5, the convergence with is faster than that with , which coincides with their results.

*Case 2. *If , , and
systems (40) and (41) have different parameter uncertainties. Then, the drive-response systems can be described as