
Abstract and Applied Analysis

Volume 2013 (2013), Article ID 143585, 9 pages

http://dx.doi.org/10.1155/2013/143585

## An Analysis of Stability of a Class of Neutral-Type Neural Networks with Discrete Time Delays

Zeynep Orman and Sabri Arik

^{1}Department of Computer Engineering, Istanbul University, Avcilar, 34320 Istanbul, Turkey

^{2}Department of Electrical and Electronics Engineering, Isik University, Sile, 34980 Istanbul, Turkey

Received 17 April 2013; Accepted 21 May 2013

Academic Editor: Zidong Wang

Copyright © 2013 Zeynep Orman and Sabri Arik. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

The problem of existence, uniqueness, and global asymptotic stability is considered for a class of neutral-type neural network models with discrete time delays. By employing a suitable Lyapunov functional and using the homeomorphism mapping theorem, we derive some new delay-independent sufficient conditions for the existence, uniqueness, and global asymptotic stability of the equilibrium point for this class of neutral-type systems. The obtained conditions basically establish some norm and matrix inequalities involving the network parameters of the neural system. The main advantage of the proposed results is that they can be expressed in terms of the network parameters only. Some comparative examples are also given to compare our results with the previous corresponding results in the literature and to demonstrate their effectiveness.

#### 1. Introduction

In recent years, dynamical neural networks have been employed in solving many practical engineering problems such as signal and image processing, pattern recognition, associative memories, parallel computation, and optimization and control problems [1–10]. In such applications, it is important to know the dynamics of the designed neural networks. In addition, when using delayed neural networks, time delays may degrade the transmission rate and cause instability. Therefore, the stability analysis of neural networks with time delays is indispensable in engineering applications. In the recent literature, many papers have studied the global stability of various classes of neural networks by exploiting different analysis techniques and methods, and have presented some useful stability results for delayed neural networks. In practice, in order to precisely determine the equilibrium and stability properties of a neural network, information about the time derivatives of the past states must be introduced into its state equations. A neural network of this type is called a neutral-type neural network. Some global stability results for various classes of neural networks with time delays have been reported in [1–33]. The goal of our paper is to present some new and alternative stability results for neutral-type neural networks with discrete time delays and Lipschitz continuous activation functions.

Throughout this paper, we will use the following notation: for any matrix $Q$, $Q>0$ will denote that $Q$ is symmetric and positive definite; $Q^{T}$, $Q^{-1}$, $\lambda_{m}(Q)$, and $\lambda_{M}(Q)$ will denote the transpose of $Q$, the inverse of $Q$, the minimum eigenvalue of $Q$, and the maximum eigenvalue of $Q$, respectively. We will use the matrix norm $\|Q\|_{2}=[\lambda_{M}(Q^{T}Q)]^{1/2}$. For any two positive definite matrices $P$ and $Q$, $P>Q$ will imply that $P-Q>0$. For $v=(v_{1},v_{2},\ldots,v_{n})^{T}\in\mathbb{R}^{n}$, we will use the vector norms $\|v\|_{1}=\sum_{i=1}^{n}|v_{i}|$ and $\|v\|_{2}=\sqrt{\sum_{i=1}^{n}v_{i}^{2}}$.
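The eigenvalue and matrix-norm notation above can be illustrated numerically; the following is a minimal sketch assuming NumPy, with an illustrative matrix that does not come from the paper:

```python
import numpy as np

def lambda_min(Q):
    # minimum eigenvalue of a symmetric matrix Q
    return float(np.min(np.linalg.eigvalsh(Q)))

def lambda_max(Q):
    # maximum eigenvalue of a symmetric matrix Q
    return float(np.max(np.linalg.eigvalsh(Q)))

def norm2(Q):
    # ||Q||_2 = sqrt(lambda_max(Q^T Q)), the spectral norm
    return float(np.sqrt(lambda_max(Q.T @ Q)))

Q = np.array([[2.0, 1.0], [1.0, 2.0]])  # eigenvalues 1 and 3
print(lambda_min(Q))  # 1.0
print(lambda_max(Q))  # 3.0
print(norm2(Q))       # 3.0 (for a symmetric PSD matrix, ||Q||_2 = lambda_max(Q))
```

For a general (nonsymmetric) matrix, `norm2` agrees with `np.linalg.norm(Q, 2)`, the largest singular value.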

#### 2. Problem Statement

The class of neutral-type neural network models with discrete time delays is described by the following set of nonlinear differential equations:
$$\dot{x}_{i}(t)=-c_{i}x_{i}(t)+\sum_{j=1}^{n}a_{ij}f_{j}(x_{j}(t))+\sum_{j=1}^{n}b_{ij}f_{j}(x_{j}(t-\tau_{j}))+\sum_{j=1}^{n}e_{ij}\dot{x}_{j}(t-\zeta_{j})+u_{i},\quad i=1,2,\ldots,n,\qquad(1)$$
where $n$ is the number of neurons in the network, $x_{i}(t)$ denotes the state of the $i$th neuron, and the parameters $c_{i}>0$ are some constants; the constants $a_{ij}$ denote the strengths of the neuron interconnections within the network; the constants $b_{ij}$ denote the strengths of the neuron interconnections with time delay parameters $\tau_{j}$; the $e_{ij}$ are coefficients of the time derivative of the delayed states; the functions $f_{j}$ denote the neuron activations; and the constants $u_{i}$ are some external inputs. In system (1), $\tau_{j}$ represents the delay parameter with $\tau=\max_{1\le j\le n}\tau_{j}$, and $\zeta_{j}$ the neutral delay parameter with $\zeta=\max_{1\le j\le n}\zeta_{j}$. Accompanying the neutral system (1) is an initial condition of the form $x_{i}(t)=\phi_{i}(t)\in C([-\max(\tau,\zeta),0],\mathbb{R})$, where $C([-\max(\tau,\zeta),0],\mathbb{R})$ denotes the set of all continuous functions from $[-\max(\tau,\zeta),0]$ to $\mathbb{R}$.

We will assume that the activation functions $f_{i}$, $i=1,2,\ldots,n$, are Lipschitz continuous; that is, there exist some constants $l_{i}>0$ such that
$$|f_{i}(x)-f_{i}(y)|\le l_{i}|x-y|,\quad \forall x,y\in\mathbb{R},\ i=1,2,\ldots,n.\qquad(2)$$
Neural network model (1) can be written in the vector-matrix form as follows:
$$\dot{x}(t)=-Cx(t)+Af(x(t))+Bf(x(t-\tau))+E\dot{x}(t-\zeta)+u,\qquad(3)$$
where $x(t)=(x_{1}(t),x_{2}(t),\ldots,x_{n}(t))^{T}$, $C=\mathrm{diag}(c_{1},c_{2},\ldots,c_{n})$, $A=(a_{ij})_{n\times n}$, $B=(b_{ij})_{n\times n}$, $E=(e_{ij})_{n\times n}$, $f(x(\cdot))=(f_{1}(x_{1}(\cdot)),f_{2}(x_{2}(\cdot)),\ldots,f_{n}(x_{n}(\cdot)))^{T}$, and $u=(u_{1},u_{2},\ldots,u_{n})^{T}$.
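The Lipschitz assumption on the activations can be checked numerically over sampled pairs of points; the following is a minimal sketch assuming NumPy, using `tanh` (a typical activation, chosen here only for illustration) with Lipschitz constant 1:

```python
import numpy as np

def lipschitz_ok(f, l, xs, ys):
    # check |f(x) - f(y)| <= l * |x - y| over all sampled pairs,
    # with a small tolerance for floating-point round-off
    return all(abs(f(x) - f(y)) <= l * abs(x - y) + 1e-12
               for x in xs for y in ys)

rng = np.random.default_rng(0)
xs = rng.uniform(-5.0, 5.0, 50)
ys = rng.uniform(-5.0, 5.0, 50)

# tanh has derivative bounded by 1, so l = 1 is a valid Lipschitz constant
print(lipschitz_ok(np.tanh, 1.0, xs, ys))  # True
```

Note that sampling can only falsify the condition, not prove it; for smooth activations a bound on the derivative (here $|\tanh'| \le 1$) gives the Lipschitz constant directly.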

In order to obtain our main results, the following lemma will be needed.

Lemma 1 (see [23]). *If a map $H(x)\in C^{0}(\mathbb{R}^{n})$ satisfies the following conditions:* (i) *$H(x)\ne H(y)$ for all $x\ne y$,* (ii) *$\|H(x)\|\to\infty$ as $\|x\|\to\infty$,* *then $H(x)$ is a homeomorphism of $\mathbb{R}^{n}$.*

#### 3. Existence and Uniqueness Analysis

This section deals with obtaining sufficient conditions that ensure the existence and uniqueness of the equilibrium point for the neutral-type neural network model (1). The main result is given in the following theorem.

Theorem 2. *For the neutral-type neural network model (1), let and the activation functions satisfy (2). Then, the system (1) has a unique equilibrium point for each if there exist positive diagonal matrices and and positive definite matrices , , and such that the following conditions hold:
**
where .*

*Proof. *We will make use of the result of Lemma 1 for the proof of the existence and uniqueness of the equilibrium point for system (1). Let us define the following mapping associated with system (1):
If is an equilibrium point of (1), then satisfies the equilibrium equation:
Clearly, the solution of the equation is an equilibrium point of (1). Therefore, in the light of Lemma 1, we can conclude that, for the system defined by (1), there exists a unique equilibrium point for every input vector if is a homeomorphism of . We will now show that the conditions of Theorem 2 imply that is a homeomorphism of . To this end, we choose any two vectors and such that . When the activation functions satisfy (2), for , we have two cases: the first case is and , and the second case is and . Let us carry out the existence and uniqueness analysis for the first case where and . In this case, for defined by (5), we can write
If we multiply both sides of (7) by the term , and then add the terms and to the right hand side of the resulting equation, we get
We note the following inequalities:
Using (9) in (8) results in
which is of the form
or equivalently
Since and , , , , and imply that
implying that
from which it follows that
implies that , and implies that . Therefore, it directly follows that , thus implying that . Hence, we conclude that for all and .

Now consider the case where and . In this case, defined by (5) satisfies
; and imply that
Based on the analysis carried out for the previous case, we conclude that for all for this case.

Now it is shown that the conditions of Theorem 2 imply that as . For , we can write
Taking the absolute value of the both sides of the above inequality, we obtain
from which it follows that
which yields
We note that . Hence, from (21), it follows that
Since is bounded, and , then as . Hence, the conditions of Theorem 2 ensure that is a homeomorphism of , proving that the neutral system defined by (1) has a unique equilibrium point for each .

Choosing , , , , and in the conditions of Theorem 2 as , , , , and , we can express some special cases of Theorem 2 as follows.

Corollary 3. *For the neutral-type neural network model (1), let and the activation functions satisfy (2). Then, the system (1) has a unique equilibrium point for each if there exist some positive constants , , , , and such that the following conditions hold:
**
where .*

A special case of Corollary 3 is the following result.

Corollary 4. *For the neutral-type neural network model (1), let and the activation functions satisfy (2). Then, the system (1) has a unique equilibrium point for each if there exist some positive constants , , , , and such that the following conditions hold:
**
where , , and . *

#### 4. Stability Analysis

In this section, we will prove that the conditions obtained in Theorem 2 for the existence and uniqueness of the equilibrium point are also sufficient for the global asymptotic stability of the equilibrium point of the neutral system defined by (1). In order to simplify the proofs, we will first shift the equilibrium point $x^{*}$ of system (1) to the origin. Using the transformation $z_{i}(t)=x_{i}(t)-x_{i}^{*}$, the neutral-type neural network model (1) can be put in the form
$$\dot{z}_{i}(t)=-c_{i}z_{i}(t)+\sum_{j=1}^{n}a_{ij}g_{j}(z_{j}(t))+\sum_{j=1}^{n}b_{ij}g_{j}(z_{j}(t-\tau_{j}))+\sum_{j=1}^{n}e_{ij}\dot{z}_{j}(t-\zeta_{j}),\quad i=1,2,\ldots,n,\qquad(24)$$
which can be written in vector-matrix form as follows:
$$\dot{z}(t)=-Cz(t)+Ag(z(t))+Bg(z(t-\tau))+E\dot{z}(t-\zeta),\qquad(25)$$
where $z(t)=(z_{1}(t),z_{2}(t),\ldots,z_{n}(t))^{T}$ is the state vector of the transformed neural system, $g(z(\cdot))=(g_{1}(z_{1}(\cdot)),g_{2}(z_{2}(\cdot)),\ldots,g_{n}(z_{n}(\cdot)))^{T}$ represents the new nonlinear activation functions, and
$$g_{i}(z_{i}(\cdot))=f_{i}(z_{i}(\cdot)+x_{i}^{*})-f_{i}(x_{i}^{*}),\quad i=1,2,\ldots,n.\qquad(26)$$
The activation functions in (25) satisfy
$$|g_{i}(z)|\le l_{i}|z|,\quad g_{i}(0)=0,\quad \forall z\in\mathbb{R},\ i=1,2,\ldots,n.\qquad(27)$$
We can now state the following stability result.

Theorem 5. *For the neutral-type neural network model (25), let and the activation functions satisfy (27). Then, the origin of system (25) is globally asymptotically stable if there exist positive diagonal matrices and and positive definite matrices , , and such that the following conditions hold:
**
where .*

*Proof. *Define the following positive definite Lyapunov functional:
where and are some positive constants. The time derivative of along the trajectories of the system (25) is obtained as follows:
Since , we can write
We can write the following inequalities:
where , , and are some positive definite matrices. Using (32) in (31) yields

Equation (27) implies that
Hence, we have
which can be written as
or equivalently
Clearly, , , , and imply that if any of the vectors , , , and is nonzero, thus implying that if and only if which is the origin of system (25). On the other hand, as , meaning that the Lyapunov functional used for the stability analysis is radially unbounded. Thus, it can be concluded from the standard Lyapunov theorems [34] that the origin of system (25) or equivalently the equilibrium point of system (1) is globally asymptotically stable.
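The Lyapunov analysis above turns on finding positive definite matrices with certain properties; whether a given candidate matrix qualifies as positive definite can be tested numerically. Below is a minimal sketch assuming NumPy; the example matrices are illustrative and are not the matrices appearing in Theorem 5:

```python
import numpy as np

def is_positive_definite(M):
    # A symmetric matrix is positive definite iff it admits a
    # Cholesky factorization (equivalently, all eigenvalues > 0).
    M = np.asarray(M, dtype=float)
    if not np.allclose(M, M.T):
        return False  # positive definiteness here presumes symmetry
    try:
        np.linalg.cholesky(M)
        return True
    except np.linalg.LinAlgError:
        return False

print(is_positive_definite([[2.0, -1.0], [-1.0, 2.0]]))  # True  (eigenvalues 1, 3)
print(is_positive_definite([[1.0, 2.0], [2.0, 1.0]]))    # False (eigenvalue -1)
```

The Cholesky test is the standard cheap check; for the ordering $P>Q$ used in the notation section, one would apply the same test to $P-Q$.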

We can directly state the following corollaries.

Corollary 6. *For the neutral-type neural network model (25), let and the activation functions satisfy (27). Then, the origin of system (25) is globally asymptotically stable if there exist some positive constants , , , , and such that the following conditions hold:
**
where .*

Corollary 7. *For the neutral-type neural network model (25), let and the activation functions satisfy (27). Then, the origin of system (25) is globally asymptotically stable if there exist some positive constants , , , , and such that the following conditions hold:
**
where , , and . *

#### 5. A Comparative Example

In this section, we will give a numerical example to compare our results with some previous corresponding results derived in the literature. We should point out here that stability results for neutral-type neural networks involve complicated relationships between the network parameters and some positive definite matrices to be determined, which is a difficult task. Therefore, the example we give will show that, in this particular case, our results seem to be equivalent to the previous corresponding literature results. We now state some of the previous results.

Theorem 8 (see [23]). *For the neutral-type neural network model (1), let and the activation functions satisfy (2). Then, system (1) is globally asymptotically stable if there exist some positive constants , , , and such that the following conditions hold:
**
where .*

Theorem 9 (see [22]). *For the neutral-type neural network model (1), let and the activation functions satisfy (2). Then, system (1) is globally asymptotically stable if there exist positive constants , , , and such that the following conditions hold:
**
where , , and .*

Theorem 10 (see [24]). *For the neutral-type neural network model (1), let and the activation functions satisfy (2). Then, system (1) is globally asymptotically stable if the following condition holds:
**
where , .*

We now consider the following example.

*Example 11. *Assume that the network parameters of neutral-type neural system (1) are given as follows:
where is a real number. Assume that and . We have .

For the sufficiently small values of and and sufficiently large value of , , and , the conditions of Corollary 7 can be approximately stated as follows:
The two required conditions for stability are and , implying that .

In the case of Theorem 8, for the sufficiently small value of and sufficiently large values of and , , and , the conditions of Theorem 8 can be approximately stated as follows:
The required condition for stability is .

In the case of Theorem 9, for the sufficiently small value of and sufficiently large values of and , , and , the conditions of Theorem 9 can be approximately stated as follows:
The two required conditions for stability are and , implying that .

In the case of Theorem 10, for a sufficiently small value of , the condition of Theorem 10 can be approximately stated as follows:
The required condition for stability is .

#### 6. Conclusions

In this paper, we have obtained some sufficient conditions for the existence, uniqueness, and global asymptotic stability of the equilibrium point for a class of neutral-type systems with discrete time delays. The obtained results establish various relationships between the network parameters of the system. We have also given an example to demonstrate the applicability of our results and to compare them with some previous corresponding results derived in the literature. Most literature results express the stability conditions in terms of linear matrix inequalities (LMIs), which are then solved by software tools. Such results may be less conservative; however, their computational burden can be high. Our results establish less complex relationships between the network parameters of the system. We should also point out that delay-independent conditions may be more conservative than delay-dependent ones. Our stability conditions are independent of the time delays; this is due to the Lyapunov functional that we have employed in the analysis of our network model. In order to obtain delay-dependent conditions with our techniques, one would need to modify the Lyapunov functional so that the time delays appear in the conditions, which could be the subject of a future study.

#### References

1. R. Rakkiyappan and P. Balasubramaniam, “New global exponential stability results for neutral type neural networks with distributed time delays,” *Neurocomputing*, vol. 71, no. 4–6, pp. 1039–1045, 2008.
2. T. Pan, B. Shi, and J. Yuan, “Global stability of almost periodic solution of a class of neutral-type BAM neural networks,” *Abstract and Applied Analysis*, vol. 2012, Article ID 482584, 18 pages, 2012.
3. C. Bai, “Global stability of almost periodic solutions of Hopfield neural networks with neutral time-varying delays,” *Applied Mathematics and Computation*, vol. 203, no. 1, pp. 72–79, 2008.
4. J. Liu and G. Zong, “New delay-dependent asymptotic stability conditions concerning BAM neural networks of neutral type,” *Neurocomputing*, vol. 72, no. 10–12, pp. 2549–2555, 2009.
5. Y. Zhang, B. Song, J. H. Park, and Z.-G. Wu, “Global synchronization of neutral-type stochastic delayed complex networks,” *Abstract and Applied Analysis*, vol. 2012, Article ID 631932, 19 pages, 2012.
6. J. H. Park, C. H. Park, O. M. Kwon, and S. M. Lee, “A new stability criterion for bidirectional associative memory neural networks of neutral-type,” *Applied Mathematics and Computation*, vol. 199, no. 2, pp. 716–722, 2008.
7. G. Liu, S. Zhou, and H. Huang, “New LMI-based conditions on neural networks of neutral type with discrete interval delays and general activation functions,” *Abstract and Applied Analysis*, vol. 2012, Article ID 306583, 14 pages, 2012.
8. J. Feng, S. Xu, and Y. Zou, “Delay-dependent stability of neutral type neural networks with distributed delays,” *Neurocomputing*, vol. 72, no. 10–12, pp. 2576–2580, 2009.
9. H. Mai, X. Liao, and C. Li, “A semi-free weighting matrices approach for neutral-type delayed neural networks,” *Journal of Computational and Applied Mathematics*, vol. 225, no. 1, pp. 44–55, 2009.
10. C. H. Lien, K. W. Yu, Y. F. Lin, Y. J. Chung, and L. Y. Chung, “Global exponential stability for uncertain delayed neural networks of neutral type with mixed time delays,” *IEEE Transactions on Systems, Man, and Cybernetics, Part B*, vol. 38, no. 3, pp. 709–720, 2008.
11. L. Cheng, Z. G. Hou, and M. Tan, “A neutral-type delayed projection neural network for solving nonlinear variational inequalities,” *IEEE Transactions on Circuits and Systems II*, vol. 55, no. 8, pp. 806–810, 2008.
12. H. Zhang, Z. Liu, and G. B. Huang, “Novel delay-dependent robust stability analysis for switched neutral-type neural networks with time-varying delays via SC technique,” *IEEE Transactions on Systems, Man, and Cybernetics, Part B*, vol. 40, no. 6, pp. 1480–1491, 2010.
13. Y. Liu, Z. Wang, and X. Liu, “On delay-dependent robust exponential stability of stochastic neural networks with mixed time delays and Markovian switching,” *Nonlinear Dynamics*, vol. 54, no. 3, pp. 199–212, 2008.
14. Q. Zhou, B. Chen, C. Lin, and H. Li, “Mean square exponential stability for uncertain delayed stochastic neural networks with Markovian jump parameters,” *Circuits, Systems, and Signal Processing*, vol. 29, no. 2, pp. 331–348, 2010.
15. Y. Chen, A. Xue, R. Lu, and S. Zhou, “On robustly exponential stability of uncertain neutral systems with time-varying delays and nonlinear perturbations,” *Nonlinear Analysis: Theory, Methods & Applications*, vol. 68, no. 8, pp. 2464–2470, 2008.
16. B. Wang, X. Liu, and S. Zhong, “New stability analysis for uncertain neutral system with time-varying delay,” *Applied Mathematics and Computation*, vol. 197, no. 1, pp. 457–465, 2008.
17. W.-A. Zhang and L. Yu, “Delay-dependent robust stability of neutral systems with mixed delays and nonlinear perturbations,” *Acta Automatica Sinica*, vol. 33, no. 8, pp. 863–866, 2007.
18. H. Li, S.-M. Zhong, and H.-B. Li, “Some new simple stability criteria of linear neutral systems with a single delay,” *Journal of Computational and Applied Mathematics*, vol. 200, no. 1, pp. 441–447, 2007.
19. F. Wang, “Exponential asymptotic stability for nonlinear neutral systems with multiple delays,” *Nonlinear Analysis: Real World Applications*, vol. 8, no. 1, pp. 312–322, 2007.
20. J. H. Park and O. M. Kwon, “Design of state estimator for neural networks of neutral-type,” *Applied Mathematics and Computation*, vol. 202, no. 1, pp. 360–369, 2008.
21. M. S. Mahmoud and Y. Xia, “Improved exponential stability analysis for delayed recurrent neural networks,” *Journal of the Franklin Institute*, vol. 348, no. 2, pp. 201–211, 2011.
22. R. Samli and S. Arik, “New results for global stability of a class of neutral-type neural systems with time delays,” *Applied Mathematics and Computation*, vol. 210, no. 2, pp. 564–570, 2009.
23. Z. Orman, “New sufficient conditions for global stability of neutral-type neural networks with time delays,” *Neurocomputing*, vol. 71, pp. 141–148, 2012.
24. C. Cheng, T. Liao, J. Yan, and C. Hwang, “Globally asymptotic stability of a class of neutral-type neural networks with delays,” *IEEE Transactions on Systems, Man, and Cybernetics, Part B*, vol. 36, no. 5, pp. 1191–1195, 2006.
25. B. Shen, Z. Wang, and X. Liu, “Bounded H∞ synchronization and state estimation for discrete time-varying stochastic complex networks over a finite horizon,” *IEEE Transactions on Neural Networks*, vol. 22, no. 1, pp. 145–157, 2011.
26. Z. Wang, Y. Wang, and Y. Liu, “Global synchronization for discrete-time stochastic complex networks with randomly occurred nonlinearities and mixed time delays,” *IEEE Transactions on Neural Networks*, vol. 21, no. 1, pp. 11–25, 2010.
27. L. Hu, H. Gao, and W. X. Zheng, “Novel stability of cellular neural networks with interval time-varying delay,” *Neural Networks*, vol. 21, no. 10, pp. 1458–1463, 2008.
28. S. Mou, H. Gao, J. Lam, and W. Qiang, “A new criterion of delay-dependent asymptotic stability for Hopfield neural networks with time delay,” *IEEE Transactions on Neural Networks*, vol. 19, no. 3, pp. 532–535, 2008.
29. J. Liang, Z. Wang, Y. Liu, and X. Liu, “Global synchronization control of general delayed discrete-time networks with stochastic coupling and disturbances,” *IEEE Transactions on Systems, Man, and Cybernetics, Part B*, vol. 38, no. 4, pp. 1073–1083, 2008.
30. Y. Liu, Z. Wang, J. Liang, and X. Liu, “Synchronization of coupled neutral-type neural networks with jumping-mode-dependent discrete and unbounded distributed delays,” *IEEE Transactions on Cybernetics*, vol. 43, no. 1, pp. 102–114, 2013.
31. B. Shen, Z. Wang, and X. Liu, “Sampled-data synchronization control of dynamical networks with stochastic sampling,” *IEEE Transactions on Automatic Control*, vol. 57, no. 10, pp. 2644–2650, 2012.
32. D. Ding, Z. Wang, H. Dong, and H. Shu, “Distributed H∞ state estimation with stochastic parameters and nonlinearities through sensor networks: the finite-horizon case,” *Automatica*, vol. 48, no. 8, pp. 1575–1585, 2012.
33. H. Dong, Z. Wang, and H. Gao, “Distributed filtering for a class of time-varying systems over sensor networks with quantization errors and successive packet dropouts,” *IEEE Transactions on Signal Processing*, vol. 60, no. 6, pp. 3164–3173, 2012.
34. J. K. Hale and S. M. Verduyn Lunel, *Introduction to Functional-Differential Equations*, vol. 99 of *Applied Mathematical Sciences*, Springer, New York, NY, USA, 1993.