Abstract and Applied Analysis
Volume 2012 (2012), Article ID 306583, 14 pages
http://dx.doi.org/10.1155/2012/306583
Research Article

New LMI-Based Conditions on Neural Networks of Neutral Type with Discrete Interval Delays and General Activation Functions

1School of Mechanical and Electronic Engineering, East China Institute of Technology, Nanchang 330013, China
2Science and Technology on UAV Laboratory, Northwestern Polytechnical University, Xi'an 710072, China

Received 23 August 2012; Accepted 6 October 2012

Academic Editor: Sabri Arik

Copyright © 2012 Guoquan Liu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The global asymptotic stability of neural networks of neutral type with both discrete interval delays and general activation functions is studied. New delay-dependent conditions are obtained by using more general Lyapunov-Krasovskii functionals. These conditions are expressed in terms of linear matrix inequalities (LMIs) and can be verified using the MATLAB LMI toolbox. Numerical examples are given to illustrate the effectiveness of the proposed approach.

1. Introduction

During the past decades, artificial neural networks have received considerable attention due to their applicability in solving signal processing, pattern recognition, associative memories, parallel computation, image processing, and optimization problems [1–6]. Research problems on dynamic behavior, such as chaos control, Hopf bifurcation analysis, and stability analysis, have arisen in such applications and received attention in recent years. In addition, time delays occur frequently in neural network models [7, 8]; they reduce the rate of transmission and can cause instability and poor performance of neural networks. Thus, the study of the stability of neural networks with time delays is of practical importance for engineering systems. In recent years, various methods have been proposed to deal with the problem of global stability analysis for neural networks with time delays [9–13]. For example, Singh, 2007 [12], proposed an improved LMI-based global robust stability criterion for delayed neural networks. Liu et al., 2008 [13], studied the global exponential stability of bidirectional associative memory neural networks with time delays using Young's inequality and Hölder's inequality techniques, and several new sufficient criteria were obtained by constructing a new Lyapunov functional and an M-matrix.

In practice, in order to describe the dynamics of some complicated neural networks more precisely, information about derivatives of the past state has been introduced in the state equations of the considered neural network models [14–16]. This new type of neural network is often called a neural network of neutral type [17]. In particular, the problem of establishing stability for neural networks of neutral type with discrete time-varying delays has received research attention recently [18–20]. However, unbounded distributed delays were not taken into account in Park et al., 2008 [18], or Park and Kwon, 2009 [19, 20]. In a real neural system, the presence of distributed delays affects the system stability. More recently, some important results have been obtained on the stability analysis of neural networks of neutral type with discrete and unbounded distributed delays [21, 22]. Nevertheless, in these works the activation functions have to be Lipschitz continuous in order to avoid computational complexity. In real systems, however, the activation functions may be neither bounded nor monotonic and may even be discontinuous or nondifferentiable. Despite the important progress made in studies on the stability of neutral-type neural networks with discrete delays, establishing the global stability of a sufficiently general model of this kind remains a challenging and critical issue.

The objective of this paper is to further reduce the conservatism of the stability conditions for neural networks of neutral type with mixed delays (discrete interval delays and unbounded distributed delays) and general activation functions. Based on the Lyapunov-Krasovskii stability theory and the LMI technique, a new sufficient condition is proposed in terms of an LMI. Finally, numerical examples are presented to illustrate the validity of the proposed approach. The rest of this paper is organized as follows. In Section 2, the problem formulation is stated and two assumptions are presented. The proof of the main result on stability analysis is given in Section 3. In Section 4, two numerical examples are provided to demonstrate the effectiveness of the proposed method. The paper is concluded in Section 5.

Throughout this paper, for real symmetric matrices $X$ and $Y$, the notation $X \geq Y$ (resp., $X > Y$) means that $X - Y$ is positive semidefinite (resp., positive definite); $\mathbb{R}^n$ and $\mathbb{R}^{n \times m}$ denote the $n$-dimensional Euclidean space and the set of all $n \times m$ real matrices, respectively. The superscripts $T$ and $-1$ stand for matrix transposition and matrix inverse, respectively. The shorthand $\operatorname{diag}\{M_1, \ldots, M_n\}$ denotes a block diagonal matrix with diagonal blocks $M_1, \ldots, M_n$. The symmetric terms in a symmetric matrix are denoted by $*$. $I$ is the identity matrix with appropriate dimensions.

2. Problem Description

Consider the following neural network model of neutral type: where is the state of the th neuron at time , denotes the passive decay rate, , , , and are the interconnection matrices representing the weight coefficients of the neurons, , , and are activation functions, and is an external constant input. The delay is a real-valued continuous nonnegative function defined on , which is assumed to satisfy .
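
A representative neutral-type model with discrete interval and unbounded distributed delays, written here only as an illustrative sketch (the symbols $c_i$, $a_{ij}$, $b_{ij}$, $d_{ij}$, $e_{ij}$, $k_j$, $J_i$, $\tau(t)$, and $h(t)$ are assumed names, not necessarily the exact notation of (2.1)), is

$$\dot{u}_i(t) = -c_i u_i(t) + \sum_{j=1}^{n} a_{ij}\,\tilde{f}_j\bigl(u_j(t)\bigr) + \sum_{j=1}^{n} b_{ij}\,\tilde{g}_j\bigl(u_j(t-\tau(t))\bigr) + \sum_{j=1}^{n} d_{ij}\int_{-\infty}^{t} k_j(t-s)\,\tilde{h}_j\bigl(u_j(s)\bigr)\,ds + \sum_{j=1}^{n} e_{ij}\,\dot{u}_j\bigl(t-h(t)\bigr) + J_i, \quad i = 1,\dots,n.$$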

For system (2.1), the following assumptions are given.

Assumption 2.1. For , the neuron activation functions in (2.1) satisfy where , and are some constants.
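
A sector-type bound is the usual way to state such a general activation condition; as a hedged sketch with assumed constants $l_j^-$ and $l_j^+$ (and analogous constants for the other two activation functions), Assumption 2.1 can be read as

$$l_j^- \le \frac{\tilde{f}_j(\xi_1)-\tilde{f}_j(\xi_2)}{\xi_1-\xi_2} \le l_j^+, \quad \forall\,\xi_1,\xi_2\in\mathbb{R},\ \xi_1\neq\xi_2,\ j=1,\dots,n,$$

which requires neither boundedness, monotonicity, nor differentiability of $\tilde{f}_j$.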

Assumption 2.2. The time-varying delays and satisfy where , and are constants.
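
Conditions of this kind typically confine the discrete delay to an interval and bound the derivatives of the delays; a sketch with assumed constants $\tau_1$, $\tau_2$, $\mu$, and $h_D$ (not necessarily the exact statement of Assumption 2.2) is

$$0 \le \tau_1 \le \tau(t) \le \tau_2, \qquad \dot{\tau}(t) \le \mu, \qquad \dot{h}(t) \le h_D < 1.$$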

Assume that is an equilibrium point of (2.1). Through the transformation , system (2.1) can be transformed into the following system: where is the neural state vector, , , and are the shifted neuron activation function vectors, and , , , and are the connection weight matrices.
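
The transformation in question is the standard shift of the equilibrium to the origin; as a sketch with the assumed names $u^*$ for the equilibrium and $x(t)$ for the shifted state,

$$x(t) = u(t) - u^*, \qquad f_j(x_j) = \tilde{f}_j(x_j + u_j^*) - \tilde{f}_j(u_j^*),$$

with analogous definitions for the other activation functions, so that the origin $x = 0$ is an equilibrium of the transformed system (2.4).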

Note that since the functions , , and satisfy Assumption 2.1, the shifted functions , , and also satisfy conditions of the same form, where , and are some constants.

3. Stability Analysis

In order to obtain the main results of stability analysis, the following lemma is introduced.

Lemma 3.1. For any constant matrix , any scalars and such that , and a vector function such that the integrals concerned are well defined, the following holds:
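
This lemma is, presumably, the Jensen integral inequality that is standard in delay-dependent analyses of this kind; with $M > 0$ the constant matrix, $a < b$ the scalars, and $\omega(\cdot)$ the vector function, it reads

$$\left(\int_a^b \omega(s)\,ds\right)^{T} M \left(\int_a^b \omega(s)\,ds\right) \le (b-a)\int_a^b \omega^{T}(s)\,M\,\omega(s)\,ds.$$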

To simplify the proofs, the following notations are adopted:

Then, the following theorem is proposed.

Theorem 3.2. Under Assumptions 2.1 and 2.2, the origin of system (2.4) is globally asymptotically stable, if there exist matrices ,  , , ,  , , diagonal matrices , ,  , and , such that the following LMI holds: where

Proof. Construct a Lyapunov-Krasovskii functional for system (2.4) as follows: where The time derivative of along the trajectory of system (2.4) is calculated as where By Lemma 3.1, the following inequalities hold: From (2.5), the following inequalities are satisfied: Then, for any , it follows that Combining (3.7)–(3.12), it follows that where is given in (3.3) and It is easy to see that if for any . Thus, if the LMI given in (3.3) holds, the system (2.4) is globally asymptotically stable. The proof is completed.
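
For orientation, Lyapunov-Krasovskii functionals for such neutral systems typically combine a quadratic term, single-integral terms over the delay intervals, and double-integral terms; a hedged sketch of the general shape (the functional actually used in the proof may contain further terms) is

$$V(x_t) = x^{T}(t)Px(t) + \int_{t-\tau(t)}^{t} x^{T}(s)Qx(s)\,ds + \int_{-\tau_2}^{-\tau_1}\!\int_{t+\theta}^{t} \dot{x}^{T}(s)R\dot{x}(s)\,ds\,d\theta + \int_{t-h(t)}^{t} \dot{x}^{T}(s)S\dot{x}(s)\,ds,$$

where $P$, $Q$, $R$, and $S$ are positive definite matrix variables appearing in the LMI.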

Remark 3.3. To the best of the authors’ knowledge, the problem of global stability for the neural networks of neutral type with both mixed delays (discrete interval and unbounded distributed delays) and general activation functions has not been investigated in the existing literature.

Remark 3.4. In this paper, it is assumed that the activation functions may be nonmonotonic and are more general than the usual Lipschitz-continuous functions. Thus, the advantage of the proposed work lies in the less conservative assumptions on the activation functions.
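
As a concrete illustration (not taken from the paper), the scalar function

$$f(s) = 0.2\,s + 0.4\sin s$$

is nonmonotonic, since $f'(s) = 0.2 + 0.4\cos s$ changes sign, yet by the mean value theorem its difference quotients satisfy the sector-type bound $-0.2 \le \bigl(f(\xi_1)-f(\xi_2)\bigr)/(\xi_1-\xi_2) \le 0.6$ for all $\xi_1 \neq \xi_2$, so it is admissible under a condition of the kind in Assumption 2.1.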

Remark 3.5. It should be noted that when , the system (2.4) reduces to which has been intensively investigated in the literature [21, 22]. Since the discrete delay is time varying and varies within an interval, our work extends and improves the results of [21, 22].

Then the following corollary can be proved directly.

Corollary 3.6. Under Assumptions 2.1 and 2.2, the origin of system (3.15) is globally asymptotically stable, if there exist matrices ,  , , ,  , , diagonal matrices > 0, ,  , and , such that the following LMI holds: where

Proof. The proof is similar to that of Theorem 3.2.

4. Numerical Examples

Example 4.1. Consider a three-neuron delayed neural network of neutral type of the form (2.4), where Then, let , , , , , , , , , and . Using the MATLAB LMI Control Toolbox and applying Theorem 3.2, we find that the system (2.4) is globally asymptotically stable, and the solutions of LMI (3.3) are as follows:
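
The feasibility check can also be reproduced outside MATLAB. The following is a minimal Python/CVXPY sketch, assuming a simple Lyapunov LMI $A^{T}P + PA < 0$, $P > 0$ in place of the full LMI (3.3), whose blocks are not reproduced here; the matrix A below is an arbitrary illustrative example, not data from this paper.

# Minimal sketch (not the authors' code): checks feasibility of a simple
# Lyapunov LMI  A^T P + P A < 0,  P > 0  with CVXPY, as a stand-in for the
# full LMI (3.3).
import numpy as np
import cvxpy as cp

A = np.array([[-2.0, 0.5],
              [0.3, -1.5]])              # illustrative system matrix (assumed)
n = A.shape[0]
eps = 1e-6                               # small margin to enforce strict inequalities

P = cp.Variable((n, n), symmetric=True)  # symmetric decision variable
constraints = [P >> eps * np.eye(n),                 # P > 0
               A.T @ P + P @ A << -eps * np.eye(n)]  # A^T P + P A < 0

prob = cp.Problem(cp.Minimize(0), constraints)       # pure feasibility problem
prob.solve()
print("LMI feasible:", prob.status == cp.OPTIMAL)
if prob.status == cp.OPTIMAL:
    print("P =")
    print(np.round(P.value, 4))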

Example 4.2. Consider the two-neuron delayed neural network of neutral type from [21], where Then, let , , , , , and . Using the MATLAB LMI Control Toolbox and applying Corollary 3.6, we find that the system (3.15) is globally asymptotically stable, and the solutions of LMI (3.16) are as follows: If , the conditions in Rakkiyappan and Balasubramaniam, 2008 [21], cannot be satisfied, but Corollary 3.6 in this paper shows that system (3.15) is globally asymptotically stable. Therefore, the proposed result is less conservative than that in Rakkiyappan and Balasubramaniam, 2008 [21].

5. Conclusions

The problem of stability for neural networks of neutral type with discrete interval delays and general activation functions is investigated in this paper. An integrated approach based on a Lyapunov-Krasovskii functional and linear matrix inequalities is proposed. In the proposed approach, a suitable Lyapunov-Krasovskii functional is constructed for the neutral-type neural network model. Then, by using inequality analysis techniques, a reasonably general sufficient condition is obtained in terms of an LMI, which can be tested easily using the MATLAB LMI toolbox. Moreover, the proposed stability conditions extend and improve the existing results. Two numerical examples show that the proposed stability result is effective and can be used to guide engineering design.

In many real-world systems, stochastic perturbations often affect the stability of neural networks. Therefore, considering the presence of stochastic perturbations is critical to the stability analysis of network systems, and some recent progress has been made in this direction. In this paper, the proposed neutral-type neural network model with mixed delays was studied by an integrated approach. For future research, more theoretical analysis should be performed on stochastic neural networks of neutral type with mixed delays.

Acknowledgments

The authors would like to thank the referee and the editor for their valuable comments and suggestions. This work is supported by the Start-up Foundation for Doctors of East China Institute of Technology and the National Natural Science Foundation (nos. 11065002 and 61075029). Mr. Ho Simon Wang provided tutorial assistance to improve the linguistic presentation of the paper.

References

1. Y. He and L. Wang, “Chaotic neural networks and their application to optimization problems,” Control Theory & Applications, vol. 17, pp. 847–852, 2000.
2. V. Miljković, S. Milosevic, R. Sknepnek, and I. Zivic, “Pattern recognition in damaged neural networks,” Physica A, vol. 295, no. 3-4, pp. 526–536, 2001.
3. H. Kirschner and R. Hillebrand, “Neural networks for HREM image analysis,” Information Sciences, vol. 129, no. 1–4, pp. 31–44, 2000.
4. A. Cichocki and R. Unbehauen, Neural Networks for Optimization and Signal Processing, John Wiley & Sons, Chichester, UK, 1993.
5. M. Itoh and L. O. Chua, “Star cellular neural networks for associative and dynamic memories,” International Journal of Bifurcation and Chaos in Applied Sciences and Engineering, vol. 14, no. 5, pp. 1725–1772, 2004.
6. G. Labonte and M. Quintin, “Network parallel computing for SOM neural networks,” High Performance Computing Systems and Applications, vol. 541, pp. 575–586, 2000.
7. C. X. Huang and J. D. Cao, “Almost sure exponential stability of stochastic cellular neural networks with unbounded distributed delays,” Neurocomputing, vol. 72, no. 13–15, pp. 3352–3356, 2009.
8. H. J. Xiang and J. D. Cao, “Almost periodic solution of Cohen-Grossberg neural networks with bounded and unbounded delays,” Nonlinear Analysis: Real World Applications, vol. 10, no. 4, pp. 2407–2419, 2009.
9. S. Arik, “Global robust stability of delayed neural networks,” IEEE Transactions on Circuits and Systems I, vol. 50, no. 1, pp. 156–160, 2003.
10. X. F. Liao and C. D. Li, “An LMI approach to asymptotical stability of multi-delayed neural networks,” Physica D, vol. 200, no. 1-2, pp. 139–155, 2005.
11. H. D. Qi, “New sufficient conditions for global robust stability of delayed neural networks,” IEEE Transactions on Circuits and Systems I, vol. 54, no. 5, pp. 1131–1141, 2007.
12. V. Singh, “Improved global robust stability criterion for delayed neural networks,” Chaos, Solitons and Fractals, vol. 31, no. 1, pp. 224–229, 2007.
13. X. G. Liu, R. R. Martin, M. Wu, and M. L. Tang, “Global exponential stability of bidirectional associative memory neural networks with time delays,” IEEE Transactions on Neural Networks, vol. 19, no. 3, pp. 397–407, 2008.
14. S. Y. Xu, J. Lam, D. W. C. Ho, and Y. Zou, “Delay-dependent exponential stability for a class of neural networks with time delays,” Journal of Computational and Applied Mathematics, vol. 183, no. 1, pp. 16–28, 2005.
15. Y. J. Zhang, S. Y. Xu, Y. M. Chu, and J. J. Lu, “Robust global synchronization of complex networks with neutral-type delayed nodes,” Applied Mathematics and Computation, vol. 216, no. 3, pp. 768–778, 2010.
16. R. Samli and S. Arik, “New results for global stability of a class of neutral-type neural systems with time delays,” Applied Mathematics and Computation, vol. 210, no. 2, pp. 564–570, 2009.
17. Z. Orman, “New sufficient conditions for global stability of neutral-type neural networks with time delays,” Neurocomputing, vol. 97, pp. 141–148, 2012.
18. J. H. Park, O. M. Kwon, and S. M. Lee, “State estimation for neural networks of neutral-type with interval time-varying delays,” Applied Mathematics and Computation, vol. 203, no. 1, pp. 217–223, 2008.
19. J. H. Park and O. M. Kwon, “Further results on state estimation for neural networks of neutral-type with time-varying delay,” Applied Mathematics and Computation, vol. 208, no. 1, pp. 69–75, 2009.
20. J. H. Park and O. M. Kwon, “Global stability for neural networks of neutral-type with interval time-varying delays,” Chaos, Solitons and Fractals, vol. 41, no. 3, pp. 1174–1181, 2009.
21. R. Rakkiyappan and P. Balasubramaniam, “New global exponential stability results for neutral type neural networks with distributed time delays,” Neurocomputing, vol. 71, no. 4–6, pp. 1039–1045, 2008.
22. J. E. Feng, S. Y. Xu, and Y. Zou, “Delay-dependent stability of neutral type neural networks with distributed delays,” Neurocomputing, vol. 72, no. 10–12, pp. 2576–2580, 2009.