Mathematical Problems in Engineering
Volume 2013, Article ID 639219, 12 pages
Research Article

Stability Analysis for Delayed Neural Networks: Reciprocally Convex Approach

1Space Control and Inertial Technology Research Center, Harbin Institute of Technology, Harbin 150001, China
2Designing Institute of Hubei Space Technology Academy, Wuhan 430034, China

Received 25 November 2012; Accepted 14 January 2013

Academic Editor: Ligang Wu

Copyright © 2013 Hongjun Yu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


This paper is concerned with global stability analysis for a class of continuous neural networks with time-varying delay. The lower and upper bounds of the delay and the upper bound of its first derivative are assumed to be known. By introducing a novel Lyapunov-Krasovskii functional (LKF), some delay-dependent stability criteria are derived in terms of linear matrix inequalities, which guarantee the global stability of the considered neural networks. When estimating the derivative of the LKF, instead of applying Jensen's inequality directly, a substep is taken and a slack variable is introduced via the reciprocally convex combination approach; as a result, the proposed criteria are shown to be less conservative than those in the available literature. Numerical examples are given to demonstrate the effectiveness and merits of the proposed method.
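
For context, the two bounding steps mentioned above can be stated in their commonly used forms (a reference sketch, not the paper's exact notation). Jensen's inequality bounds an integral quadratic term arising in the LKF derivative, for $R \succ 0$ and $h > 0$:

$$
-h \int_{t-h}^{t} \dot{x}^{T}(s) R \, \dot{x}(s)\, ds
\;\leq\;
-\left( \int_{t-h}^{t} \dot{x}(s)\, ds \right)^{T} R \left( \int_{t-h}^{t} \dot{x}(s)\, ds \right).
$$

With a time-varying delay, splitting the integration interval produces reciprocally weighted terms with $\alpha \in (0,1)$. The reciprocally convex combination lemma handles these jointly: for vectors $\zeta_1, \zeta_2$, $R \succ 0$, and any slack matrix $S$ satisfying $\begin{bmatrix} R & S \\ S^{T} & R \end{bmatrix} \succeq 0$,

$$
\frac{1}{\alpha}\, \zeta_1^{T} R\, \zeta_1 + \frac{1}{1-\alpha}\, \zeta_2^{T} R\, \zeta_2
\;\geq\;
\begin{bmatrix} \zeta_1 \\ \zeta_2 \end{bmatrix}^{T}
\begin{bmatrix} R & S \\ S^{T} & R \end{bmatrix}
\begin{bmatrix} \zeta_1 \\ \zeta_2 \end{bmatrix}.
$$

The right-hand side is independent of $\alpha$, so the delay-dependent terms can be bounded without enlarging the gap introduced by a direct Jensen bound, which is the source of the conservatism reduction claimed in the abstract.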