Chengjun Duan, Qiankun Song, "Boundedness and Stability for Discrete-Time Delayed Neural Network with Complex-Valued Linear Threshold Neurons", Discrete Dynamics in Nature and Society, vol. 2010, Article ID 368379, 19 pages, 2010. https://doi.org/10.1155/2010/368379

Boundedness and Stability for Discrete-Time Delayed Neural Network with Complex-Valued Linear Threshold Neurons

Academic Editor: Josef Diblik
Received: 21 Apr 2010
Accepted: 30 Jun 2010
Published: 11 Aug 2010

Abstract

The discrete-time delayed neural network with complex-valued linear threshold neurons is considered. By constructing appropriate Lyapunov-Krasovskii functionals and employing the linear matrix inequality technique and analysis methods, several new delay-dependent criteria for checking boundedness and global exponential stability are established. Illustrative examples are given to show the effectiveness and reduced conservatism of the proposed criteria.

1. Introduction

In the past decade, neural networks have received increasing interest owing to their applications in many areas such as signal processing, pattern recognition, associative memories, parallel computation, and optimization solvers [1]. In such applications, the qualitative analysis of the dynamical behaviors is a necessary step for the practical design of neural networks [2].

On the other hand, artificial neural networks are usually implemented by integrated circuits. In such implementations, time delays are produced by finite switching speeds and the finite propagation speed of electronic signals. During implementation on very large-scale integrated chips, transmission delays can destroy the dynamical behaviors of neural networks. Hence it is worthwhile to consider the dynamical behaviors of neural networks with delays [3]. In recent years, important results on the boundedness, convergence, global exponential stability, synchronization, state estimation, and passivity analysis of delayed neural networks have been reported; see [1–9] and the references therein for some recent publications.

It should be pointed out that all of the above-mentioned works on the dynamical behaviors of delayed neural networks concern the continuous-time case. However, when implementing a continuous-time delayed neural network for computer simulation, it becomes essential to formulate a discrete-time system that is an analogue of the continuous-time delayed neural network. To some extent, the discrete-time analogue inherits the dynamical characteristics of the continuous-time delayed neural network under mild or no restriction on the discretization step-size, and also retains some functional similarity [10]. Unfortunately, as pointed out in [11], the discretization cannot preserve the dynamics of the continuous-time counterpart even for a small sampling period, and therefore there is a crucial need to study the dynamics of discrete-time neural networks. Recently, the dynamics analysis problem for discrete-time delayed neural networks and discrete-time systems with delay has been extensively studied; see [10–19] and the references therein.
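To make the discretization step concrete: a continuous-time delayed network of the standard form x'(t) = -Cx(t) + Af(x(t)) + Bf(x(t - τ)) + u is often turned into a discrete-time analogue by a forward Euler scheme. The sketch below is illustrative only; the step size h, the matrix names, and the tanh activation are assumptions, not the model of this paper.

```python
import numpy as np

def euler_discretize(C, A, B, u, tau, h, x_hist, steps, f=np.tanh):
    """Forward-Euler discretization of the continuous-time delayed network
    x'(t) = -C x(t) + A f(x(t)) + B f(x(t - tau)) + u.

    x_hist: initial history x(-d), ..., x(0), where d = round(tau / h).
    """
    d = round(tau / h)
    hist = list(x_hist)
    for _ in range(steps):
        x = hist[-1]
        # one Euler step; the delayed term reaches d samples into the history
        x_next = x + h * (-C @ x + A @ f(x) + B @ f(hist[-1 - d]) + u)
        hist.append(x_next)
    return np.array(hist)

# hypothetical two-neuron parameters and a short run
C = np.eye(2)
A = np.array([[0.1, -0.2], [0.2, 0.1]])
B = np.array([[0.1, 0.0], [0.0, 0.1]])
u = np.array([0.5, -0.5])
traj = euler_discretize(C, A, B, u, tau=0.2, h=0.1,
                        x_hist=[np.zeros(2)] * 3, steps=100)
```

As [11] warns, such a scheme need not preserve the continuous-time dynamics even for small h, which is the motivation for analyzing the discrete-time system in its own right.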

It is known that complex-valued calculus has proved useful in areas such as electrical engineering, informatics, control engineering, bioengineering, and other related fields. It is therefore not surprising that complex-valued neural networks, which deal with complex-valued data, complex-valued weights, and complex-valued neuron activation functions, have also been widely studied in recent years [20, 21]. Very recently, the authors of [22, 23] considered a class of discrete-time recurrent neural networks with complex-valued weights and activation functions. In [22], the convergence of discrete-time recurrent neural networks with multivalued neurons was discussed; these networks have complex-valued weights and an activation function defined as a function of the argument of a weighted sum. In [23], the boundedness, global attractivity, and complete stability were investigated for discrete-time recurrent neural networks with complex-valued linear threshold neurons. However, delay was not considered in [22, 23], and the given criteria for checking boundedness, global attractivity, and complete stability are conservative to some extent. Therefore, it is important and necessary to further improve the results reported in [23].

Motivated by the above discussions, the objective of this paper is to study the problem on boundedness and stability of discrete-time delayed neural network with complex-valued linear threshold neurons.

2. Model Description and Preliminaries

In this paper, we consider the following discrete-time complex-valued neural network with time-delay: Here is a nonnegative integer and is a vector defined as , where denotes the activity of the th neuron. Further, is a complex-valued function defined as and . In (2.1), , , and are stated as the input vector , the connection weight matrix and the delayed connection weight matrix , respectively, and denotes time-delay, which is a positive integer. The initial condition associated with model (2.1) is given by
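The displayed equations of model (2.1) did not survive extraction. Based on the undelayed model of [23] and the description above, a plausible reading is z(k+1) = f(Az(k) + Bz(k−τ) + u), where the linear threshold activation f acts componentwise on real and imaginary parts as f(z) = max(Re z, 0) + i·max(Im z, 0). The simulation sketch below is written under that assumption; the matrix names A, B and the concrete two-neuron parameters are hypothetical.

```python
import numpy as np

def f(z):
    # complex-valued linear threshold: clip real and imaginary parts at zero
    return np.maximum(z.real, 0.0) + 1j * np.maximum(z.imag, 0.0)

def simulate(A, B, u, tau, z_init, steps):
    """Iterate the assumed model z(k+1) = f(A z(k) + B z(k - tau) + u).

    z_init: list of tau + 1 initial state vectors z(-tau), ..., z(0).
    """
    hist = list(z_init)
    for _ in range(steps):
        z_next = f(A @ hist[-1] + B @ hist[-1 - tau] + u)
        hist.append(z_next)
    return np.array(hist)

# hypothetical two-neuron example with tau = 2
A = np.array([[0.2 + 0.1j, 0.0], [0.0, 0.1 - 0.2j]])
B = np.array([[0.1, 0.05j], [0.05, 0.1]])
u = np.array([1.0 + 1.0j, 0.5 + 0.5j])
traj = simulate(A, B, u, tau=2, z_init=[np.zeros(2, complex)] * 3, steps=50)
```

Note that every iterate after the initial history has nonnegative real and imaginary parts, reflecting the threshold nonlinearity.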

Remark 2.1. When , model (2.1) turns into the following model [22, 23] Hence, the model in [22, 23] is a special case of the model in this paper.

Definition 2.2. A vector is called an equilibrium point of neural network (2.1), if it satisfies

Definition 2.3. The neural network (2.1) is said to be bounded if each of its trajectories is bounded.

Definition 2.4. The equilibrium point of the model (2.1) with the initial condition (2.3) is said to be globally exponentially stable if there exist two positive constants and such that

Throughout this paper, for any constant , we denote , , , and . Now we give an assumption on the connection weights. (H) For each , there exist positive real numbers satisfying For presentation convenience, in the following we denote Let us define Let be the sequence defined by where .

To prove our results, the following lemma that can be found in [24] is necessary in this paper.

Lemma 2.5 (see [24]). Let be a nonzero number. Then is a solution of the homogeneous recurrence relation with constant coefficients if and only if is a root of the polynomial equation If the polynomial equation has distinct roots , then is the general solution of (2.11) in the following sense: no matter what initial values for are given, there are constants so that (2.13) is the unique sequence which satisfies both the recurrence relation (2.11) and the initial condition.

The polynomial equation (2.12) is called the characteristic equation of the recurrence relation (2.11) and its roots are the characteristic roots.
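Lemma 2.5 can be checked numerically on a concrete recurrence. The sketch below uses the illustrative recurrence a(n) = a(n−1) + a(n−2) (Fibonacci, characteristic equation x² − x − 1 = 0, two distinct roots): the constants of the general solution are determined from the initial values, and the closed form agrees with direct iteration.

```python
import numpy as np

# recurrence a(n) = a(n-1) + a(n-2); characteristic equation x^2 - x - 1 = 0
roots = np.roots([1, -1, -1])          # two distinct characteristic roots

# direct iteration from a(0) = 0, a(1) = 1
a = [0.0, 1.0]
for _ in range(18):
    a.append(a[-1] + a[-2])

# determine c1, c2 in a(n) = c1 q1^n + c2 q2^n from the initial values:
# rows of the system are [q1^0, q2^0] and [q1^1, q2^1]
V = np.vander(roots, 2, increasing=True).T
c = np.linalg.solve(V, [a[0], a[1]])

closed = [sum(ci * q**n for ci, q in zip(c, roots)) for n in range(20)]
assert np.allclose(a, closed)          # closed form matches iteration
```

This is exactly the sense in which (2.13) is the general solution: the initial conditions fix the constants uniquely because the Vandermonde system built from distinct roots is nonsingular.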

3. The Main Results and Their Proofs

Theorem 3.1. If the assumption (H) holds and , the network (2.1) is bounded.

Proof. Let . It is noted that the restriction is nonnegative and nondecreasing and the restriction if . Consequently, we have Moreover, it is easy to prove that
From (2.1), we get that for . Note that , , , , , , , , and . Hence, based on monotonicity of , we have Similarly, we have Hence, Thus, Now let be a sequence with From the definition of , we have the following two equations: for . It follows from (3.9) that That is for Then the characteristic equation of the recurrence relation of (3.11) is In the following, we will prove that (i) the roots of (3.12) are distinct and (ii) for each root of (3.12).
For (i), let . Then . It is clear that and are coprime since and . Hence has no multiple divisor, which means that (3.12) has distinct roots.
For (ii), assume to the contrary that there exists some such that . Then we have from (3.12) that Multiplying on both sides of inequality (3.13), we can get Thus, which is contrary to the assumption (H). Therefore, for any root of (3.12).
Let be the distinct roots of (3.12); then (). From Lemma 2.5, we get that for , where are constants which are uniquely determined by initial condition: .
From (3.8), we have It follows that for . From (), we know that , so Thus, the series is bounded; that is to say, there exists a positive constant such that
By the definition of and inequality (3.7), we know that , for . It follows from the definition of that By the properties (3.1), (3.2) of function and the definition of , we know
On the other hand, we can get from (2.1) that for . It is noted that and , since , , , , and . Thus, we have where the last inequality is due to . Similarly, we can show that From (3.21), (3.23), and (3.24), we know that the real part and imaginary part of are both bounded, so each trajectory of network (2.1) is bounded.

Theorem 3.2. If there exist three symmetric positive definite matrices , , and , two positive diagonal matrices and such that the matrix is a negative definite matrix, then network (2.1) is globally exponentially stable.
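The condition of Theorem 3.2 reduces, once candidate matrices are fixed, to checking that an assembled matrix is negative definite; in practice this is done with an LMI solver or, for a given candidate, an eigenvalue test. The sketch below shows such a test; the matrix `Omega` is a hypothetical stand-in, not the (elided) matrix of the theorem.

```python
import numpy as np

def is_negative_definite(M, tol=1e-10):
    """Check negative definiteness of a (Hermitian) matrix via eigenvalues."""
    M = np.asarray(M)
    # symmetrize to guard against round-off asymmetry
    H = (M + M.conj().T) / 2
    return bool(np.all(np.linalg.eigvalsh(H) < -tol))

# hypothetical candidate: diagonally dominant with negative diagonal,
# hence negative definite by Gershgorin's theorem
Omega = np.array([[-2.0, 0.3, 0.1],
                  [0.3, -1.5, 0.2],
                  [0.1, 0.2, -1.8]])
print(is_negative_definite(Omega))
```

Using `eigvalsh` (rather than `eigvals`) exploits the Hermitian structure and returns guaranteed-real eigenvalues, which makes the sign test well defined.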

Proof. Let . For , define then is a Banach space with the topology of uniform convergence. For any , let and be the solutions of model (2.1) starting from and , respectively.
It follows from model (2.1) that Let and . Then (3.27) can be written as Now we consider the following Lyapunov functional candidate for system (3.28) as where Then Therefore, where
It is easy to prove that We define From (3.32) and (3.34), we have By the definition of , we can get the following two inequalities: It is obvious that (3.37) is equivalent to and (3.38) is equivalent to where is the unit column vector having in th row and zeros elsewhere.
Let and , where for each . It follows from (3.36), (3.39) and (3.40) that Since is a negative definite matrix, we have
By a method similar to that in [16], we can prove that network (2.1) is globally exponentially stable.

Corollary 3.3. If there exist a positive diagonal matrix and three positive definite matrices , , and such that is negative definite, then network (2.4) is globally exponentially stable.

Proof. Similar to the proof of Theorem 3.2, let and . Then
Now we consider the following Lyapunov functional candidate for system (3.44) as where Then Let Then As in the proof of Theorem 3.2, there exists a matrix with such that So By a method similar to that in [16], we can prove that network (2.4) is globally exponentially stable.

Remark 3.4. It is known that the criteria obtained for checking the stability of discrete-time delayed neural networks depend, to varying degrees, on the mathematical techniques used and on the constructed Lyapunov or Lyapunov-Krasovskii functionals. Using elegant mathematical techniques and constructing proper Lyapunov or Lyapunov-Krasovskii functionals can reduce conservatism. Establishing less conservative results is left for future work.

Remark 3.5. Recently, the delay-fractioning approach has been widely used to reduce conservatism; it has been shown to be less conservative than many previous methods because it retains some useful terms [25]. In [25], the delay-fractioning approach was used to investigate the global synchronization of delayed complex networks with stochastic disturbances, demonstrating its potential for reducing conservatism. Using the delay-partitioning approach, we can also investigate the stability of discrete-time delayed neural networks; the corresponding results will appear in the near future.

4. Examples

Here, we present two examples to show the validity of our results.

Example 4.1. Consider a two-neuron neural network (2.1), where
Taking , we can calculate . From Theorem 3.1, we know that the considered network (2.1) is bounded.
Furthermore, when , the matrix in Theorem 3.2 is a negative definite matrix. From Theorem 3.2, we know that the considered network (2.1) is globally exponentially stable. In fact, we can verify that and are unique equilibrium points of and of the considered network (2.1), respectively. The global exponential stability of equilibrium points is further verified by the simulation given in Figures 1, 2, 3, and 4.

Example 4.2. Consider a two-neuron neural network (2.4), where Obviously, we cannot find a diagonal positive definite matrix such that is a Hermitian matrix, so the theorem in [23] cannot be used to judge the stability of the considered network (2.4).
When , the matrix in Corollary 3.3 is a negative definite matrix. From Corollary 3.3, we know that the considered network (2.4) is globally exponentially stable.

5. Conclusion

In this paper, the discrete-time delayed neural network with complex-valued linear threshold neurons has been considered. Several new delay-dependent criteria for checking boundedness and global exponential stability have been established by constructing appropriate Lyapunov-Krasovskii functionals and employing the linear matrix inequality technique and analysis methods. The proposed results are less conservative than some recently known ones in the literature, as demonstrated via two examples.

We would like to point out that it is possible to generalize our main results to more complex systems, such as neural networks with time-varying delays [1, 8], neural networks with parameter uncertainties [19], stochastic perturbations [17], Markovian jumping parameters [18], and some nonlinear systems [26–32]. The corresponding results will appear in the near future.

Acknowledgment

This work was supported by the National Natural Science Foundation of China under Grants no. 60974132 and 10772152.

References

  1. S. Arik, “An analysis of exponential stability of delayed neural networks with time varying delays,” Neural Networks, vol. 17, no. 7, pp. 1027–1031, 2004.
  2. T. Chen, W. Lu, and G. Chen, “Dynamical behaviors of a large class of general delayed neural networks,” Neural Computation, vol. 17, no. 4, pp. 949–968, 2005.
  3. W. Lu and T. Chen, “Synchronization of coupled connected neural networks with delays,” IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 51, no. 12, pp. 2491–2503, 2004.
  4. Z. Wang, Y. Liu, and X. Liu, “State estimation for jumping recurrent neural networks with discrete and distributed delays,” Neural Networks, vol. 22, no. 1, pp. 41–48, 2009.
  5. S. Xu, W. X. Zheng, and Y. Zou, “Passivity analysis of neural networks with time-varying delays,” IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 56, no. 4, pp. 325–329, 2009.
  6. H. Jiang and Z. Teng, “Boundedness and stability for nonautonomous bidirectional associative neural networks with delay,” IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 51, no. 4, pp. 174–180, 2004.
  7. J. Liang and J. Cao, “Boundedness and stability for recurrent neural networks with variable coefficients and time-varying delays,” Physics Letters A, vol. 318, no. 1-2, pp. 53–64, 2003.
  8. J. Cao and J. Liang, “Boundedness and stability for Cohen-Grossberg neural network with time-varying delays,” Journal of Mathematical Analysis and Applications, vol. 296, no. 2, pp. 665–685, 2004.
  9. H. Zhang, W. Wu, F. Liu, and M. Yao, “Boundedness and convergence of online gradient method with penalty for feedforward neural networks,” IEEE Transactions on Neural Networks, vol. 20, no. 6, pp. 1050–1054, 2009.
  10. S. Mohamad and K. Gopalsamy, “Exponential stability of continuous-time and discrete-time cellular neural networks with delays,” Applied Mathematics and Computation, vol. 135, no. 1, pp. 17–38, 2003.
  11. S. Hu and J. Wang, “Global robust stability of a class of discrete-time interval neural networks,” IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 53, no. 1, pp. 129–138, 2006.
  12. H. Gao and T. Chen, “New results on stability of discrete-time systems with time-varying state delay,” IEEE Transactions on Automatic Control, vol. 52, no. 2, pp. 328–334, 2007.
  13. M. Liu, “Global asymptotic stability analysis of discrete-time Cohen-Grossberg neural networks based on interval systems,” Nonlinear Analysis: Theory, Methods & Applications, vol. 69, no. 8, pp. 2403–2411, 2008.
  14. J. Yu, “New results on passivity analysis of delayed discrete-time stochastic neural networks,” Discrete Dynamics in Nature and Society, vol. 2009, Article ID 139671, 17 pages, 2009.
  15. Q. Song and J. Cao, “Global dissipativity on uncertain discrete-time neural networks with time-varying delays,” Discrete Dynamics in Nature and Society, vol. 2010, Article ID 810408, 19 pages, 2010.
  16. Q. Song and Z. Wang, “A delay-dependent LMI approach to dynamics analysis of discrete-time recurrent neural networks with time-varying delays,” Physics Letters A, vol. 368, no. 1-2, pp. 134–145, 2007.
  17. Z. Wang, D. W. C. Ho, Y. Liu, and X. Liu, “Robust H∞ control for a class of nonlinear discrete time-delay stochastic systems with missing measurements,” Automatica, vol. 45, no. 3, pp. 684–691, 2009.
  18. Y. Liu, Z. Wang, J. Liang, and X. Liu, “Stability and synchronization of discrete-time Markovian jumping neural networks with mixed mode-dependent time delays,” IEEE Transactions on Neural Networks, vol. 20, no. 7, pp. 1102–1116, 2009.
  19. J. Liang, Z. Wang, and X. Liu, “State estimation for coupled uncertain stochastic networks with missing measurements and time-varying delays: the discrete-time case,” IEEE Transactions on Neural Networks, vol. 20, no. 5, pp. 781–793, 2009.
  20. A. Hirose, Complex-Valued Neural Networks: Theories and Applications, vol. 5 of Series on Innovative Intelligence, World Scientific, River Edge, NJ, USA, 2003.
  21. I. N. Aizenberg, N. N. Aizenberg, and J. P. L. Vandewalle, Multi-Valued and Universal Binary Neurons: Theory, Learning, Applications, Kluwer Academic Publishers, Dordrecht, The Netherlands, 2000.
  22. W. Zhou and J. M. Zurada, “A class of discrete time recurrent neural networks with multivalued neurons,” Neurocomputing, vol. 72, no. 16–18, pp. 3782–3788, 2009.
  23. W. Zhou and J. M. Zurada, “Discrete-time recurrent neural networks with complex-valued linear threshold neurons,” IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 56, no. 8, pp. 669–673, 2009.
  24. R. A. Brualdi, Introductory Combinatorics, Pearson Education, Upper Saddle River, NJ, USA, 2004.
  25. Y. Wang, Z. Wang, and J. Liang, “A delay fractioning approach to global synchronization of delayed complex networks with stochastic disturbances,” Physics Letters A, vol. 372, no. 39, pp. 6066–6073, 2008.
  26. J. Diblík, D. Ya. Khusainov, I. V. Grytsay, and Z. Šmarda, “Stability of nonlinear autonomous quadratic discrete systems in the critical case,” Discrete Dynamics in Nature and Society, vol. 2010, Article ID 539087, 23 pages, 2010.
  27. J. Baštinec, J. Diblík, and Z. Šmarda, “Existence of positive solutions of discrete linear equations with a single delay,” Journal of Difference Equations and Applications, vol. 16, no. 5, pp. 1165–1177, 2010.
  28. J. Baštinec, J. Diblík, and Z. Šmarda, “Oscillation of solutions of a linear second-order discrete delayed equation,” Advances in Difference Equations, vol. 2010, Article ID 693867, 12 pages, 2010.
  29. J. Diblík, D. Ya. Khusainov, and Z. Šmarda, “Construction of the general solution of planar linear discrete systems with constant coefficients and weak delay,” Advances in Difference Equations, vol. 2009, Article ID 784935, 18 pages, 2009.
  30. J. Diblík, D. Ya. Khusainov, and I. V. Grytsay, “Stability investigation of nonlinear quadratic discrete dynamics systems in the critical case,” Journal of Physics: Conference Series, vol. 96, no. 1, Article ID 012042, 2008.
  31. J. Baštinec and J. Diblík, “Remark on positive solutions of discrete equation Δu(k+n)=p(k)u(k),” Nonlinear Analysis: Theory, Methods & Applications, vol. 63, no. 5–7, pp. e2145–e2151, 2005.
  32. J. Diblík, E. Schmeidel, and M. Růžičková, “Asymptotically periodic solutions of Volterra system of difference equations,” Computers and Mathematics with Applications, vol. 59, no. 8, pp. 2854–2867, 2010.

Copyright © 2010 Chengjun Duan and Qiankun Song. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

