New Results on Stability of Delayed Cohen–Grossberg Neural Networks of Neutral Type

Ozlem Faydasicok

Complexity, vol. 2020, Article ID 1973548, 10 pages, 2020. https://doi.org/10.1155/2020/1973548

Research Article | Open Access
Special Issue: Finite-time Control of Complex Systems and Their Applications

Academic Editor: Xiaodi Li
Received: 15 May 2020; Accepted: 27 May 2020; Published: 16 Jun 2020

Abstract

This work investigates the stability of neutral-type Cohen–Grossberg neural network models possessing discrete time delays in the states and discrete neutral delays in the time derivatives of the neuron states. By constructing a new generalized Lyapunov functional candidate, some novel sufficient conditions are derived for the global asymptotic stability of the considered neural networks of neutral type. The derivation exploits some basic properties of matrices to establish a set of algebraic relationships between the network parameters of this neural system. A key feature of the obtained stability criteria is that they are independent of the time and neutral delays, so the derived results can be easily tested. Moreover, an instructive numerical example is studied to verify the applicability of the presented global stability conditions.

1. Introduction

In the past few decades, a variety of neural network models, including Hopfield neural networks (HNNs), cellular neural networks (CNNs), Cohen–Grossberg neural networks (CGNNs), and bidirectional associative memory neural networks (BAMNNs), have been utilized for solving typical engineering problems associated with pattern recognition, signal processing, associative memories, and optimization [1–9]. In these applications, it is usually desired that the dynamics of the employed neural network exhibit certain behaviors depending on the characteristics of the problem to be solved. For instance, if one needs to solve an optimization problem, then the designed neural network may be required to possess a unique and globally asymptotically stable equilibrium point for every fixed input value. In this respect, analyzing the stability behavior of dynamical neural systems becomes an important requirement. Moreover, neural networks have also been implemented electronically for real-time applications of various classes of engineering problems. It is known that, in the process of electronically implementing a neural network, delay parameters arise because of the finite switching speed of operational amplifiers and the signal transmission times between communicating neurons. The presence of these time delay parameters may lead to various complex nonlinear dynamics, including instability, periodic solutions, and chaos. Therefore, one needs to consider the possible effects of these time delays on the stability properties of neural systems. In the recent literature, the stability issues for delayed neural networks have been addressed by a variety of researchers, and various sets of novel sufficient results on the global asymptotic stability of the equilibrium point for different neural network models have been published [10–25].
It should be mentioned that a stability analysis of neural networks whose mathematical models involve only time delays may not capture the complete dynamical characteristics of such systems. The reason is that, in many cases, besides the delayed states, the time derivatives of the states may also involve delays of different types. In this sense, we need to consider neural networks having delays both in the states and in the time derivatives of the states. A neural network modelled in this way is called a neutral-type neural network. A widely studied network of this class is the Cohen–Grossberg neural network possessing discrete time and neutral delay parameters. This neural network model is defined by the nonlinear dynamical equations

dx_i(t)/dt = d_i(x_i(t))[−c_i(x_i(t)) + Σ_{j=1}^{n} a_{ij} f_j(x_j(t)) + Σ_{j=1}^{n} b_{ij} f_j(x_j(t − τ)) + Σ_{j=1}^{n} e_{ij} (dx_j(t − σ)/dt) + u_i], i = 1, 2, …, n, (1)

where x_i(t) represents the state of the ith neuron, the c_i are well-behaved functions, and the d_i are the amplification functions. The constant elements a_{ij} and b_{ij} represent the interconnection weights among the neurons. τ represents the time delay and σ the neutral delay. The elements e_{ij} denote the weights of the delayed time derivatives of the states. The f_j denote the neuronal activation functions, and u_i is the constant input of the ith neuron. For system (1), we state some general assumptions: let τ ≥ 0, σ ≥ 0, and δ = max{τ, σ}. Under these assumptions, neural network model (1) admits the initial values x_i(t) = φ_i(t) for t ∈ [−δ, 0], where the φ_i(t) are continuous real-valued functions defined on the interval [−δ, 0].
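A model of this type can be explored numerically. The following minimal sketch integrates a small neutral-type Cohen–Grossberg network with a fixed-step Euler scheme; all parameter values, and the particular choices of the amplification, decay, and activation functions, are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy 2-neuron neutral-type Cohen-Grossberg network (all values assumed).
n = 2
A = np.array([[0.10, -0.20], [0.05, 0.10]])   # interconnection weights (assumed)
B = np.array([[0.05, 0.10], [-0.10, 0.05]])   # delayed weights (assumed)
E = np.array([[0.02, 0.00], [0.00, 0.02]])    # neutral-term weights (assumed)
u = np.array([0.5, -0.3])                     # constant inputs
tau = sigma = 0.5                             # time delay and neutral delay
h = 0.01                                      # Euler step size
d = lambda x: 1.0 + 0.5 / (1.0 + x ** 2)      # amplification functions (assumed)
c = lambda x: 2.0 * x                         # well-behaved decay functions (assumed)
f = np.tanh                                   # activation functions (assumed)

steps, lag = 4000, int(tau / h)
x = np.zeros((steps + 1, n))                  # states, zero initial history
dx = np.zeros((steps + 1, n))                 # stored derivatives for the neutral term
for k in range(steps):
    x_tau = x[k - lag] if k >= lag else x[0]      # x(t - tau)
    dx_sig = dx[k - lag] if k >= lag else dx[0]   # x'(t - sigma)
    dx[k] = d(x[k]) * (-c(x[k]) + A @ f(x[k]) + B @ f(x_tau) + E @ dx_sig + u)
    x[k + 1] = x[k] + h * dx[k]

print(np.round(x[-1], 3))  # the trajectory settles near an equilibrium point
```

With these small weights and a strong decay term, the simulated trajectory converges, which is the qualitative behavior the stability conditions of this paper are designed to guarantee.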

We can make some remarks on system (1) to highlight its generality. Simple changes in the mathematical model of system (1) yield other well-known neural network models. If we let e_{ij} = 0, then system (1) becomes a delayed Cohen–Grossberg network. If we let d_i(x_i(t)) = 1, c_i(x_i(t)) = c_i x_i(t), and e_{ij} = 0, then system (1) defines the class of Hopfield neural networks. If we let d_i(x_i(t)) = 1, c_i(x_i(t)) = c_i x_i(t), e_{ij} = 0, and each f_j be a specific activation function with binary output values, then neural network (1) turns into the cellular neural network. Thus, the stability analysis of (1) also addresses the stability of many different neural network models.

In the stability analysis of the neutral-type network system whose dynamical behaviour is governed by (1), the primary question to be addressed is the determination of the mathematical relationships satisfied by the functions c_i, d_i, and f_i. The well-known basic assumptions on these nonlinear functions are given below.

(A1): the amplification function d_i has the following property:

μ_i ≤ d_i(x) ≤ ρ_i, for all x ∈ R, i = 1, 2, …, n,

where μ_i and ρ_i are positive-valued real constants.

(A2): the activation functions f_i have the following property:

|f_i(x) − f_i(y)| ≤ l_i|x − y| and |f_i(x)| ≤ M_i, for all x, y ∈ R, i = 1, 2, …, n,

where l_i and M_i are positive-valued real constants.

(A3): the function c_i has the following property:

(c_i(x) − c_i(y))/(x − y) ≥ γ_i, for all x, y ∈ R with x ≠ y, i = 1, 2, …, n,

where γ_i is a positive-valued real constant.
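For a concrete network, assumptions of this type can be spot-checked numerically. The sketch below (an illustrative assumption, not a computation from the paper) verifies that the common choice of a tanh activation satisfies a Lipschitz-and-boundedness property of the kind required above, with Lipschitz constant 1 and bound 1.

```python
import numpy as np

# Spot-check, on random samples, that tanh is 1-Lipschitz and bounded by 1,
# i.e. it satisfies the typical activation-function assumption with l_i = M_i = 1.
rng = np.random.default_rng(0)
x, y = rng.normal(size=10000), rng.normal(size=10000)
lipschitz_ok = np.all(np.abs(np.tanh(x) - np.tanh(y)) <= np.abs(x - y) + 1e-12)
bounded_ok = np.all(np.abs(np.tanh(x)) <= 1.0)
print(lipschitz_ok, bounded_ok)  # True True
```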

If neutral-type neural networks possess discrete delays, then the mathematical models of such neural systems can be stated in vector-matrix form, and their stability can be studied by exploiting the linear matrix inequality approach together with other appropriate mathematical methods. In [26–34], the stability of neutral-type neural networks has been studied, and by constructing some classes of suitable Lyapunov functionals together with some useful lemmas and new mathematical techniques, different novel stability results for the considered neutral-type neural networks have been presented in the form of various types of linear matrix inequalities. In [35–40], novel global stability conditions for neutral-type neural networks in the form of different linear matrix inequality formulations have been proposed by employing various proper Lyapunov functionals with triple or quadruple integral terms. In [41, 42], various stability problems for neutral-type neural networks have been analyzed, and by using semifree weighting matrix techniques and an augmented Lyapunov functional, some less conservative and less restrictive global stability conditions in terms of linear matrix inequalities have been proposed. In [42], the stability of neural networks of neutral type possessing discrete delays has been suitably analyzed, and by employing a proper Lyapunov functional combined with the descriptor model transformation, a novel stability criterion has been formulated in terms of linear matrix inequalities. In [16], the stability of neural systems has been addressed, and by proposing appropriate Lyapunov functionals utilizing auxiliary function-type integral inequalities and the reciprocally convex method, various sets of stability results via linear matrix inequalities have been obtained.
In [43], the Lagrange stability issue of neutral-type neural systems having mixed delays has been analyzed, and by using suitable Lyapunov functionals and applying some appropriate linear matrix inequality techniques, various sufficient criteria have been obtained to ensure the Lagrange stability of neural networks of neutral type. In [44], issues associated with the stability of neutral-type singular neural systems involving different delay parameters have been studied, and by setting up a suitable novel Lyapunov functional and some rarely used integral inequalities, a new global asymptotic stability condition via a linear matrix inequality has been derived. In [45], dynamical issues of neural networks of neutral type possessing various delay parameters have been analyzed, and different stability results have been derived by employing linear matrix inequalities combined with Razumikhin-like approaches.

We should point out that the results of [16, 26–45] employ various classes of linear matrix inequality techniques to obtain different sets of stability conditions for neutral-type neural networks. However, stability results derived via the linear matrix inequality method require testing the negative definiteness of high-dimensional matrices whose elements are formed from the system parameters of the neural networks. Owing to this computational burden, it becomes desirable to propose stability conditions for neutral-type neural networks that are not stated in linear matrix inequality form. In this context, the current paper focuses on the dynamical analysis of neural system (1) to derive some easily verifiable algebraic stability conditions.
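The computational contrast can be made concrete. The sketch below (with assumed matrix values) computes the simple matrix norms that typically enter algebraic stability conditions, and, for comparison, the eigenvalue computation needed to verify the negative definiteness required by an LMI-based condition.

```python
import numpy as np

# Algebraic conditions use cheap matrix norms of the interconnection matrices.
A = np.array([[0.2, -0.1], [0.3, 0.1]])       # assumed interconnection matrix
norm_1 = np.max(np.sum(np.abs(A), axis=0))    # ||A||_1, maximum column sum
norm_inf = np.max(np.sum(np.abs(A), axis=1))  # ||A||_inf, maximum row sum
norm_2 = np.linalg.norm(A, 2)                 # ||A||_2, largest singular value
print(norm_1, norm_inf, norm_2)

# An LMI-style test instead checks negative definiteness of a larger composite
# matrix: M < 0 iff all eigenvalues of its symmetric part are negative.
M = -np.eye(4) + 0.1 * np.ones((4, 4))        # assumed composite matrix
eigs = np.linalg.eigvalsh((M + M.T) / 2)
print(np.all(eigs < 0))  # True
```

For large networks, the norm computations scale far more gracefully than repeated eigenvalue tests on composite matrices, which is the practical motivation stated above.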

2. Stability Analysis

The basic contribution of this section is to derive conditions implying the stability of the neutral-type Cohen–Grossberg neural system whose model is given by (1). As a first step, to simplify the proofs of the stability conditions, we transform the equilibrium point x* = (x_1*, x_2*, …, x_n*)^T of Cohen–Grossberg neural system (1) to the origin. This is achieved by utilizing the simple substitution z_i(t) = x_i(t) − x_i*, which turns neutral-type neural network (1) into an equivalent neutral-type neural network governed by the set of differential equations

dz_i(t)/dt = α_i(z_i(t))[−β_i(z_i(t)) + Σ_{j=1}^{n} a_{ij} g_j(z_j(t)) + Σ_{j=1}^{n} b_{ij} g_j(z_j(t − τ)) + Σ_{j=1}^{n} e_{ij} (dz_j(t − σ)/dt)], i = 1, 2, …, n. (5)
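The equilibrium-shift idea can be illustrated numerically. In the sketch below, all values and the choice of a linear decay function are assumptions for illustration: an equilibrium of a toy system is found by fixed-point iteration, and we check that the transformed activation vanishes at the origin, so the origin is an equilibrium of the shifted system.

```python
import numpy as np

# Find an equilibrium x* of a toy system 2*x = (A + B) tanh(x) + u by
# fixed-point iteration (the map is a contraction for these small weights),
# then verify that g(z) = f(z + x*) - f(x*) vanishes at z = 0.
A = np.array([[0.2, -0.1], [0.1, 0.2]])   # assumed weights
B = np.array([[0.1, 0.0], [0.0, 0.1]])    # assumed delayed weights
u = np.array([0.4, -0.2])                 # assumed inputs
c_gain = 2.0                              # assumed decay c_i(x) = 2x

x_star = np.zeros(2)
for _ in range(200):                      # contraction, so iteration converges
    x_star = ((A + B) @ np.tanh(x_star) + u) / c_gain

g = lambda z: np.tanh(z + x_star) - np.tanh(x_star)   # transformed activation
rhs_at_origin = (A + B) @ g(np.zeros(2)) - c_gain * np.zeros(2)
print(np.allclose(rhs_at_origin, 0.0))    # True: origin is an equilibrium
```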

System (5) can be represented in vector-matrix form as

dz(t)/dt = α(z(t))[−β(z(t)) + A g(z(t)) + B g(z(t − τ)) + E (dz(t − σ)/dt)],

where the system matrices are A = (a_{ij})_{n×n}, B = (b_{ij})_{n×n}, and E = (e_{ij})_{n×n}, and z(t) = (z_1(t), …, z_n(t))^T, g(z(t)) = (g_1(z_1(t)), …, g_n(z_n(t)))^T, α(z(t)) = diag(α_1(z_1(t)), …, α_n(z_n(t))).

After transforming neutral system (1) into neutral system (5), we have new transformed functions in system (5). The functions α_i are of the form

α_i(z_i(t)) = d_i(z_i(t) + x_i*), i = 1, 2, …, n.

The functions β_i are of the form

β_i(z_i(t)) = c_i(z_i(t) + x_i*) − c_i(x_i*), i = 1, 2, …, n.

The functions g_j are of the form

g_j(z_j(t)) = f_j(z_j(t) + x_j*) − f_j(x_j*), j = 1, 2, …, n.

According to the properties stated by (A1), (A2), and (A3), these new transformed functions possess the following properties:

μ_i ≤ α_i(z) ≤ ρ_i, z β_i(z) ≥ γ_i z², |g_i(z)| ≤ l_i|z|, and g_i(0) = 0, for all z ∈ R, i = 1, 2, …, n.

Fact 1. Consider a real matrix Q = (q_{ij})_{n×n} and a real vector x = (x_1, …, x_n)^T. We can state the following inequality:

x^T Q x ≤ ||Q||_2 x^T x.

Fact 2. Consider a real matrix Q = (q_{ij})_{n×n} and a real vector x = (x_1, …, x_n)^T. We can state the following inequality:

x^T Q x ≤ (1/2)(||Q||_1 + ||Q||_∞) x^T x.

A combination of Facts 1 and 2 can be expressed by the following fact.

Fact 3. Consider a real matrix Q = (q_{ij})_{n×n} and a real vector x = (x_1, …, x_n)^T. We can state the following inequality:

x^T Q x ≤ (ζ_1 ||Q||_2 + (ζ_2/2)(||Q||_1 + ||Q||_∞)) x^T x,

where ζ_1 and ζ_2 are the binary constants such that ζ_1 + ζ_2 = 1 and ζ_1 ζ_2 = 0.

Fact 4. Let Q be a real matrix, P = diag(p_1, …, p_n) be a positive diagonal matrix, and x be a real vector. The following inequality can be stated:

x^T P Q x ≤ p_M ||Q||_2 x^T x, where p_M = max_i p_i.

Fact 5. Consider any two real vectors x and y. The following inequality can be stated:

2 x^T y ≤ η x^T x + (1/η) y^T y,

where η can be chosen as any arbitrary positive real number.
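Inequalities of this kind are easy to sanity-check on random instances. The sketch below verifies the standard Young-type inequality of Fact 5 and a spectral-norm bound of the kind used in the matrix facts; the particular forms tested are standard inequalities assumed here for illustration.

```python
import numpy as np

# Random-instance check of two standard auxiliary inequalities:
#   Young-type:   2 x^T y  <=  eta x^T x + (1/eta) y^T y   for any eta > 0,
#   matrix bound: x^T Q x  <=  ||Q||_2 x^T x.
rng = np.random.default_rng(1)
for _ in range(100):
    Q = rng.normal(size=(5, 5))
    x, y = rng.normal(size=5), rng.normal(size=5)
    eta = abs(rng.normal()) + 0.1
    assert 2 * (x @ y) <= eta * (x @ x) + (1 / eta) * (y @ y) + 1e-9
    assert x @ Q @ x <= np.linalg.norm(Q, 2) * (x @ x) + 1e-9
print("all inequality checks passed")
```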
The key contribution of this paper can now be presented by the theorem stated below.

Theorem 1. Suppose the conditions given by , , and hold. Let and be positive real-valued numbers. Then, the origin of Cohen–Grossberg neural system of neutral type expressed by (5) is globally asymptotically stable if the system parameters of (5) satisfy the conditions:where , , , , , , , and are the binary constants such that , , , , , and .

Proof. Construct a suitable Lyapunov functional candidate of the form given by (18), where the positive real-valued constant appearing in it will be specified in the course of the proof. If we take the time derivative of this Lyapunov functional along the trajectories of the Cohen–Grossberg neural network model defined by (5), we derive equation (19), which may be rearranged as (20). Note the inequalities (21)–(23). Combining (22) with (23) leads to (24). By virtue of Fact 5, the inequalities (25) and (26) can be written. Using (24)–(26) in (23) results in (27). We first note the equality (28). By Fact 3, we express the inequalities (29). Using the property of Fact 4, we express the inequality (30). Using (28)–(30) in (27) yields (31). The conditions of Theorem 1 imply the inequalities (32)–(36). Using (32)–(36) in (37) leads to the inequality (38). Then, by (38), we can write (39). Using (37) and (39) in (20) results in (40), where the constants are as defined in the statement of Theorem 1.
From (40), one can obtain inequality (41). In (41), an appropriate choice of the constant makes it possible for the time derivative of the Lyapunov functional to be negative for every nonzero transformed state.
In (40), a suitable choice directly yields the result (42), which implies that the time derivative of the Lyapunov functional takes negative values for every nonzero transformed delayed state.
Now let the transformed state and the delayed transformed state both vanish. Then, we immediately obtain (43) from (40), which directly yields that the time derivative of the studied Lyapunov functional is negative for every nonzero value of the remaining delayed derivative term.
Finally, let the transformed state, the delayed transformed state, and the delayed derivative of the state all be zero. In this case, it follows from (19) that the time derivative of the Lyapunov functional vanishes. Hence, the time derivative of the Lyapunov functional is zero at the equilibrium point, which is the origin of system (5), and negative everywhere except at the equilibrium point. This Lyapunov functional analysis thus ensures that the origin of system (5) is asymptotically stable. In addition, the Lyapunov functional given by (18) is radially unbounded, meaning that it tends to infinity as the norm of the state tends to infinity. The radial unboundedness of this Lyapunov functional guarantees that the origin of neutral-type Cohen–Grossberg neural network (5) is globally asymptotically stable. Q.E.D.

3. An Instructive Example

This section gives an instructive example to illustrate the applicability of the conditions of Theorem 1.

Example: consider a case of neutral-type neural system (1) with four neurons, whose system matrices are specified in terms of some positive constants. For this example, we also make particular choices for the remaining network parameters. For the system matrices, one may calculate the matrix norms required by Theorem 1.

From these norms, the quantities entering the stability conditions can then be obtained.

According to Theorem 1, this example yields a set of explicit conditions on the positive constants of the system matrices.

Choosing suitable values for these constants satisfies the conditions of Theorem 1. Clearly, these conditions establish the global asymptotic stability of system (5).

4. Conclusions

This research work has investigated the stability of neutral-type Cohen–Grossberg neural network models possessing discrete time delays in the states and discrete neutral delays in the time derivatives of the neuron states. By constructing a novel generalized Lyapunov functional candidate, some new sufficient conditions have been proposed for the global asymptotic stability of the considered delayed neural networks of neutral type. The paper has exploited some basic properties of matrices in the derivation of the results, establishing a set of algebraic relationships between the network parameters of the neural system. The obtained stability criteria are independent of the time and neutral delays, so the proposed results can be easily verified. An instructive numerical example has also been presented to demonstrate the applicability of the presented global stability conditions.

Data Availability

No data were used to support this study.

Conflicts of Interest

The author declares that there are no conflicts of interest.

References

  1. M. A. Cohen and S. Grossberg, “Absolute stability of global pattern formation and parallel memory storage by competitive neural networks,” IEEE Transactions on Systems, Man, and Cybernetics, vol. SMC-13, no. 5, pp. 815–826, 1983.
  2. L. Wang and X. Zou, “Exponential stability of Cohen-Grossberg neural networks,” Neural Networks, vol. 15, no. 3, pp. 415–422, 2002.
  3. S. Mohamad and K. Gopalsamy, “Exponential stability of continuous-time and discrete-time cellular neural networks with delays,” Applied Mathematics and Computation, vol. 135, no. 1, pp. 17–38, 2003.
  4. J. J. Hopfield, “Neural networks and physical systems with emergent collective computational abilities,” Proceedings of the National Academy of Sciences, vol. 79, no. 8, pp. 2554–2558, 1982.
  5. L. O. Chua and L. Yang, “Cellular neural networks: theory,” IEEE Transactions on Circuits and Systems, vol. 35, no. 10, pp. 1257–1272, 1988.
  6. A. Guez, V. Protopopsecu, and J. Barhen, “On the stability, and design of nonlinear continuous neural networks,” IEEE Transactions on Systems, Man and Cybernetics, vol. 18, no. 1, pp. 80–87, 1988.
  7. S. C. Tong, Y. M. Li, and H. G. Zhang, “Adaptive neural network decentralized backstepping output-feedback control for nonlinear large-scale systems with time delays,” IEEE Transactions on Neural Networks, vol. 22, no. 7, pp. 1073–1086, 2011.
  8. M. Galicki, H. Witte, J. Dörschel, M. Eiselt, and G. Griessbach, “Common optimization of adaptive preprocessing units and a neural network during the learning period. Application in EEG pattern recognition,” Neural Networks, vol. 10, no. 6, pp. 1153–1163, 1997.
  9. B. Kosko, “Bidirectional associative memories,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 18, no. 1, pp. 49–60, 1988.
  10. H. Zhu, R. Rakkiyappan, and X. Li, “Delayed state-feedback control for stabilization of neural networks with leakage delay,” Neural Networks, vol. 105, pp. 249–255, 2018.
  11. Q. Song, Q. Yu, Z. Zhao, Y. Liu, and F. E. Alsaadi, “Boundedness and global robust stability analysis of delayed complex-valued neural networks with interval parameter uncertainties,” Neural Networks, vol. 103, pp. 55–62, 2018.
  12. J. Wang, H. Jiang, T. Ma, and C. Hu, “Delay-dependent dynamical analysis of complex-valued memristive neural networks: continuous-time and discrete-time cases,” Neural Networks, vol. 101, pp. 33–46, 2018.
  13. W. Xie and Q. Zhu, “Mean square exponential stability of stochastic fuzzy delayed Cohen-Grossberg neural networks with expectations in the coefficients,” Neurocomputing, vol. 166, pp. 133–139, 2015.
  14. Q. Zhu and J. Cao, “Robust exponential stability of Markovian jump impulsive stochastic Cohen-Grossberg neural networks with mixed time delays,” IEEE Transactions on Neural Networks, vol. 21, no. 8, pp. 1314–1325, 2010.
  15. H. Liu, Z. Wang, B. Shen, T. Huang, and F. E. Alsaadi, “Stability analysis for discrete-time stochastic memristive neural networks with both leakage and probabilistic delays,” Neural Networks, vol. 102, pp. 1–9, 2018.
  16. R. Manivannan, R. Samidurai, J. Cao, A. Alsaedi, and F. E. Alsaadi, “Stability analysis of interval time-varying delayed neural networks including neutral time-delay and leakage delay,” Chaos, Solitons & Fractals, vol. 114, pp. 433–445, 2018.
  17. J. Zhou, Y. Liu, J. Xia, Z. Wang, and S. Arik, “Resilient fault-tolerant anti-synchronization for stochastic delayed reaction-diffusion neural networks with semi-Markov jump parameters,” Neural Networks, vol. 125, pp. 194–204, 2020.
  18. Z. Xu, X. Li, and P. Duan, “Synchronization of complex networks with time-varying delay of unknown bound via delayed impulsive control,” Neural Networks, vol. 125, pp. 224–232, 2020.
  19. X. Song, J. Man, S. Song, and Z. Wang, “Finite-time nonfragile time-varying proportional retarded synchronization for Markovian Inertial Memristive NNs with reaction-diffusion items,” Neural Networks, vol. 123, pp. 317–330, 2020.
  20. X. You, Q. Song, and Z. Zhao, “Existence and finite-time stability of discrete fractional-order complex-valued neural networks with time delays,” Neural Networks, vol. 123, pp. 248–260, 2020.
  21. J. Xiao, S. Wen, X. Yang, and S. Zhong, “New approach to global Mittag-Leffler synchronization problem of fractional-order quaternion-valued BAM neural networks based on a new inequality,” Neural Networks, vol. 122, pp. 320–337, 2020.
  22. X. Huang, J. Jia, Y. Fan, Z. Wang, and J. Xia, “Interval matrix method based synchronization criteria for fractional-order memristive neural networks with multiple time-varying delays,” Journal of the Franklin Institute, vol. 357, no. 3, pp. 1707–1733, 2020.
  23. M. S. Ali, G. Narayanan, S. Sevgen, V. Shekher, and S. Arik, “Global stability analysis of fractional-order fuzzy BAM neural networks with time delay and impulsive effects,” Communications in Nonlinear Science and Numerical Simulation, vol. 78, Article ID 104853, 2019.
  24. Y. Cao, S. Wang, Z. Guo, T. Huang, and S. Wen, “Synchronization of memristive neural networks with leakage delay and parameters mismatch via event-triggered control,” Neural Networks, vol. 119, pp. 178–189, 2019.
  25. T. Wei, P. Lin, Y. Wang, and L. Wang, “Stability of stochastic impulsive reaction-diffusion neural networks with S-type distributed delays and its application to image encryption,” Neural Networks, vol. 116, pp. 35–45, 2019.
  26. M. S. Mahmoud and A. Ismail, “Improved results on robust exponential stability criteria for neutral-type delayed neural networks,” Applied Mathematics and Computation, vol. 217, no. 7, pp. 3011–3019, 2010.
  27. J. H. Park, O. M. Kwon, and S. M. Lee, “LMI optimization approach on stability for delayed neural networks of neutral-type,” Applied Mathematics and Computation, vol. 196, no. 1, pp. 236–244, 2008.
  28. R. Rakkiyappan and P. Balasubramaniam, “LMI conditions for global asymptotic stability results for neutral-type neural networks with distributed time delays,” Applied Mathematics and Computation, vol. 204, no. 1, pp. 317–324, 2008.
  29. S. M. Lee, O. M. Kwon, and J. H. Park, “A novel delay-dependent criterion for delayed neural networks of neutral type,” Physics Letters A, vol. 374, no. 17-18, pp. 1843–1848, 2010.
  30. S. Xu, J. Lam, D. W. C. Ho, and Y. Zou, “Delay-dependent exponential stability for a class of neural networks with time delays,” Journal of Computational and Applied Mathematics, vol. 183, no. 1, pp. 16–28, 2005.
  31. R. Rakkiyappan and P. Balasubramaniam, “New global exponential stability results for neutral type neural networks with distributed time delays,” Neurocomputing, vol. 71, no. 4–6, pp. 1039–1045, 2008.
  32. W. Weera and P. Niamsup, “Novel delay-dependent exponential stability criteria for neutral-type neural networks with non-differentiable time-varying discrete and neutral delays,” Neurocomputing, vol. 173, pp. 886–898, 2016.
  33. M. Zheng, L. Li, H. Peng, J. Xiao, Y. Yang, and H. Zhao, “Finite-time stability analysis for neutral-type neural networks with hybrid time-varying delays without using Lyapunov method,” Neurocomputing, vol. 238, pp. 67–75, 2017.
  34. Y. Dong, L. Guo, and J. Hao, “Robust exponential stabilization for uncertain neutral neural networks with interval time-varying delays by periodically intermittent control,” Neural Computing and Applications, vol. 32, no. 7, pp. 2651–2664, 2020.
  35. K. Shi, H. Zhu, S. Zhong, Y. Zeng, and Y. Zhang, “New stability analysis for neutral type neural networks with discrete and distributed delays using a multiple integral approach,” Journal of the Franklin Institute, vol. 352, no. 1, pp. 155–176, 2015.
  36. K. Shi, S. Zhong, H. Zhu, X. Liu, and Y. Zeng, “New delay-dependent stability criteria for neutral-type neural networks with mixed random time-varying delays,” Neurocomputing, vol. 168, pp. 896–907, 2015.
  37. D. Liu and Y. Du, “New results of stability analysis for a class of neutral-type neural network with mixed time delays,” International Journal of Machine Learning and Cybernetics, vol. 6, no. 4, pp. 555–566, 2015.
  38. R. Samidurai, S. Rajavel, R. Sriraman, J. Cao, A. Alsaedi, and F. E. Alsaadi, “Novel results on stability analysis of neutral-type neural networks with additive time-varying delay components and leakage delay,” International Journal of Control, Automation and Systems, vol. 15, no. 4, pp. 1888–1900, 2017.
  39. K. Shi, H. Zhu, S. Zhong, Y. Zhang, and W. Wang, “Stability analysis of neutral type neural networks with mixed time-varying delays using triple-integral and delay-partitioning methods,” ISA Transactions, vol. 58, pp. 85–95, 2015.
  40. P. Balasubramaniam, G. Nagamani, and R. Rakkiyappan, “Global passivity analysis of interval neural networks with discrete and distributed delays of neutral type,” Neural Processing Letters, vol. 32, no. 2, pp. 109–130, 2010.
  41. H. Mai, X. Liao, and C. Li, “A semi-free weighting matrices approach for neutral-type delayed neural networks,” Journal of Computational and Applied Mathematics, vol. 225, no. 1, pp. 44–55, 2009.
  42. S. Lakshmanan, C. P. Lim, M. Prakash, S. Nahavandi, and P. Balasubramaniam, “Neutral-type of delayed inertial neural networks and their stability analysis using the LMI Approach,” Neurocomputing, vol. 230, pp. 243–250, 2017.
  43. Z. Tu and L. Wang, “Global Lagrange stability for neutral type neural networks with mixed time-varying delays,” International Journal of Machine Learning and Cybernetics, vol. 9, no. 4, pp. 599–609, 2018.
  44. Y. Ma, N. Ma, L. Chen, Y. Zheng, and Y. Han, “Exponential stability for the neutral-type singular neural network with time-varying delays,” International Journal of Machine Learning and Cybernetics, vol. 10, no. 5, pp. 853–858, 2019.
  45. H. Zhang, Z. Liu, and G. B. Huang, “Novel delay-dependent robust stability analysis for switched neutral-type neural networks with time-varying delays via SC technique,” IEEE Transactions on Systems, Man, and Cybernetics-Part B: Cybernetics, vol. 40, pp. 1480–1491, 2010.

Copyright © 2020 Ozlem Faydasicok. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

