Research Article | Open Access

# Stability Analysis of Discrete Hopfield Neural Networks with the Nonnegative Definite Monotone Increasing Weight Function Matrix

**Academic Editor:** Guang Zhang

#### Abstract

The original Hopfield neural network model is adapted so that the weights of the resulting network are time varying. In this paper, discrete Hopfield neural networks with a weight function matrix (DHNNWFM), in which the weights change with time, are considered, and the stability of DHNNWFM is analyzed. Combined with the Lyapunov function, we obtain some important results: if the weight function matrix (WFM) is a weakly (or strongly) nonnegative definite function matrix, the DHNNWFM will converge to a stable state in serial (or parallel) mode, and if the WFM consists of a strongly nonnegative definite function matrix and a column (or row) diagonally dominant function matrix, the DHNNWFM will converge to a stable state in parallel mode.

#### 1. Introduction

The discrete Hopfield neural network (DHNN) [1] is one of the most famous neural networks, with a wide range of applications. With the development of DHNN in theory and application, the model has become more and more complex. It is well known that nonautonomous phenomena often occur in many realistic systems. In particular, when we consider the long-term dynamical behavior of a system and the seasonality of the changing environment, the parameters of the system usually change with time [2, 3]. However, the original DHNN can hardly accommodate this change, because the matrices of DHNN, and of DHNN with time delay or time-varying delay, are constant matrices [1, 4–22], and DHNN whose parameters change with time has seldom been considered. In order to implement a desired flow vector field distribution by using a conventional matrix encoding scheme, a time-varying Hopfield model (TVHM) was proposed [23]. In many applications, the properties of periodic oscillatory solutions are of great interest. For example, the human brain can be in a periodic oscillatory or chaotic state; hence it is of prime importance to study periodic oscillation and chaos phenomena of neural networks. Accordingly, [2, 3] study the global exponential stability and existence of periodic solutions of high-order Hopfield-type neural networks. In [23, 24], we consider the case where the weight function matrix and the threshold function vector converge, respectively, to a constant matrix and a constant vector, the weight function matrix is a symmetric function matrix, and we analyze the stability of the model. In this paper, building on the stability theory of asymmetric Hopfield neural networks [4, 5], we analyze the stability of discrete Hopfield neural networks with a nonnegative definite monotone increasing weight function matrix.

This paper has the following organization. In Section 1, we provide the introduction. In Section 2, we introduce some basic concepts. In Section 3, we analyze the stability of discrete Hopfield neural networks with the nonnegative definite monotone increasing weight function matrix. The last section offers the conclusions of this paper.

#### 2. Basic Definitions

In this section, we introduce the basic concepts that will be used to obtain our results.

A DHNN with weight function matrix (DHNNWFM) varies with the discrete time factor $t$ by step length $\Delta t$ (in this paper, $\Delta t = 1$). Formally, let $N = (W(t), \theta(t))$ be a DHNN with $n$ neurons, which have the discrete time factor $t$ with step length $\Delta t$ and are denoted by $1, 2, \ldots, n$. In the pair $N = (W(t), \theta(t))$, $W(t) = (w_{ij}(t))_{n \times n}$ is an $n \times n$ function matrix whose entry $w_{ij}(t)$ changes with time $t$, representing the connection weight from neuron $j$ to neuron $i$, and $\theta(t) = (\theta_1(t), \ldots, \theta_n(t))^{T}$ is an $n$-dimensional function vector whose entry $\theta_i(t)$ changes with time $t$, representing the threshold attached to the neuron $i$. The state of the neuron $i$ at time $t$ is denoted by $v_i(t)$. Each neuron is assumed to have two possible states: $1$ and $-1$. The state of the network at time $t$ is the vector $V(t) = (v_1(t), \ldots, v_n(t))^{T}$. In general, the state $V(t+1)$ of the network at time $t+1$ is a function of $W(t)$, $\theta(t)$, and the state $V(t)$ of the network at time $t$. The network is thus completely determined by the parameters $(W(t), \theta(t))$, the initial value $V(0)$ of the states, and the manner in which the neurons are updated (evolved).

If at time step $t$, a neuron $i$ is chosen to be updated, then at the next step

$$v_i(t+1) = \operatorname{sgn}\Big(\sum_{j=1}^{n} w_{ij}(t)\, v_j(t) - \theta_i(t)\Big),$$

where

$$\operatorname{sgn}(x) = \begin{cases} 1, & x \ge 0, \\ -1, & x < 0. \end{cases}$$

(1) The network is updated asynchronously (serial mode); that is, only one neuron, say $i$, is selected at time $t$. The updating rule is

$$v_i(t+1) = \operatorname{sgn}\Big(\sum_{j=1}^{n} w_{ij}(t)\, v_j(t) - \theta_i(t)\Big), \qquad v_j(t+1) = v_j(t), \quad j \ne i.$$

(2) The network is updated synchronously (parallel mode); that is, every neuron is selected at time $t$. The updating rule is

$$v_i(t+1) = \operatorname{sgn}\Big(\sum_{j=1}^{n} w_{ij}(t)\, v_j(t) - \theta_i(t)\Big), \qquad i = 1, 2, \ldots, n.$$
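As an aside, the serial and parallel updating modes can be sketched in Python. This is a minimal illustration under stated assumptions: states take values in $\{-1, +1\}$, the sign function maps $0$ to $+1$, and the time-varying weights are modeled as an initial matrix plus accumulated increments; all concrete values below are hypothetical:

```python
import numpy as np

def sgn(x):
    # sign convention assumed here: sgn(x) = 1 if x >= 0, else -1
    return np.where(x >= 0, 1, -1)

def parallel_step(W, theta, v):
    # synchronous (parallel) mode: every neuron updates at time t
    return sgn(W @ v - theta)

def serial_step(W, theta, v, i):
    # asynchronous (serial) mode: only neuron i updates at time t
    v = v.copy()
    v[i] = 1 if W[i] @ v - theta[i] >= 0 else -1
    return v

# illustrative time-varying weight matrix W(t) = W0 + t * dW
W0 = np.array([[2.0, 1.0],
               [1.0, 2.0]])
dW = np.array([[0.1, 0.0],
               [0.0, 0.1]])
theta = np.zeros(2)

v = np.array([1, -1])
for t in range(10):
    v = parallel_step(W0 + t * dW, theta, v)
print(v)  # stays at the fixed point [1, -1]
```

Here the initial state is already a fixed point of the parallel rule, so the trajectory does not move; the same helpers can be reused to explore nontrivial trajectories.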

Let $N = (W(t), \theta(t))$ be a DHNNWFM. If the output of the neurons no longer changes after a limited time interval from the initial state $V(0)$, that is, $V(t+1) = V(t)$ for all $t \ge t_0$, then we say that the network is in a stable state, and we call $V^* = V(t_0)$ a stable point of $N$. In addition, we say that the network converges from the initial state $V(0)$. It is easy to see that if $V^*$ is a stable point of the DHNNWFM, then $\operatorname{sgn}(W(t)V^* - \theta(t)) = V^*$ for all $t \ge t_0$. Sometimes we call a $V^*$ that satisfies the above formula a stable attraction factor of the network $N$. If $V^*$ is an attraction factor of the DHNNWFM $N$, the attraction domain of $V^*$ is the set consisting of all of the initial states attracted to $V^*$.

Let $N = (W(t), \theta(t))$ be a DHNNWFM, and let $V^1, V^2, \ldots, V^s$ be distinct $n$-dimensional vectors. If, after a limited time interval, the network repeats the sequence $V^1, V^2, \ldots, V^s$ periodically, then we call $(V^1, V^2, \ldots, V^s)$ a limit cycle attraction factor of $N$, sometimes abbreviated to a limit cycle; its length is $s$. Similar to the attraction domain of a stable attraction factor, the attraction domain of a cycle attraction factor is the set that consists of all the possible initial states which are attracted to the cycle.
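A limit cycle of this kind can be observed numerically. The sketch below (with a fixed, hypothetical weight matrix, states in $\{-1, +1\}$, and the convention $\operatorname{sgn}(0) = 1$) runs the parallel rule and returns the length of the first revisited cycle; length 1 corresponds to a stable state:

```python
import numpy as np

def sgn(x):
    return np.where(x >= 0, 1, -1)

def cycle_length(W, theta, v0, max_steps=100):
    # iterate the parallel rule and return the period of the first
    # revisited state (1 means the network reached a stable state)
    seen = {}
    v = np.array(v0)
    for t in range(max_steps):
        key = tuple(v)
        if key in seen:
            return t - seen[key]
        seen[key] = t
        v = sgn(W @ v - theta)
    return None

# a zero-diagonal symmetric coupling that oscillates with period 2
W = np.array([[0.0, 1.0],
              [1.0, 0.0]])
theta = np.zeros(2)
print(cycle_length(W, theta, [1, -1]))  # 2: (1,-1) and (-1,1) alternate
```

With an identity weight matrix instead, the same routine returns 1, i.e., a stable state.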

*Definition 2.1. *$W(t) = (w_{ij}(t))_{n \times n}$ is a column (or row) diagonally dominant function matrix, if it satisfies, for each $t$, one of the following conditions:
$$w_{ii}(t) \ge \sum_{j=1,\, j \ne i}^{n} |w_{ji}(t)|, \quad i = 1, 2, \ldots, n,$$
or
$$w_{ii}(t) \ge \sum_{j=1,\, j \ne i}^{n} |w_{ij}(t)|, \quad i = 1, 2, \ldots, n.$$

*Definition 2.2. *Let $W(t)$ be a column or row diagonally dominant function matrix. $W(t)$ is called a column or row diagonally dominant monotone increasing function matrix, if it satisfies, for each $t$, one of the following conditions:
$$\Delta w_{ii}(t) \ge \sum_{j=1,\, j \ne i}^{n} |\Delta w_{ji}(t)|, \quad i = 1, 2, \ldots, n,$$
or
$$\Delta w_{ii}(t) \ge \sum_{j=1,\, j \ne i}^{n} |\Delta w_{ij}(t)|, \quad i = 1, 2, \ldots, n,$$
where $\Delta w_{ij}(t) = w_{ij}(t+1) - w_{ij}(t)$.
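Definition 2.1 can be checked mechanically. A small sketch, assuming the usual reading of column (row) diagonal dominance, namely that each diagonal entry dominates the absolute column (row) sum of the off-diagonal entries; the matrix values are hypothetical:

```python
import numpy as np

def is_column_diagonally_dominant(W):
    # w_ii >= sum over j != i of |w_ji|, for every column i
    n = W.shape[0]
    return all(W[i, i] >= sum(abs(W[j, i]) for j in range(n) if j != i)
               for i in range(n))

def is_row_diagonally_dominant(W):
    # row dominance is column dominance of the transpose
    return is_column_diagonally_dominant(W.T)

W = np.array([[3.0,  1.0],
              [-1.0, 2.0]])
print(is_column_diagonally_dominant(W))  # True: 3 >= |-1| and 2 >= |1|
```

The monotone increasing variant of Definition 2.2 would then amount to running the same test on each increment matrix.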

*Definition 2.3. *$W(t)$ is called a nonnegative definite function matrix on the set $D$, if it satisfies $x^{T} W(t) x \ge 0$ for each $x \in D$ and each $t$.

*Definition 2.4. *Let $W(t)$ be a nonnegative definite function matrix on the set $D$. $W(t)$ is called a nonnegative definite monotone increasing function matrix, if it satisfies
$$x^{T} \Delta W(t) x \ge 0,$$
where $\Delta W(t) = W(t+1) - W(t)$, for each $x \in D$ and each $t$.
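Nonnegative definiteness on a finite set $D$, as in Definitions 2.3 and 2.4, can be verified by brute force over $D$. A sketch, taking $D = \{-1, 0, 1\}^n$ purely as an illustrative choice of the set:

```python
import itertools
import numpy as np

def nonnegative_definite_on(W, D):
    # check x^T W x >= 0 for every x in the finite set D
    return all(x @ W @ x >= 0 for x in D)

n = 3
D = [np.array(x) for x in itertools.product([-1, 0, 1], repeat=n)]

W = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])
print(nonnegative_definite_on(W, D))  # True: W is positive semidefinite
```

Checking the monotone increasing property would amount to running the same test on each increment matrix $\Delta W(t)$.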

*Definition 2.5. *$W(t)$ is called a weakly nonnegative definite function matrix, if it satisfies
$$x^{T} W(t) x \ge 0$$
for each $t$ and each $x$ with at most one nonzero component (equivalently, $w_{ii}(t) \ge 0$ for all $i$ and $t$).

*Definition 2.6. *Let $W(t)$ be a weakly nonnegative definite function matrix. $W(t)$ is called a weakly nonnegative definite monotone increasing function matrix, if it satisfies
$$x^{T} \Delta W(t) x \ge 0$$
for each $t$ and each $x$ with at most one nonzero component, where $\Delta W(t) = W(t+1) - W(t)$.

For an $n \times n$ matrix $W = (w_{ij})$, we denote in this paper the corresponding matrix $\overline{W} = (\overline{w}_{ij})$ with
$$\overline{w}_{ii} = w_{ii}, \qquad \overline{w}_{ij} = -|w_{ij}|, \quad i \ne j.$$

*Definition 2.7. *$W(t)$ is called a strongly nonnegative definite function matrix, if the corresponding matrix $\overline{W}(t)$ is a nonnegative definite function matrix.

*Definition 2.8. *Let $W(t)$ be a strongly nonnegative definite function matrix. $W(t)$ is called a strongly nonnegative definite monotone increasing function matrix, if the corresponding matrix $\overline{W}(t)$ is a nonnegative definite monotone increasing function matrix.
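The strong nonnegative definiteness test of Definition 2.7 can be sketched in code, assuming the comparison-matrix construction in the spirit of [4]: keep the diagonal of $W$ and replace each off-diagonal entry by $-|w_{ij}|$. All matrix values below are illustrative:

```python
import numpy as np

def comparison_matrix(W):
    # assumed construction: keep the diagonal, replace each
    # off-diagonal entry by -|w_ij|
    C = -np.abs(W)
    np.fill_diagonal(C, np.diag(W))
    return C

def is_strongly_nonnegative_definite(W, tol=1e-12):
    # test nonnegative definiteness of the comparison matrix via the
    # smallest eigenvalue of its symmetric part
    C = comparison_matrix(W)
    S = (C + C.T) / 2
    return bool(np.linalg.eigvalsh(S).min() >= -tol)

W = np.array([[2.0, -1.0],
              [1.0,  2.0]])
print(is_strongly_nonnegative_definite(W))  # True
```

Note that the test is insensitive to the signs of the off-diagonal entries, which is what makes it useful for asymmetric matrices.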

In this paper, the function matrix $W(t)$ is the sum of the initial weight matrix and increment matrices, that is,
$$W(t) = W(0) + \sum_{k=0}^{t-1} \Delta W(k).$$

And the $n$-dimensional function vector $\theta(t)$ is the sum of the initial vector and increment vectors, that is,
$$\theta(t) = \theta(0) + \sum_{k=0}^{t-1} \Delta \theta(k).$$

For ease of description, let $\Delta W(t) = W(t+1) - W(t)$ and $\Delta \theta(t) = \theta(t+1) - \theta(t)$.
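The decomposition of $W(t)$ into an initial matrix plus accumulated increments maps directly to code; a minimal sketch with hypothetical values:

```python
import numpy as np

def weight_at(W0, increments, t):
    # W(t) = W(0) + sum of the first t increment matrices
    W = W0.astype(float).copy()
    for dW in increments[:t]:
        W = W + dW
    return W

W0 = np.eye(2)
increments = [0.5 * np.eye(2)] * 3  # each increment is itself PSD,
                                    # so W(t) is "monotone increasing"
print(weight_at(W0, increments, 3))  # diagonal grows from 1.0 to 2.5
```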

#### 3. Main Results

Theorem 3.1. *Let $N = (W(t), \theta(t))$ be a DHNNWFM. (1) If $W(t)$ is a weakly nonnegative definite monotone increasing function matrix and , then $N$ will converge to a stable state in serial mode. (2) If $W(t)$ is a strongly nonnegative definite monotone increasing function matrix and , then $N$ will converge to a stable state in parallel mode.*

*Proof. *Based on (2.12) and (2.13), we have the following.

Let , where

If , then is assigned an arbitrary negative number. Suppose , where and . Then we consider the following energy function (Lyapunov function) of the DHNNWFM:

Combined with (2.12) and (2.13), we have
where
which is the energy increment arising as the connection weight matrix increases. So, the change of energy is
where
According to (3.4) and we obtain
By ,

Because (i.e., ) and is nonnegative definite, we have

(1) Here, $W(t)$ is a weakly nonnegative definite monotone increasing function matrix, so each $W(t)$ is a weakly nonnegative definite matrix. Then, based on [4], when $N$ is operating in serial mode, we obtain . Then . Therefore, $N$ will converge to a stable state in serial mode.

(2) Here, $W(t)$ is a strongly nonnegative definite monotone increasing function matrix, so each $W(t)$ is a strongly nonnegative definite matrix. According to [4], we know that in parallel mode. Then . Therefore, $N$ will converge to a stable state in parallel mode. The proof is completed.

Theorem 3.2. *Let $N = (W(t), \theta(t))$ be a DHNNWFM. (1) If there exists an integer constant such that is a symmetric or weakly nonnegative definite matrix, is a weakly nonnegative definite matrix, and , then $N$ will converge to a stable state in serial mode. (2) If there exists an integer constant such that is a symmetric or strongly nonnegative definite matrix, is a strongly nonnegative definite matrix, and , then $N$ will converge to a limit cycle of length at most 2 or a stable state in parallel mode.*

*Proof. *We consider the following energy function (Lyapunov function) of the DHNNWFM:

According to the proof of Theorem 3.1, now we have

(1) Here, we know . Then

Based on [1, 4], when $N$ is operating in serial mode, we obtain , since is a symmetric or weakly nonnegative definite matrix, and according to (3.9) we know . Then . Therefore, $N$ will converge to a stable state in serial mode.

(2) Here, we know . Then

If , according to [4] we know that if and only if or in parallel mode.

If , according to [4] we know that in parallel mode.

According to (3.9), we know

Based on the above, we obtain that if or , then . So, will converge to a limit cycle of length at most 2 or a stable state in parallel mode. The proof is completed.

Combined with [25], we have the following.

Theorem 3.3. *Let $N = (W(t), \theta(t))$ be a DHNNWFM. If is a column diagonally dominant monotone increasing function matrix, is a row diagonally dominant monotone increasing function matrix, and is a strongly nonnegative definite monotone increasing function matrix, then $N$ will converge to a stable state in parallel mode.*

*Proof. *Let , where

If , then is assigned an arbitrary negative number. Suppose , where and . Then we consider the following energy function (Lyapunov function) of the DHNNWFM:
where , , , and .

We have that the change of energy is
where

Based on [4, Theorem 2], we obtain .

Obviously, when , , .

When it is operating in parallel mode, let .

According to the property of column (or row) diagonally dominant matrix, we have

According to Definition 2.2 and (3.9), we know .

Based on the above, we have . So, will converge to a stable state in parallel mode. The proof is completed.

#### 4. Examples

*Example 4.1. *Let $N = (W(t), \theta(t))$ be a DHNNWFM, where and . $N$ will converge to a stable state in parallel mode.

*Example 4.2. *Let $N = (W(t), \theta(t))$ be a DHNNWFM, where and . $N$ will converge to a stable state in parallel mode.

*Example 4.3. *Let $N = (W(t), \theta(t))$ be a DHNNWFM, where , , and . $N$ will converge to a stable state in parallel mode.
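A parallel-mode experiment in the spirit of these examples can be sketched as follows; the matrices are illustrative stand-ins, not those of Examples 4.1–4.3. With the initial weight matrix alone the trajectory oscillates, but as the positive diagonal increments accumulate, the network settles into a stable state:

```python
import numpy as np

def sgn(x):
    return np.where(x >= 0, 1, -1)

def run_parallel(W0, dW, theta, v0, steps=50):
    # iterate the parallel rule with W(t) = W0 + t * dW until the
    # state stops changing
    v = np.array(v0)
    for t in range(steps):
        v_next = sgn((W0 + t * dW) @ v - theta)
        if np.array_equal(v_next, v):
            return v, t          # reached a stable state at time t
        v = v_next
    return v, steps

W0 = np.array([[0.5, 1.0, 0.0],
               [1.0, 0.5, 1.0],
               [0.0, 1.0, 0.5]])  # weak diagonal: oscillates on its own
dW = 0.1 * np.eye(3)              # growing diagonal increments
theta = np.zeros(3)

v_star, t_star = run_parallel(W0, dW, theta, [1, -1, 1])
print(v_star, t_star)  # the oscillation dies out and the state settles
```

With `dW = 0` the same initial state cycles between two states indefinitely, which illustrates why the monotone increasing conditions matter for parallel-mode convergence.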

#### 5. Conclusion

In this paper, we first introduce the DHNNWFM. Then we mainly discuss the stability of DHNNWFM whose WFM is a symmetric, nonnegative definite, or column (or row) diagonally dominant function matrix. This work widens the DHNN model, and we obtain some important results, which supply some theoretical principles for applications. The DHNNWFM exhibits many interesting phenomena, and we will continue theoretical and practical research on it.

#### References

1. J. J. Hopfield, “Neural networks and physical systems with emergent collective computational abilities,” *Proceedings of the National Academy of Sciences of the United States of America*, vol. 79, no. 8, pp. 2554–2558, 1982.
2. H. Xiang, K.-M. Yan, and B.-Y. Wang, “Existence and global exponential stability of periodic solution for delayed high-order Hopfield-type neural networks,” *Physics Letters A*, vol. 352, no. 4-5, pp. 341–349, 2006.
3. H. Xiang, K.-M. Yan, and B.-Y. Wang, “Existence and global stability of periodic solution for delayed discrete high-order Hopfield-type neural networks,” *Discrete Dynamics in Nature and Society*, vol. 2005, no. 3, pp. 281–297, 2005.
4. Z. B. Xu and C. P. Kwong, “Global convergence and asymptotic stability of asymmetric Hopfield neural networks,” *Journal of Mathematical Analysis and Applications*, vol. 191, no. 3, pp. 405–427, 1995.
5. Z.-B. Xu, G.-Q. Hu, and C.-P. Kwong, “Asymmetric Hopfield-type networks: theory and applications,” *Neural Networks*, vol. 9, no. 3, pp. 483–501, 1996.
6. E. Goles Chacc, F. Fogelman-Soulié, and D. Pellegrin, “Decreasing energy functions as a tool for studying threshold networks,” *Discrete Applied Mathematics*, vol. 12, no. 3, pp. 261–277, 1985.
7. Y. Zhang, “Global exponential stability and periodic solutions of delay Hopfield neural networks,” *International Journal of Systems Science*, vol. 27, no. 2, pp. 227–231, 1996.
8. X. Liao and J. Yu, “Robust stability for interval Hopfield neural networks with time delay,” *IEEE Transactions on Neural Networks*, vol. 9, no. 5, pp. 1042–1045, 1998.
9. V. Singh, “Improved global robust stability for interval-delayed Hopfield neural networks,” *Neural Processing Letters*, vol. 27, no. 3, pp. 257–265, 2008.
10. E. C. C. Tsang, S. S. Qiu, and D. S. Yeung, “Stability analysis of a discrete Hopfield neural network with delay,” *Neurocomputing*, vol. 70, no. 13–15, pp. 2598–2602, 2007.
11. C. Ou, “Anti-periodic solutions for high-order Hopfield neural networks,” *Computers & Mathematics with Applications*, vol. 56, no. 7, pp. 1838–1844, 2008.
12. J. Zhang and Z. Gui, “Existence and stability of periodic solutions of high-order Hopfield neural networks with impulses and delays,” *Journal of Computational and Applied Mathematics*, vol. 224, no. 2, pp. 602–613, 2009.
13. J. Cao and J. Wang, “Global asymptotic and robust stability of recurrent neural networks with time delays,” *IEEE Transactions on Circuits and Systems I*, vol. 52, no. 2, pp. 417–426, 2005.
14. S. Mou, H. Gao, J. Lam, and W. Qiang, “A new criterion of delay-dependent asymptotic stability for Hopfield neural networks with time delay,” *IEEE Transactions on Neural Networks*, vol. 19, no. 3, pp. 532–535, 2008.
15. Z. C. Yang and D. Y. Xu, “Global exponential stability of Hopfield neural networks with variable delays and impulsive effects,” *Applied Mathematics and Mechanics*, vol. 27, no. 11, pp. 1329–1334, 2006.
16. Q. Zhang, X. Wei, and J. Xu, “On global exponential stability of discrete-time Hopfield neural networks with variable delays,” *Discrete Dynamics in Nature and Society*, vol. 2007, Article ID 67675, 9 pages, 2007.
17. H. Zhang, Z. Wang, and D. Liu, “Robust exponential stability of recurrent neural networks with multiple time-varying delays,” *IEEE Transactions on Circuits and Systems II*, vol. 54, no. 8, pp. 730–734, 2007.
18. H. Zhang, Z. Wang, and D. Liu, “Global asymptotic stability of recurrent neural networks with multiple time-varying delays,” *IEEE Transactions on Neural Networks*, vol. 19, no. 5, pp. 855–873, 2008.
19. H. Huang, D. W. C. Ho, and J. Lam, “Stochastic stability analysis of fuzzy Hopfield neural networks with time-varying delays,” *IEEE Transactions on Circuits and Systems II*, vol. 52, no. 5, pp. 251–255, 2005.
20. X. Liu and Q. Wang, “Impulsive stabilization of high-order Hopfield-type neural networks with time-varying delays,” *IEEE Transactions on Neural Networks*, vol. 19, no. 1, pp. 71–79, 2008.
21. D.-L. Lee, “Pattern sequence recognition using a time-varying Hopfield network,” *IEEE Transactions on Neural Networks*, vol. 13, no. 2, pp. 330–342, 2002.
22. J. Xu and Z. Bao, “Neural networks and graph theory,” *Science in China. Series F*, vol. 45, no. 1, pp. 1–24, 2002.
23. J. Li, Y. Diao, J. Mao, Y. Zhang, and X. Yin, “Stability analysis of discrete Hopfield neural networks with weight function matrix,” in *Proceedings of the 3rd International Symposium on Intelligence Computation and Applications (ISICA '08)*, vol. 5370 of *Lecture Notes in Computer Science*, pp. 760–768, Wuhan, China, December 2008.
24. Y. Zhang, J. Li, and Z.-W. Ye, “A new stability condition of discrete Hopfield neural networks with weight function matrix,” in *Proceedings of the International Seminar on Future Information Technology and Management Engineering (FITME '08)*, pp. 65–68, November 2008.
25. R. Ma, S. Zhang, and S. Lei, “Stability conditions for discrete neural networks in partial simultaneous updating mode,” in *Proceedings of the 2nd International Symposium on Neural Networks (ISNN '05)*, vol. 3496 of *Lecture Notes in Computer Science*, pp. 253–258, Chongqing, China, May-June 2005.

#### Copyright

Copyright © 2009 Jun Li et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.