Journal of Applied Mathematics
Volume 2013 (2013), Article ID 356746, 8 pages
http://dx.doi.org/10.1155/2013/356746
Research Article

Stability of Nonlinear Stochastic Discrete-Time Systems

1College of Information Science & Engineering, Shandong University of Science and Technology, Qingdao, Shandong 266590, China
2College of Information and Electrical Engineering, Shandong University of Science and Technology, Qingdao, Shandong 266590, China

Received 6 May 2013; Accepted 12 July 2013

Academic Editor: Baocang Ding

Copyright © 2013 Yan Li et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

This paper studies the stability of nonlinear stochastic discrete-time systems. First, several stability notions are introduced, such as stability in probability, asymptotic stability, and pth moment exponential stability. Then, using the method of Lyapunov functionals, some efficient criteria for stochastic stability are obtained. Examples are presented to illustrate the effectiveness of the proposed theoretical results.

1. Introduction

Stability is the first problem considered in the system analysis and synthesis of modern control theory, and it plays an essential role in infinite-horizon linear-quadratic regulation, robust optimal control, and other control problems; see [1–5]. In 1892, Lyapunov introduced the concept of stability of dynamic systems and created a very powerful tool, now known as the Lyapunov method, for the study of stability. The Lyapunov method has since been developed and applied to investigate the stochastic stability of Itô-type systems, and many important classical results on deterministic differential equations have been generalized to stochastic Itô systems; we refer the reader to Arnold [6], Friedman [7], Has'minskii [8], Kushner [9], Kolmanovskii and Myshkis [10], Ladde and Lakshmikantham [11], Mohammed [12], and Mao [13].

Compared with the wealth of results for continuous-time Itô systems, few results have been obtained on the stability of discrete-time nonlinear stochastic systems.

In [14], the mean square stability of discrete-time time-varying Markov jump systems was studied. Based on the exact observability assumption, [15] researched in depth the mean square stability of linear discrete-time time-invariant systems with multiplicative noise, where the classical Lyapunov theorem was extended. Reference [16] considered the mean stability of difference equations with random coefficients, and the stability in probability of nonlinear stochastic difference equations was investigated in [17]. It is not difficult to find that, in contrast to the continuous-time Itô systems, a systematic stability theory for nonlinear discrete-time stochastic systems is still lacking. The aim of this paper is to develop a parallel theory for the stability of general nonlinear stochastic discrete-time systems, and some sufficient criteria for various kinds of stability are given.

For Itô systems, most sufficient criteria are presented via LV ≤ 0, possibly together with other assumptions on the Lyapunov function V, where L is the so-called infinitesimal generator associated with the given Itô system. In discrete-time stochastic systems, most stability criteria are instead given via conditions such as E V(x_{k+1}) − V(x_k) ≤ 0, where E represents the mathematical expectation. General discrete stochastic stability is therefore more difficult to test, owing to the appearance of the mathematical expectation E.
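To make the expectation-based test concrete, the following sketch estimates E[V(x_{k+1})] − V(x_k) by Monte Carlo. Everything in it is our own illustration rather than the paper's: we assume a scalar linear system x_{k+1} = a·x_k + b·x_k·w_k with i.i.d. standard normal noise and take V(x) = x².

```python
import numpy as np

# Hypothetical illustration (the system, V, and coefficients are our own
# choices, not the paper's): estimate E[V(x_{k+1}) - V(x_k)] by Monte Carlo
# for x_{k+1} = a*x_k + b*x_k*w_k with V(x) = x^2 and i.i.d. N(0, 1) noise.
def expected_delta_V(x, a=0.8, b=0.3, n_samples=200_000, seed=0):
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(n_samples)
    x_next = a * x + b * x * w
    return np.mean(x_next**2) - x**2

# For V(x) = x^2 the exact value is (a^2 + b^2 - 1) * x^2, i.e. -0.27 * x^2
# for the coefficients above, so the estimate should be negative for x != 0.
print(expected_delta_V(1.0))
```

A negative estimate for every sampled x is consistent with the hypothesis of the expectation-based stability criteria; the expectation is exactly what makes such a test harder to carry out analytically than the pointwise condition LV ≤ 0 of the Itô case.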

The organization of this paper is as follows. Section 2 presents some stability definitions. Section 3 is devoted to developing some efficient criteria for various kinds of stability. Section 4 contains three examples provided to show the efficiency of the proposed results. Finally, Section 5 concludes the paper.

For convenience, we adopt the following notations:
A′: the transpose of the matrix A;
A ≥ 0 (A > 0): A is a positive semidefinite (positive definite) matrix;
|x|: the Euclidean norm of x ∈ R^n;
C²: the class of functions twice continuously differentiable with respect to x;
P(A): the probability of the event A;
a.s.: almost surely, or with probability 1;
I_A: the indicator function of a set A; that is, I_A(x) = 1 if x ∈ A and otherwise 0;
a ∧ b: the minimum of a and b.

2. Definitions of Stability

We will investigate various types of stability in probability for the n-dimensional stochastic discrete-time system

x_{k+1} = f(k, x_k) + g(k, x_k) w_k,  k ∈ N,  (6)

where the initial value x_0 ∈ R^n is a constant vector. For any given initial value x_0, (6) has a unique solution, denoted by x_k(x_0) or simply x_k. {w_k} is a one-dimensional stochastic process defined on the complete probability space (Ω, F, P). We assume that f(k, 0) = 0 and g(k, 0) = 0 for all k ∈ N, so (6) has the solution x_k ≡ 0 corresponding to the initial value x_0 = 0. This solution is called the trivial solution or the equilibrium position.
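As a rough numerical reading of stability in probability, one can simulate many sample paths and estimate the probability that a path stays inside a given ball. The sketch below assumes a time-invariant special case x_{k+1} = f(x_k) + g(x_k)·w_k with f(0) = g(0) = 0; the particular f and g are our own illustrative choices, not from the paper.

```python
import numpy as np

# A minimal simulation sketch for an assumed time-invariant special case
#     x_{k+1} = f(x_k) + g(x_k) * w_k,   f(0) = g(0) = 0,
# with illustrative f and g of our own choosing.
def simulate(x0, f, g, n_steps, rng):
    x = np.empty(n_steps + 1)
    x[0] = x0
    w = rng.standard_normal(n_steps)   # i.i.d. noise, one sample per step
    for k in range(n_steps):
        x[k + 1] = f(x[k]) + g(x[k]) * w[k]
    return x

# Estimate P(|x_k| < r for all k <= N) from many sample paths.
def prob_stay_below(x0, r, n_paths=2000, n_steps=200, seed=1):
    rng = np.random.default_rng(seed)
    f = lambda x: 0.5 * np.tanh(x)     # illustrative contracting drift, f(0) = 0
    g = lambda x: 0.2 * x              # illustrative diffusion, g(0) = 0
    stays = 0
    for _ in range(n_paths):
        x = simulate(x0, f, g, n_steps, rng)
        stays += np.all(np.abs(x) < r)
    return stays / n_paths

print(prob_stay_below(x0=0.1, r=1.0))
```

With a contracting drift and small multiplicative diffusion as above, the estimated probability of remaining below r = 1 from x_0 = 0.1 should be close to 1, matching the intuition behind the stability definitions of this section.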

Definition 1. The trivial solution of (6) is said to be stochastically stable or stable in probability if, for every ε ∈ (0, 1) and r > 0, there exists δ = δ(ε, r, k_0) > 0, such that P{|x_k| < r for all k ≥ k_0} ≥ 1 − ε when |x_0| < δ. Otherwise, it is said to be stochastically unstable.
If the above δ is independent of k_0, that is, δ = δ(ε, r), then the trivial solution of (6) is said to be stochastically uniformly stable in probability.

Definition 2. The trivial solution of (6) is said to be stochastically asymptotically stable in probability if it is stochastically stable and, for every ε ∈ (0, 1), there exists δ_0 = δ_0(ε, k_0) > 0, such that P{lim_{k→∞} x_k = 0} ≥ 1 − ε when |x_0| < δ_0.

Definition 3. The trivial solution of (6) is said to be stochastically uniformly asymptotically stable in probability if it is stochastically uniformly stable in probability and, for every ε ∈ (0, 1) and r > 0, there exist δ_0 > 0 independent of k_0 and a T = T(ε, r) > 0, such that P{|x_k| < r for all k ≥ k_0 + T} ≥ 1 − ε when |x_0| < δ_0.

Definition 4. The trivial solution of (6) is said to be stochastically asymptotically stable in the large in probability if it is stochastically stable and, for all x_0 ∈ R^n, P{lim_{k→∞} x_k = 0} = 1.

Definition 5. The trivial solution of (6) is said to be uniformly bounded if, for every and , there exists , such that when and .

Definition 6. The trivial solution of (6) is said to be stochastically uniformly asymptotically stable in the large in probability if the following are satisfied: (i) it is stochastically uniformly stable; (ii) it is uniformly bounded; (iii) for any , , and , there exists , such that

Definition 7. The trivial solution of (6) is said to be pth moment exponentially stable if there exist positive constants C and λ, such that E|x_k|^p ≤ C |x_0|^p e^{−λ(k−k_0)}, where k ≥ k_0, x_0 ∈ R^n, and p > 0. When p = 2, it is usually said to be exponentially stable in mean square.
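A hedged numerical illustration of this definition with p = 2: for the scalar linear system x_{k+1} = a·x_k + b·x_k·w_k (our own example, not the paper's), independence of w_k from x_k gives E x_{k+1}² = (a² + b²) E x_k², hence E x_k² = (a² + b²)^k x_0², so a² + b² < 1 yields exponential stability in mean square.

```python
import numpy as np

# Illustrative scalar system x_{k+1} = a*x_k + b*x_k*w_k (our own example):
# estimate E x_k^2 by Monte Carlo and compare with the closed form
# (a^2 + b^2)^k * x_0^2, which decays geometrically when a^2 + b^2 < 1.
def second_moment_mc(a, b, x0, k, n_paths=100_000, seed=2):
    rng = np.random.default_rng(seed)
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(k):
        x = a * x + b * x * rng.standard_normal(n_paths)
    return np.mean(x**2)

a, b, x0, k = 0.6, 0.5, 1.0, 5
print(second_moment_mc(a, b, x0, k))   # Monte Carlo estimate
print((a**2 + b**2)**k * x0**2)        # closed form, here 0.61**5
```

The two printed values should agree closely, illustrating the geometric moment decay that the definition formalizes.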
Below, we consider such a continuous function V(x, k) with V(0, k) ≡ 0, and write ΔV(x_k, k) = V(x_{k+1}, k + 1) − V(x_k, k).

Definition 8 (see [13]). A continuous function φ: [0, ∞) → [0, ∞) is said to belong to class K if it is strictly increasing and φ(0) = 0.

Definition 9 (see [13]). A continuous function V(x, k) defined on R^n × N is said to be positive definite (in the sense of Lyapunov) if V(0, k) ≡ 0 and, for some φ ∈ K, V(x, k) ≥ φ(|x|).
A continuous function V(x, k) defined on R^n × N is said to be negative definite (in the sense of Lyapunov) if −V is positive definite.

Definition 10 (see [13]). A function V(x, k) defined on R^n × N is said to be radially unbounded if lim_{|x|→∞} inf_k V(x, k) = ∞.

Definition 11. A function V(x, k) defined on R^n × N is said to have an infinite small upper bound if there exists ψ ∈ K, such that V(x, k) ≤ ψ(|x|).

3. Main Results

In this section, we state the main results of this paper. By using the method of Lyapunov functionals, some efficient stability criteria are obtained.

Theorem 12. If there exists a positive definite function V(x, k), such that E V(x_{k+1}, k + 1) − E V(x_k, k) ≤ 0 for all k ≥ k_0, then the trivial solution of (6) is stochastically stable in probability.

Proof. By the definition of , we obtain that and that there exists a function , such that
For any and , without loss of generality, we assume that . Because is continuous, we can find that , such that
It is obvious that . We fix the initial value arbitrarily. Let be the first exit time of from ; that is,
Let , for any , we have
Taking the expectation on both sides, it is easy to see that
If and we note that , then
From (21) and (24), we achieve that
Letting , then ; that is,
Therefore, the trivial solution of (6) is stochastically stable.

Theorem 13. If there exists a positive definite function V(x, k) with an infinite small upper bound, such that E V(x_{k+1}, k + 1) − E V(x_k, k) ≤ 0 for all k ≥ k_0, then the trivial solution of (6) is stochastically uniformly stable in probability.

Proof. By the assumptions, there exist and , such that
Let and be arbitrary. Without loss of generality, we may assume that . We define
Because of , we can obtain , which is independent of k_0.
Similar to the proof of Theorem 12, Theorem 13 is established.

Remark 14. We note that the condition in Theorems 12 and 13 corresponds to LV ≤ 0 in the Itô systems. In Theorem 13, V is required not only to be positive definite but also to have an infinite small upper bound, because the conclusion of Theorem 13 is stronger than that of Theorem 12.

Theorem 15. If there exist a function and a positive definite function , such that for all , then the trivial solution of (6) is stochastically asymptotically stable in probability.

Proof. From Theorem 12, we have that the trivial solution of (6) is stochastically stable. Fix arbitrarily; then there is , such that when .
Fix arbitrarily. By the assumptions on function , we know that and that there exist two functions , , such that
Let  be arbitrary, and choose  sufficiently small; since V is continuous, we can find , such that
Define the stopping times
Choose sufficiently large, such that
Let , for any , we have
Taking the expectation on both sides, we can derive that
Hence,
This means that By (31), . So which implies that
Hence,
Define the two stopping times
Similar to the proof of (24), we can show that, for ,
If , noting that , then
By (31) and , we have Together with (33), we get
Letting , we obtain
By (42), it follows that
This means that
Since is arbitrary, then we have

Theorem 16. If there exist a function and a positive definite function with an infinite small upper bound, such that , for all , then the trivial solution of (6) is stochastically uniformly asymptotically stable in probability.

Proof. By the assumptions, there exist , , and , such that
From Theorem 13, we know that the trivial solution of (6) is stochastically uniformly stable. Therefore, for every and , there exists , such that
According to Definition 3 we only need to show that, for every and , there exist and , such that
We argue by contradiction: take , and suppose that, for any , there exists , such that . By , we can show that
So That is,
Thus, whenever .
Especially, if , it follows that
This contradicts the positive definite property of . Then, we can prove that there exists , such that
According to Definition 3, we have
Therefore,
The proof is complete.

Remark 17. By comparing Theorems 12–15, we know that guarantees the system to be stochastically asymptotically stable. The difference between Theorems 15 and 16 is that V is additionally required to have an infinite small upper bound in Theorem 16, which ensures that the trivial solution of (6) is stochastically uniformly asymptotically stable in probability.

Theorem 18. If there exist a function and a positive definite radially unbounded function , such that , for all , then the trivial solution of (6) is stochastically asymptotically stable in the large.

Proof. By Theorem 12, we know that the trivial solution of (6) is stochastically stable.
Let be arbitrary, and fix any . Since V is radially unbounded, we can choose sufficiently large, such that
Define the stopping time
Similar to the proof of (24), we can obtain that, for any ,
From (63), we have
Together with (65), it yields that
Let ; we have . That is to say,
In the same way as that of the proof of Theorem 15, we can show that
This immediately implies that . The proof is complete.

Theorem 19. If there exist a function and a positive definite, radially unbounded function with an infinite small upper bound, such that , for all , then the trivial solution of (6) is stochastically uniformly asymptotically stable in the large in probability.

Proof. Under the conditions of Theorem 19, there exist , , and , such that
By Theorem 13, we know that the trivial solution of (6) is stochastically uniformly stable.
In the following, we first verify that the trivial solution of (6) is uniformly bounded. Actually, for any ,  , due to , , there exists , such that
It is easy to show that
When , we have
Since is strictly increasing, we have , a.s., . This implies that the trivial solution of (6) is uniformly bounded.
We further show that, for every , , and , there exists , such that
As previously stated, the trivial solution of (6) is stochastically uniformly stable. Therefore, for every and , there exists , such that The rest is similar to the proof of Theorem 16 and is thus omitted.

Remark 20. Theorems 18 and 19 are stronger versions of Theorems 15 and 16, respectively, where is additionally required to be a radially unbounded function that is used to prove the stability in the large.

In what follows, we will discuss the moment exponential stability for (6).

Theorem 21. Suppose that there exist a function and positive constants , , and , such that
Then
That is, the trivial solution of (6) is pth moment exponentially stable.

Proof. Define the stopping time
It is easy to see that as almost surely.
By , we can derive that
By , we have that
Letting , we obtain , which implies (77).
As a corollary, Theorem 21 yields a sufficient criterion for exponential stability in the mean square sense.

Corollary 22. Suppose that there exist a function and positive constants , , and , such that Then the trivial solution of (6) is exponentially stable in mean square.

4. Illustrative Examples

In this section, we present three simple examples to illustrate applications of the stability results developed in this paper. We will let {w_k} be a one-dimensional stochastic process defined on the complete probability space (Ω, F, P), such that E w_k = 0 and E(w_k w_j) = δ_{kj}, where δ_{kj} is the Kronecker delta.

Example 1. Consider the following equation: where A and C are matrices. Assume that there is a symmetric positive definite matrix P, such that Now, define the stochastic Lyapunov function V(x) = x′Px. It is obvious that By Theorem 12, we conclude that the trivial solution of (83) is stochastically stable in probability.
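A numerical companion to Example 1, under assumptions of our own: we take the linear form x_{k+1} = A·x_k + C·x_k·w_k with E w_k = 0 and E w_k² = 1, use V(x) = x′Px, and check the matrix condition A′PA + C′PC − P ≤ 0 (negative semidefinite), which makes E V(x_{k+1}) − V(x_k) ≤ 0 along solutions; the matrices A, C, and P below are illustrative, not taken from the example.

```python
import numpy as np

# Assumed linear form x_{k+1} = A x_k + C x_k w_k with V(x) = x' P x.
# Then E V(x_{k+1}) = x' (A' P A + C' P C) x, so the hypothesis of an
# expectation-based criterion reduces to A' P A + C' P C - P <= 0.
def lyapunov_condition_holds(A, C, P, tol=1e-9):
    M = A.T @ P @ A + C.T @ P @ C - P          # symmetric when P is symmetric
    return bool(np.all(np.linalg.eigvalsh(M) <= tol))

# Illustrative matrices of our own choosing.
A = np.array([[0.5, 0.1], [0.0, 0.4]])
C = np.array([[0.2, 0.0], [0.0, 0.2]])
P = np.eye(2)
print(lyapunov_condition_holds(A, C, P))
```

If the largest eigenvalue of A′PA + C′PC − P is nonpositive, the hypothesis of Theorem 12 holds for this V; a spectral check like this is a convenient way to certify the condition for concrete matrices.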

Example 2. Consider the following stochastic difference equation: where , , and are all matrix-valued functions defined on ,  , and   . Assume that for all .
We define the Lyapunov function . It is positive definite and radially unbounded. Moreover, That is, . By Theorem 18, the trivial solution is stochastically asymptotically stable in the large.

Example 3. Consider a one-dimensional linear stochastic difference equation where , are all constants, and . We assume that there exist positive constants and , such that .
We define the Lyapunov function ; then
By Corollary 22, the trivial solution is exponentially stable in mean square.
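To close the loop on Example 3 numerically, the sketch below simulates the scalar system x_{k+1} = a·x_k + b·x_k·w_k with illustrative coefficients of our own (the example's actual constants are not reproduced here) and fits the mean-square decay rate from simulated moments; for this assumed form, E x_k² = (a² + b²)^k x_0², so the fitted rate should be near a² + b².

```python
import numpy as np

# Illustrative sketch (coefficients a, b are our own, not Example 3's):
# simulate x_{k+1} = a*x_k + b*x_k*w_k with i.i.d. N(0, 1) noise and fit
# the mean-square decay rate rho from log E x_k^2; it should be close to
# the theoretical value a^2 + b^2 when a^2 + b^2 < 1.
def fitted_decay_rate(a, b, x0=1.0, n_steps=12, n_paths=200_000, seed=3):
    rng = np.random.default_rng(seed)
    x = np.full(n_paths, x0)
    moments = [x0**2]
    for _ in range(n_steps):
        x = a * x + b * x * rng.standard_normal(n_paths)
        moments.append(np.mean(x**2))
    # slope of log E x_k^2 versus k estimates log(rho)
    slope = np.polyfit(np.arange(n_steps + 1), np.log(moments), 1)[0]
    return float(np.exp(slope))

a, b = 0.7, 0.3
print(fitted_decay_rate(a, b), a**2 + b**2)
```

The agreement between the fitted rate and a² + b² is the simulated counterpart of the exponential bound asserted by Corollary 22.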

5. Conclusions

This paper has discussed stability in probability for stochastic discrete-time systems. Using the method of Lyapunov functionals, some efficient stability criteria have been obtained, and some stability results for stochastic differential equations [13] have been generalized to stochastic discrete-time systems. Some interesting problems, such as almost sure exponential stability and stochastic nonlinear control, merit further study.

Acknowledgments

This work is supported by NSF of China (Grants nos. 61174078 and 61170054), Specialized Research Fund for the Doctoral Program of Higher Education (Grant no. 20103718110006), the Research Fund for the Taishan Scholar Project of Shandong Province of China, the SDUST Research Fund (Grant no. 2011KYTD105), and the State Key Laboratory of Alternate Electrical Power System with Renewable Energy Sources (Grant no. LAPS13018).

References

  1. D. J. N. Limebeer, B. D. O. Anderson, and B. Hendel, “A Nash game approach to mixed H2/H∞ control,” IEEE Transactions on Automatic Control, vol. 39, no. 1, pp. 69–82, 1994.
  2. D. Hinrichsen and A. J. Pritchard, “Stochastic H∞,” SIAM Journal on Control and Optimization, vol. 36, no. 5, pp. 1504–1538, 1998.
  3. W. Zhang and B.-S. Chen, “On stabilizability and exact observability of stochastic systems with their applications,” Automatica, vol. 40, no. 1, pp. 87–94, 2004.
  4. B.-S. Chen and W. Zhang, “Stochastic H2/H∞ control with state-dependent noise,” IEEE Transactions on Automatic Control, vol. 49, no. 1, pp. 45–57, 2004.
  5. V. Dragan, T. Morozan, and A.-M. Stoica, Mathematical Methods in Robust Control of Discrete-Time Linear Stochastic Systems, Springer, New York, NY, USA, 2010.
  6. L. Arnold, Stochastic Differential Equations: Theory and Applications, Wiley-Interscience, New York, NY, USA, 1972.
  7. A. Friedman, Stochastic Differential Equations and Their Applications, vol. 2, Academic Press, San Diego, Calif, USA, 1976.
  8. R. Z. Has'minskii, Stochastic Stability of Differential Equations, vol. 7 of Monographs and Textbooks on Mechanics of Solids and Fluids: Mechanics and Analysis, Sijthoff & Noordhoff, Rockville, Md, USA, 1980.
  9. H. J. Kushner, Stochastic Stability and Control, Academic Press, New York, NY, USA, 1967.
  10. V. B. Kolmanovskii and A. Myshkis, Applied Theory of Functional Differential Equations, Kluwer Academic Publishers, Norwell, Mass, USA, 1992.
  11. G. S. Ladde and V. Lakshmikantham, Random Differential Inequalities, vol. 150 of Mathematics in Science and Engineering, Academic Press, New York, NY, USA, 1980.
  12. S. E. A. Mohammed, Stochastic Functional Differential Equations, Longman, New York, NY, USA, 1986.
  13. X. R. Mao, Stochastic Differential Equations and Applications, Horwood, Chichester, UK, 1997.
  14. V. Dragan and T. Morozan, “Mean square exponential stability for some stochastic linear discrete time systems,” European Journal of Control, vol. 12, no. 4, pp. 373–399, 2006.
  15. Y. Huang, W. Zhang, and H. Zhang, “Infinite horizon linear quadratic optimal control for discrete-time stochastic systems,” Asian Journal of Control, vol. 10, no. 5, pp. 608–615, 2008.
  16. T. Taniguchi, “Stability theorems of stochastic difference equations,” Journal of Mathematical Analysis and Applications, vol. 147, no. 1, pp. 81–96, 1990.
  17. B. Paternoster and L. Shaikhet, “About stability of nonlinear stochastic difference equations,” Applied Mathematics Letters, vol. 13, no. 5, pp. 27–32, 2000.