Research Article | Open Access
Ying Yang, Guopei Chen, "Finite Time Stability of Stochastic Hybrid Systems", Abstract and Applied Analysis, vol. 2014, Article ID 867189, 7 pages, 2014. https://doi.org/10.1155/2014/867189
Finite Time Stability of Stochastic Hybrid Systems
This paper considers the finite time stability of stochastic hybrid systems with both Markovian switching and impulsive effects. First, the concept of finite time stability is extended to stochastic hybrid systems. Then, by using common Lyapunov function and multiple Lyapunov functions theory, two sufficient conditions for finite time stability of stochastic hybrid systems are presented. Furthermore, a new notion called stochastic minimum dwell time is proposed; combining it with the method of multiple Lyapunov functions yields a further sufficient condition for finite time stability of stochastic hybrid systems. Finally, a numerical example is provided to illustrate the theoretical results.
Nowadays, stochastic modeling, control, and optimization play a crucial role in many applications, especially in the areas of control science and communication technology [1, 2]. In practical applications, many stochastic systems exhibit impulsive and switching behaviors due to abrupt changes and switches of states at certain instants during the dynamical processes; that is, the systems switch with impulsive effects [3–6]. Moreover, impulsive and switching phenomena can be found in the fields of physics, biology, engineering, and information science. Many sudden and sharp changes occur instantaneously, in the form of impulses and switches, and cannot be well described by purely continuous or purely discrete models. Therefore, it is important and, in fact, necessary to study hybrid impulsive and switching stochastic systems.
In many applications, it is desirable that the stochastic system possesses the property that trajectories which converge to a Lyapunov stable equilibrium state do so in finite time rather than infinite time. Hence, the concept of finite time stability for stochastic systems arises naturally in stochastic control problems. For the deterministic case, finite time stability for continuous time systems was studied in [7] using Hölder continuous Lyapunov functions. Its improvements and extensions have been given by [8–10] for continuous systems satisfying uniqueness of solutions in forward time, for nonautonomous continuous systems, and for functional differential equations, respectively. In [11], the notion of finite time input-to-state stability is introduced for continuous systems with locally essentially bounded input. The problem of finite time stabilization for deterministic nonlinear systems has accordingly been studied in the literature: numerous theoretical control design methods, including backstepping, sliding mode, control Lyapunov functions, input-to-state stability, and small-gain techniques, were presented and developed for various types of nonlinear systems over the last two decades [8–18]. For the stochastic case, the notion of stochastically finite time attractiveness is introduced in [19] for a class of stochastic nonlinear systems, and a theorem on finite time attractiveness is established for such systems based on Lyapunov functions. In [20], the notion of finite time stability is extended to stochastic nonlinear systems, and Lyapunov theorems are established on finite time stability and finite time instability for stochastic nonlinear systems. It is worth noting that these works discussed only a single stochastic system and did not take the switching and impulsive parts into account.
In this paper, we will study the finite time stability of stochastic hybrid systems which have both Markovian switching and impulsive effects. The main contributions of this paper include the following: (i) extending the concept of finite time stability to stochastic hybrid systems, (ii) presenting two sufficient conditions for finite time stability of stochastic hybrid systems by using common Lyapunov function and multiple Lyapunov functions theory, (iii) introducing a new notion called stochastic minimum dwell time (SMDT), and (iv) proposing a sufficient condition for finite time stability of stochastic hybrid systems by combining the SMDT with the method of multiple Lyapunov functions.
The organization of the paper is as follows. In Section 2, we present some preliminary materials and a formulation of problems to be considered in this paper. In Section 3, the finite time stability of stochastic hybrid systems is studied and several sufficient conditions are presented. A numerical example is provided in Section 4. Finally, concluding remarks are given in Section 5.
2. Problem Statement and Preliminaries
Throughout this paper, we let $\mathbb{R}^n$ denote the $n$-dimensional Euclidean space, $\mathbb{R}_+$ denote the set of all nonnegative real numbers, and $\mathbb{R}^{n \times m}$ denote the space of $n \times m$ matrices with real entries. Let $(\Omega, \mathcal{F}, \{\mathcal{F}_t\}_{t \ge 0}, P)$ be a complete probability space with a filtration $\{\mathcal{F}_t\}_{t \ge 0}$ satisfying the usual conditions (i.e., it is right continuous and $\mathcal{F}_0$ contains all $P$-null sets). Let $w(t)$ be an $m$-dimensional Brownian motion defined on the probability space. Let $|\cdot|$ denote the Euclidean norm in $\mathbb{R}^n$. If $A$ is a vector or matrix, its transpose is denoted by $A^T$. If $A$ is a matrix, its trace is denoted by $\operatorname{tr}(A)$. $a \wedge b$ means the minimum of $a$ and $b$, while $a \vee b$ means the maximum.
Let $r(t)$, $t \ge 0$, be a right-continuous Markov chain on the probability space taking values in a finite state space $S = \{1, 2, \ldots, N\}$ with generator $\Gamma = (q_{ij})_{N \times N}$ given by
\[ P\{r(t + \Delta) = j \mid r(t) = i\} = \begin{cases} q_{ij}\Delta + o(\Delta) & \text{if } i \ne j, \\ 1 + q_{ii}\Delta + o(\Delta) & \text{if } i = j, \end{cases} \]
where $\Delta > 0$. Here $q_{ij} \ge 0$ is the transition rate from $i$ to $j$ if $i \ne j$, while $q_{ii} = -\sum_{j \ne i} q_{ij}$.
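As an illustration (not part of the paper), the switching mechanism above can be sampled numerically: a continuous-time Markov chain leaves state $i$ after an exponential holding time with rate $-q_{ii}$ and jumps according to the embedded chain with probabilities $q_{ij}/(-q_{ii})$. The following Python sketch does exactly this; the function name and the example generator matrix are illustrative choices, not taken from the paper.

```python
import numpy as np

def simulate_ctmc(Q, r0, T, rng=None):
    """Sample one path of a continuous-time Markov chain with generator Q
    on [0, T], starting from state r0.  Returns (switching times, states)."""
    if rng is None:
        rng = np.random.default_rng()
    Q = np.asarray(Q, dtype=float)
    t, r = 0.0, r0
    times, states = [0.0], [r0]
    while True:
        rate = -Q[r, r]                      # total exit rate q_i = -q_ii
        if rate <= 0.0:                      # absorbing state: no more switches
            break
        t += rng.exponential(1.0 / rate)     # holding time ~ Exp(q_i)
        if t >= T:
            break
        jump = Q[r].copy()
        jump[r] = 0.0
        r = int(rng.choice(len(jump), p=jump / rate))  # embedded chain: q_ij / q_i
        times.append(t)
        states.append(r)
    return np.array(times), np.array(states)

# two-mode example generator (each row sums to zero, matching q_ii = -sum_{j!=i} q_ij)
Q = np.array([[-1.0,  1.0],
              [ 2.0, -2.0]])
ts, rs = simulate_ctmc(Q, 0, 10.0, np.random.default_rng(0))
```

Each sampled path is piecewise constant and right continuous, which is the form of switching signal assumed throughout the paper.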
A function $V : \mathbb{R}^n \to \mathbb{R}$ is said to be $C^k$ if it is $k$-times continuously differentiable. Let $V_x$ denote the gradient of a $C^1$ function $V$; we always write $V_x$ as a row vector. For a $C^2$ function $V$, $V_{xx}$ denotes the Hessian of $V$, the $n \times n$ matrix of second-order partial derivatives of $V$. A function $V$ is said to be positive definite if $V(0) = 0$ and $V(x) > 0$ for all $x \ne 0$. A function $\gamma : \mathbb{R}_+ \to \mathbb{R}_+$ is said to be a class $\mathcal{K}$ function if it is continuous and strictly increasing and $\gamma(0) = 0$. A class $\mathcal{K}$ function $\gamma$ is said to belong to class $\mathcal{K}_\infty$ if $\gamma(s) \to \infty$ as $s \to \infty$.
In this paper, we will consider a stochastic hybrid system with $N$ modes described by the Markov chain $r(t)$ and suppose that the dynamics is described by the following form:
\[ \begin{aligned} dx(t) &= f(x(t), t, r(t))\,dt + g(x(t), t, r(t))\,dw(t), \quad t \ne t_k, \\ x(t_k) &= I_k(x(t_k^-), t_k), \quad k = 1, 2, \ldots, \end{aligned} \tag{2} \]
for $t \ge t_0$ with initial value $x(t_0) = x_0$, where $x(t) \in \mathbb{R}^n$ is the state vector and $f : \mathbb{R}^n \times \mathbb{R}_+ \times S \to \mathbb{R}^n$, $g : \mathbb{R}^n \times \mathbb{R}_+ \times S \to \mathbb{R}^{n \times m}$, and $I_k : \mathbb{R}^n \times \mathbb{R}_+ \to \mathbb{R}^n$ are Borel measurable, continuous in $t$, and satisfy $f(0, t, i) \equiv 0$, $g(0, t, i) \equiv 0$, and $I_k(0, t_k) \equiv 0$ for all $t \ge t_0$ and $i \in S$. $w(t)$ is an $m$-dimensional Brownian motion defined on the underlying probability space and independent of $r(t)$. $\{t_k\}$, $k = 1, 2, \ldots$, denotes the sequence of switching times of $r(t)$, and $x(t_k) = x(t_k^+)$, which implies that the solution of the system (2) is right continuous. At the switching times, there exists an impulse described by the second equation of (2).
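A hybrid system of this kind, with mode-dependent diffusion between switches and an impulsive state jump at each switching instant, can be integrated numerically by the Euler–Maruyama scheme. The sketch below is an illustrative implementation under simplifying assumptions of our own (componentwise diffusion, user-supplied drift `f`, diffusion `g`, and impulse map `J`); it is not an algorithm from the paper.

```python
import numpy as np

def em_hybrid(f, g, J, x0, switch_times, states, T, dt=1e-3, rng=None):
    """Euler-Maruyama integration of a Markov-switched SDE with impulses:
    between switches   dx = f(x, r) dt + g(x, r) dw   (g componentwise here),
    at switch time t_k  x(t_k) = J(x(t_k^-), r_old, r_new)."""
    if rng is None:
        rng = np.random.default_rng()
    x = np.array(x0, dtype=float)
    t, k = 0.0, 0                                    # k indexes the active mode
    upcoming = np.append(np.asarray(switch_times, dtype=float)[1:], np.inf)
    traj_t, traj_x = [t], [x.copy()]
    while t < T - 1e-12:
        h = min(dt, T - t, upcoming[k] - t)          # never step past a switch
        dw = rng.normal(0.0, np.sqrt(h), size=x.shape)
        x = x + np.asarray(f(x, states[k])) * h + np.asarray(g(x, states[k])) * dw
        t += h
        if t >= upcoming[k] - 1e-12:                 # reached the next switching time
            x = np.asarray(J(x, states[k], states[k + 1]), dtype=float)  # impulse
            k += 1
        traj_t.append(t)
        traj_x.append(x.copy())
    return np.array(traj_t), np.array(traj_x)

# example: linear drift -x in both modes, noiseless, state halved at the switch
f = lambda x, r: -x
g = lambda x, r: 0.0 * x
J = lambda x, r_old, r_new: 0.5 * x
tt, xx = em_hybrid(f, g, J, [1.0, 1.0], switch_times=[0.0, 0.5],
                   states=[0, 1], T=1.0, rng=np.random.default_rng(1))
```

Right continuity of the solution at switching times is mirrored by applying the impulse map exactly when the integration clock reaches a switching instant.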
To ensure the existence and uniqueness of solutions for (2) we impose the following hypothesis.
Assumption 1. Assume that for any functions and satisfy all conditions of Lemma 2.1 in Yin et al. [20]; that is, functions and are all continuous in . Further, for each , and each , the following conditions hold: (i) , ; (ii) , ; (iii) as , , , where and , , are nonnegative functions such that and ; , as , is nonrandom, strictly increasing, continuous, and concave such that .
Assumption 2. Assume that, for any $t \ge t_0$ and $k = 1, 2, \ldots$, the functions $I_k$ satisfy the global Lipschitz condition; that is, there exist constants $h_k > 0$, $k = 1, 2, \ldots$, such that for arbitrary $x, y \in \mathbb{R}^n$,
\[ |I_k(x, t) - I_k(y, t)| \le h_k |x - y|. \]
Remark 4. It is known that the existence of a unique solution to (2) is ensured if the drift and diffusion functions satisfy the usual conditions (i.e., the local Lipschitz condition and the linear growth condition) and the impulse functions satisfy the global Lipschitz condition. However, in this paper, we assume that the drift and diffusion functions satisfy only the conditions of Assumption 1, rather than the usual conditions. This is because (1) the former conditions are less conservative than the latter; (2) it is impossible to discuss the finite time stability in probability of (2) if the drift and diffusion functions satisfy the local Lipschitz condition. Therefore, in this paper, we assume that the drift, diffusion, and impulse functions satisfy the conditions of Assumptions 1 and 2.
Remark 5. Mathematically, we need the drift and diffusion functions to satisfy a non-Lipschitz condition. The aim of this requirement is to ensure the finite time stability of the practical systems considered. In other words, if the drift and diffusion functions of a practical system satisfy the Lipschitz condition, then this system cannot be finite time stable.
Next, we will extend the concept of finite time stability to the stochastic hybrid system (2).
Definition 6. The trivial solution of (2) is said to be finite time stable in probability with respect to a given Markov chain $r(t)$, if the system (2) admits a unique solution for any initial data $x_0 \in \mathbb{R}^n$ and the given Markov chain $r(t)$, denoted by $x(t; x_0)$, and moreover the following statements hold.
(i) Finite time attractiveness in probability: for every initial value $x_0 \in \mathbb{R}^n \setminus \{0\}$ and the given Markov chain $r(t)$, the first hitting time $\tau_{x_0} = \inf\{t \ge t_0 : x(t; x_0) = 0\}$, which is called the stochastic settling time, is finite almost surely; that is, $P\{\tau_{x_0} < \infty\} = 1$.
(ii) Stability in probability: for every pair of $\varepsilon \in (0, 1)$ and $r > 0$, there exists a $\delta = \delta(\varepsilon, r) > 0$ such that $P\{|x(t; x_0)| < r \text{ for all } t \ge t_0\} \ge 1 - \varepsilon$ whenever $|x_0| < \delta$.
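The stochastic settling time in the definition above can be illustrated empirically: on each simulated sample path, record the first time the state enters a small ball around the origin, and estimate the probability that this happens before a time horizon. The toy scalar SDE below, with a non-Lipschitz drift $-\operatorname{sign}(x)|x|^{1/3}$ and the tolerance parameters, is our own illustrative choice, not the system studied in the paper.

```python
import numpy as np

# Monte Carlo proxy for the stochastic settling time of the toy SDE
#   dx = -sign(x) |x|^(1/3) dt + 0.05 dw,
# whose drift is non-Lipschitz at the origin (a prerequisite for finite
# time convergence; compare Remark 5).  tau records the first time |x|
# enters a small ball around 0; tau = inf if the ball is never reached.
rng = np.random.default_rng(2)
dt, T, runs = 1e-3, 5.0, 200
taus = []
for _ in range(runs):
    x, t, tau = 1.0, 0.0, np.inf
    while t < T:
        # Euler-Maruyama step of the non-Lipschitz SDE
        x += -np.sign(x) * abs(x) ** (1 / 3) * dt + 0.05 * rng.normal(0.0, np.sqrt(dt))
        t += dt
        if abs(x) <= 1e-2:          # entered the ball: record the hitting time
            tau = t
            break
    taus.append(tau)
hit_freq = float(np.mean(np.isfinite(taus)))   # empirical estimate of P{tau < T}
```

For the noiseless drift alone, $x^{2/3}$ decreases at constant rate $2/3$, so trajectories from $x_0 = 1$ reach the origin in time $3/2$; the empirical hitting frequency over the horizon is correspondingly close to one.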
Remark 7. In [20], the research object was just a single stochastic system, and the influence of the switching and impulsive effects was not considered. Therefore, our definition is an extension of the one in [20].
Let $C^{2,1}(\mathbb{R}^n \times \mathbb{R}_+ \times S; \mathbb{R}_+)$ denote the family of all nonnegative functions $V(x, t, i)$ on $\mathbb{R}^n \times \mathbb{R}_+ \times S$ which are continuously twice differentiable in $x$ and once differentiable in $t$. If $V \in C^{2,1}(\mathbb{R}^n \times \mathbb{R}_+ \times S; \mathbb{R}_+)$, define an operator $\mathcal{L}V$ from $\mathbb{R}^n \times \mathbb{R}_+ \times S$ to $\mathbb{R}$ by
\[ \mathcal{L}V(x, t, i) = V_t(x, t, i) + V_x(x, t, i) f(x, t, i) + \frac{1}{2} \operatorname{tr}\left[ g^T(x, t, i)\, V_{xx}(x, t, i)\, g(x, t, i) \right] + \sum_{j=1}^{N} q_{ij} V(x, t, j), \]
where $V_t = \partial V / \partial t$ and $V_x$, $V_{xx}$ are the gradient and Hessian of $V$ with respect to $x$.
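For a concrete feel for this operator, consider the standard quadratic choice $V(x) = |x|^2$, independent of $t$ and of the mode. Then $V_t = 0$, $V_x = 2x^T$, $V_{xx} = 2I$, and the Markov-jump sum vanishes because each row of the generator sums to zero, leaving $\mathcal{L}V = 2x^T f + \operatorname{tr}(g^T g)$. The helper below (our own illustration, not code from the paper) evaluates this reduced expression numerically.

```python
import numpy as np

def LV_quadratic(x, f, g, r):
    """Apply the infinitesimal generator to V(x) = |x|^2 in mode r:
    LV = V_x f + (1/2) tr(g^T V_xx g) = 2 x.f(x,r) + tr(g(x,r)^T g(x,r)).
    The jump term sum_j q_rj V(x) vanishes because V does not depend on
    the mode and each row of the generator sums to zero."""
    x = np.asarray(x, dtype=float)
    G = np.asarray(g(x, r), dtype=float)
    return 2.0 * float(x @ np.asarray(f(x, r))) + float(np.trace(G.T @ G))

# example: stable linear drift with constant diffusion matrix in mode 0
f = lambda x, r: -x
g = lambda x, r: 0.1 * np.eye(len(x))
val = LV_quadratic(np.array([1.0, 2.0]), f, g, 0)   # 2*(-5) + tr(0.01 I) = -9.98
```

A negative value of $\mathcal{L}V$ along trajectories is the basic ingredient of the Lyapunov conditions used in Section 3.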
For the convenience of the reader we cite the generalized Itô formula established in [22] as a lemma.
Lemma 8 (generalized Itô formula [22]). If $V \in C^{2,1}(\mathbb{R}^n \times \mathbb{R}_+ \times S; \mathbb{R}_+)$, then for any switching times $t_k \le s \le t < t_{k+1}$,
\[ \mathbb{E} V(x(t), t, r(t)) = \mathbb{E} V(x(s), s, r(s)) + \mathbb{E} \int_s^t \mathcal{L}V(x(u), u, r(u))\,du, \]
as long as the integrations involved exist and are finite.
We will show next how finite time stability can be indirectly determined by studying the probability associated with a function defined for the stochastic hybrid system.
3. Finite Time Stability Analysis
In this section, we will extend the existing results to study the finite time stability for the stochastic hybrid system (2); several sufficient conditions will be given. First, we need the following lemma.
Lemma 9. Let , , and denote a strictly increasing time sequence with , . Assume that there exists a scalar function with such that(1)function is continuous on time interval , ,(2)for any , , ,(3), , where . Then there exists a real number such that
Proof. Let . Note that and for all ; we have . Now define a continuous function
It is not hard to verify that
for any , where . Note that and as . This is because and , so we have . Then by the definition of , we obtain
and hence as . Let . If , then the required assertion follows by taking . Suppose that there exists a . This means that and .
Let ; we claim that for some , where is such that . In fact, if with , then by the continuity of and , , there exists a constant such that for any . However, this contradicts the definition of .
Therefore, we have that or with for some . That is, is continuous at . Combining the continuity of , we have that , and thus for any . Let , and by condition (2) we have that for all , where Note that is continuous on time interval , so the integral term of (11) is derivable with respect to . Now, for any , we have that That is, is an increasing function as . Since , we immediately obtain that , . By this, (9), and (11), we have which implies that there exists at least a such that ; however, this contradicts the assumption that for any . The proof is complete.
Remark 10. In Lemma 9, the function is only piecewise continuous, rather than continuous. Thus, Lemma 9 is an extension of [20, Lemma 3.1]. Moreover, by (7) and the definition of , we have that can be chosen as .
Remark 11. Mathematically, we allow the function to be piecewise continuous rather than continuous. This is an advantage for practical application, because it allows the system to have discontinuous dynamics, as in optimal control systems, nonsmooth mechanics systems, and robotic manipulation systems. In addition, the time estimate can be adjusted by using the actual initial value of the system, which leads to greater flexibility in practical application.
Theorem 12. Consider the system (2); if there exist a function , class functions and , and positive real numbers and , such that for all , , and a given Markov chain ,(1),(2), ,(3), , then the trivial solution of (2) is finite time stable in probability with respect to .
Proof. The proof follows the same line of the proof of [20, Theorem 3.1].
Remark 13. If all conditions of Theorem 12 hold for arbitrary Markov chain , then we will obtain the sufficient condition for finite time stability of stochastic hybrid system (2) under arbitrary Markovian switching.
Theorem 12 gives us a sufficient condition for finite time stability of stochastic hybrid system (2) by using common Lyapunov function technique. Next, we will give another sufficient condition by using multiple Lyapunov functions technique.
Let be a piecewise constant function with a strictly increasing sequence of switching times , where , and is a constant as , . Before giving the next sufficient condition, we need the following lemma.
Lemma 14. Assume that there exist a constant and a scalar switching function with such that(1) is continuous as , ,(2), ,(3)for any , , where , , and . Then there exists a real number such that
Proof. Using similar arguments of Lemma 9, we can obtain the desired conclusion.
Remark 15. Lemma 14 is an extension of Lemma 9. Moreover, from the analysis of Lemma 14, we have that where . It implies that , where . By this, the real number in Lemma 14 can be chosen as

Let denote the th switching times of the Markov chain , where , and . By Lemma 14, we have the following theorem.
Theorem 16. Consider the system (2); if there exist a function , class functions and , and positive real numbers , , , such that for all , , and a given Markov chain ,(1), ,(2)for any , , (3), , where and , then the trivial solution of (2) is finite time stable in probability with respect to .
Before giving the next sufficient condition, we need the following definition and lemma.
Definition 17. For a given Markov chain , let denote its th switching times, where , and . If holds for , then is called stochastic minimum dwell time of the Markov chain .
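On a sampled path of the Markov chain, the stochastic minimum dwell time of Definition 17 is simply the smallest gap between consecutive switching instants. The hypothetical helpers below (names and example data are ours) compute it and check it against a required lower bound such as the one produced by Theorem 19.

```python
import numpy as np

def min_dwell_time(switch_times):
    """Smallest gap t_{k+1} - t_k between consecutive switching instants."""
    gaps = np.diff(np.asarray(switch_times, dtype=float))
    return float(np.min(gaps))

def satisfies_smdt(switch_times, tau_star):
    """True if the sampled switching sequence respects the required
    stochastic minimum dwell time tau_star."""
    return min_dwell_time(switch_times) >= tau_star

times = [0.0, 0.4, 1.0, 1.25]       # hypothetical switching instants
smdt = min_dwell_time(times)        # -> 0.25 (the gap 1.25 - 1.0)
```

When the computed minimum dwell time falls below the bound required by the stability condition, the switching signal is too fast and finite time stability is no longer guaranteed, which is the phenomenon illustrated in Figure 3 of Section 4.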
Lemma 18. For a given Markov chain and any , the following inequality holds: where , and .
Proof. Using Definition 17 and definitions of and , we can obtain the desired conclusion.
We now state another sufficient condition for finite time stability of system (2).
Theorem 19. Consider the system (2); if there exist a function , class functions and , and positive real numbers , , , such that for all , ,(1), ,(2)for any , , (3), , where and , then the trivial solution of (2) is finite time stable in probability with respect to arbitrary Markov chain with stochastic minimum dwell time satisfying where and .
Remark 20. In Theorem 19, if , then, by arguments similar to the proof of [20, Theorem 3.1], we can deduce that the trivial solution of (2) is finite time stable in probability with respect to an arbitrary Markov chain . Therefore, Theorem 19 always holds for any .
Remark 21. Theorem 19 provides a method to compute the stochastic minimum dwell time which ensures the finite time stability of practical systems. Furthermore, it follows from (22) that the stochastic minimum dwell time can be adjusted by using the actual initial value of the system, which leads to greater flexibility in practical application.
4. Numerical Example
In this section, we will present an example to illustrate the theoretical results.
Example 1. Consider a three-dimensional stochastic hybrid system of the form where , , is a Markov chain and and is a three-dimensional Brownian motion defined on the underlying probability space and independent of , .
It is not hard to verify that all conditions in Assumption 1 are satisfied. In addition, it is also easy to verify that satisfy Assumption 2 with . So the system (23) admits a unique solution. By using a Lyapunov function , , one can verify that and .
Now, by choosing , , , , and , we can verify that all conditions of Theorem 19 are satisfied. Let , , and ; by using inequality (22), we have Simulations have been carried out for system (23) with the Markov chain , which satisfies the condition , . Figure 1 shows that the state of (23) converges to zero in finite time. The Markov chain of (23) is shown in Figure 2.
It is worth noting that, although each subsystem of (23) is finite time stable in probability, the entire system (23) may still fail to be finite time stable in probability when the stochastic minimum dwell time of the system is too small. Figure 3 shows the state of (23) under the Markov chain , which satisfies the condition , .
Remark 22. Our simulation results are obtained using Matlab (version 6.5), where all the differential equations are solved using an improved Runge-Kutta algorithm. Compared with the existing methods (such as ode23 and the Euler method), our method has higher precision and less simulation time. The accuracy and simulation time of the other methods (such as ode23 and the Euler method) are $10^{-3}$~$10^{-4}$ and 50~70 minutes, respectively, while the accuracy and simulation time of our algorithm are $10^{-5}$ and 36 minutes, respectively.
5. Conclusion
The issues of finite time stability for stochastic hybrid systems have been studied and corresponding results have been presented. Based on common Lyapunov function and multiple Lyapunov functions theory, two sufficient conditions for finite time stability of systems have been derived. Furthermore, a new notion called stochastic minimum dwell time has been proposed. A sufficient condition for finite time stability of systems has also been given by combining the method of multiple Lyapunov functions with the stochastic minimum dwell time.
Future research directions include the research for more relaxed conditions for finite time stability and the applications of the results presented here to packet-dropping problems in network control systems and time-delayed systems.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
This work was supported by the Natural Science Foundation of Guangdong Province, China (S2011040003733), National Natural Science Foundation of China (60974139), and Science and Technology Project of Huizhou (2012P10).
- [1] H. J. Kushner and P. Dupuis, Numerical Methods for Stochastic Control Problems in Continuous Time, Springer, New York, NY, USA, 2001.
- [2] J. P. Hespanha, “A model for stochastic hybrid systems with application to communication networks,” Nonlinear Analysis: Theory, Methods and Applications, vol. 62, no. 8, pp. 1353–1383, 2005.
- [3] Y. Ji and H. J. Chizeck, “Controllability, stabilizability, and continuous-time Markovian jump linear quadratic control,” IEEE Transactions on Automatic Control, vol. 35, no. 7, pp. 777–788, 1990.
- [4] X. Mao and C. Yuan, “Asymptotic stability in distribution of stochastic differential equations with Markovian switching,” Stochastic Processes and Their Applications, vol. 103, no. 2, pp. 277–291, 2003.
- [5] L. Hu, P. Shi, and B. Huang, “Stochastic stability and robust control for sampled-data systems with Markovian jump parameters,” Journal of Mathematical Analysis and Applications, vol. 313, no. 2, pp. 504–517, 2006.
- [6] V. Dragan and T. Morozan, “Stability and robust stabilization to linear stochastic systems described by differential equations with Markovian jumping and multiplicative white noise,” Stochastic Analysis and Applications, vol. 20, no. 1, pp. 33–92, 2002.
- [7] S. P. Bhat and D. S. Bernstein, “Finite-time stability of continuous autonomous systems,” SIAM Journal on Control and Optimization, vol. 38, no. 3, pp. 751–766, 2000.
- [8] E. Moulay and W. Perruquetti, “Finite time stability and stabilization of a class of continuous systems,” Journal of Mathematical Analysis and Applications, vol. 323, no. 2, pp. 1430–1443, 2006.
- [9] E. Moulay and W. Perruquetti, “Finite time stability conditions for non-autonomous continuous systems,” International Journal of Control, vol. 81, no. 5, pp. 797–803, 2008.
- [10] E. Moulay, M. Dambrine, N. Yeganefar, and W. Perruquetti, “Finite-time stability and stabilization of time-delay systems,” Systems and Control Letters, vol. 57, no. 7, pp. 561–566, 2008.
- [11] Y. Hong, Z.-P. Jiang, and G. Feng, “Finite-time input-to-state stability and applications to finite-time control design,” SIAM Journal on Control and Optimization, vol. 48, no. 7, pp. 4395–4418, 2010.
- [12] Y. Hong, J. Huang, and Y. Xu, “On an output feedback finite-time stabilization problem,” IEEE Transactions on Automatic Control, vol. 46, no. 2, pp. 305–309, 2001.
- [13] Y. Hong, J. Wang, and D. Cheng, “Adaptive finite-time control of nonlinear systems with parametric uncertainty,” IEEE Transactions on Automatic Control, vol. 51, no. 5, pp. 858–862, 2006.
- [14] X. Huang, W. Lin, and B. Yang, “Global finite-time stabilization of a class of uncertain nonlinear systems,” Automatica, vol. 41, no. 5, pp. 881–888, 2005.
- [15] Z.-P. Jiang and I. M. Y. Mareels, “A small-gain control method for nonlinear cascaded systems with dynamic uncertainties,” IEEE Transactions on Automatic Control, vol. 42, no. 3, pp. 292–308, 1997.
- [16] Y. Wu, X. Yu, and Z. Man, “Terminal sliding mode control design for uncertain dynamic systems,” Systems and Control Letters, vol. 34, no. 5, pp. 281–287, 1998.
- [17] X. Yu and M. Zhihong, “Fast terminal sliding-mode control design for nonlinear dynamical systems,” IEEE Transactions on Circuits and Systems I, vol. 49, no. 2, pp. 261–264, 2002.
- [18] S. Yu, X. Yu, B. Shirinzadeh, and Z. Man, “Continuous finite-time control for robotic manipulators with terminal sliding mode,” Automatica, vol. 41, no. 11, pp. 1957–1964, 2005.
- [19] W. Chen and L. C. Jiao, “Finite-time stability theorem of stochastic nonlinear systems,” Automatica, vol. 46, no. 12, pp. 2105–2108, 2010.
- [20] J. Yin, S. Khoo, Z. Man, and X. Yu, “Finite-time stability and instability of stochastic nonlinear systems,” Automatica, vol. 47, no. 12, pp. 2671–2677, 2011.
- [21] R. Situ, Theory of Stochastic Differential Equations with Jumps and Applications: Mathematical and Analysis Techniques with Applications to Engineering, Springer, New York, NY, USA, 2005.
- [22] A. V. Skorohod, Asymptotic Methods in the Theory of Stochastic Differential Equations, American Mathematical Society, Providence, RI, USA, 2004.
Copyright © 2014 Ying Yang and Guopei Chen. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.