Mathematical Problems in Engineering
Volume 2012 (2012), Article ID 183729, 21 pages
Almost Sure Stability and Stabilization for Hybrid Stochastic Systems with Time-Varying Delays
1School of Information Science and Technology, Donghua University, Shanghai 200051, China
2College of Information Science and Engineering, Shanxi Agricultural University, Taigu 030801, China
3Department of Applied Mathematics, Donghua University, Shanghai 200051, China
Received 21 June 2012; Accepted 1 August 2012
Academic Editor: Bo Shen
Copyright © 2012 Hua Yang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
The problems of almost sure (a.s.) stability and a.s. stabilization are investigated for hybrid stochastic systems (HSSs) with time-varying delays, where the delays in the drift part and in the diffusion part may differ. Based on the nonnegative semimartingale convergence theorem, Hölder’s inequality, Doob’s martingale inequality, and Chebyshev’s inequality, some sufficient conditions are proposed to guarantee that the underlying nonlinear hybrid stochastic delay systems (HSDSs) are a.s. stable. With these conditions, the a.s. stabilization problem for a class of nonlinear HSDSs is addressed by designing linear state feedback controllers, which are obtained in terms of the solutions to a set of linear matrix inequalities (LMIs). Two numerical simulation examples are given to show the usefulness of the derived results.
In the past decades, the problems of stability analysis and stabilization synthesis of stochastic systems have received significant attention, and many results have been reported; see, for example, [1–7] and the references therein. These problems can be treated not only in the moment sense [8–10] but also in the a.s. sense [11, 12], and in recent years much interest has been focused on a.s. stability problems for stochastic systems; see, for example, [8, 13] and the references therein.
It is well known that many dynamical systems have variable structures subject to abrupt changes in their parameters, usually caused by phenomena such as component failures or repairs, changing subsystem interconnections, and abrupt environmental disturbances. HSSs, which in this paper are regarded as stochastic systems with Markovian switching, have been used to model such phenomena; see, for example, [14–18] and the references therein. HSSs combine a continuously evolving part of the state with another part that is a Markov chain taking values in a finite discrete space. One of the important issues in the study of HSSs is the analysis of stability. In particular, a stable HSS need not have every subsystem stable; in other words, even if all the subsystems are unstable, the HSS may still be stable as a result of the Markovian switching. This reveals that the Markovian jumps play an important role in the stability analysis of HSSs. Therefore, in the past few decades, a great deal of literature has appeared on the topic of stability analysis and stabilization synthesis of HSSs; see, for example, [2, 13, 14, 19, 20].
On the other hand, time delays are frequently encountered in a variety of dynamic systems, such as nuclear reactors, chemical engineering systems, biological systems, and population dynamics models, and they are often a source of instability and poor performance. The problems of stability analysis and stabilization synthesis of HSDSs are therefore of great importance and interest. The classical efforts can be classified into two categories, namely, moment-sense criteria, see, for example, [21–23], and a.s.-sense criteria, see, for example, [24, 25]. Among the existing results, a.s. stability and stabilization of HSDSs were studied using techniques developed from earlier convergence results, and the a.s. stability analysis problem for a general class of HSDSs was obtained by extending those results to HSSs with mode-dependent interval delays. However, to the authors’ best knowledge, when different time-varying delays appear in the drift part and in the diffusion part, the a.s. stability analysis and stabilization synthesis problems for nonlinear HSDSs have not been adequately addressed and remain an interesting and challenging research topic. This situation motivates the present study.
In this paper, we are concerned with the a.s. stability analysis and stabilization synthesis problems for HSDSs. The purpose of the stability analysis is to develop conditions under which the underlying systems are a.s. stable. Following the same idea as in the stability problem, linear state feedback controllers are designed such that the special nonlinear or linear closed-loop systems are a.s. stable. Explicit expressions for the desired state feedback controllers are given by means of the solutions to a set of LMIs. Two numerical simulation examples are exploited to verify the effectiveness of the theoretical results. The main contribution of this paper is twofold: different time-varying delays in the drift part and in the diffusion part are considered for nonlinear HSDSs, and the stabilization synthesis problem for a class of nonlinear HSDSs is investigated in the a.s. sense.
This paper is organized as follows. In Section 2, we formulate some preliminaries. In Section 3, we investigate the a.s. stability for the hybrid stochastic systems with time-varying delays. In Section 4, the results of Section 3 are then applied to establish a sufficient criterion for the stabilization. In Section 5, two examples are discussed for illustration. Finally, conclusions are drawn in Section 6.
Notation 1. The notation used here is fairly standard unless otherwise specified. and denote, respectively, the dimensional Euclidean space and the set of all real matrices, and let . is a complete probability space with a natural filtration satisfying the usual conditions (i.e., it is right continuous and contains all -null sets). If are real numbers, then stands for the maximum of and , and for the minimum of and . represents the transpose of the matrix . and denote the largest and smallest eigenvalues of , respectively. denotes the Euclidean norm in . stands for the mathematical expectation, and denotes the probability. denotes the family of all continuous -valued functions on with the norm . is the family of all -measurable bounded -valued random variables . denotes the family of functions such that .
2. Problem Formulation
In this paper, let be a right-continuous Markov chain on the probability space taking values in a finite state space with generator given by where and is the transition rate from mode to mode if while . Assume that the Markov chain is independent of the Brownian motion . It is known that almost all sample paths of are right-continuous step functions with a finite number of simple jumps in any finite subinterval of .
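The generator matrix itself is not reproduced in this extraction; as an illustrative sketch, a right-continuous Markov chain of this kind can be simulated from a hypothetical generator by sampling exponential holding times and embedded-chain jump probabilities. The two-mode generator below is an assumption for illustration only, not taken from the paper:

```python
import numpy as np

def simulate_markov_chain(Q, r0, T, rng=None):
    """Simulate a right-continuous Markov chain with generator Q on [0, T].

    Q[i][j] (i != j) is the transition rate from mode i to mode j, and
    Q[i][i] = -sum_{j != i} Q[i][j].  Returns the jump times and the mode
    held from each jump time onward (a right-continuous step function).
    """
    rng = np.random.default_rng() if rng is None else rng
    Q = np.asarray(Q, dtype=float)
    t, r = 0.0, r0
    times, modes = [0.0], [r0]
    while True:
        rate = -Q[r, r]                    # total exit rate of the current mode
        if rate <= 0:                      # absorbing mode: no further jumps
            break
        t += rng.exponential(1.0 / rate)   # holding time ~ Exp(rate)
        if t >= T:
            break
        probs = Q[r].copy()
        probs[r] = 0.0
        probs /= rate                      # embedded-chain jump probabilities
        r = int(rng.choice(len(probs), p=probs))
        times.append(t)
        modes.append(r)
    return np.array(times), np.array(modes)

# Hypothetical two-mode generator [[-1, 1], [2, -2]]
times, modes = simulate_markov_chain([[-1.0, 1.0], [2.0, -2.0]], r0=0, T=10.0)
```

As the text notes, almost every sample path produced this way is a right-continuous step function with finitely many jumps on any finite interval.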
Let us consider a class of stochastic systems with time-varying delays: with initial data and , where , and are positive constants, and and are nonnegative differentiable functions which denote the time-varying delays and satisfy The nonlinear functions and satisfy the local Lipschitz condition in ; that is, for any , there is such that for all and , and moreover, with some nonnegative number .
Remark 2.1. It should be pointed out that the systems (2.2) can be seen as a special case of systems with multiple time-varying delays, which are of the form
It is easy to see that the results in this paper can also be applied to the systems (2.5) under an assumption similar to (2.4).
Let denote the family of all nonnegative functions on that are twice continuously differentiable in and once in . If , define an operator associated with (2.2) from to by
Remark 2.2. is regarded as a single notation and is defined on , while is defined on .
Definition 2.3. The system (2.2) is said to be a.s. stable if for all and
3. Main Results
Theorem 3.1. Assume that there exist nonnegative functions , , such that where and are positive numbers satisfying . Then system (2.2) is almost surely stable.
To prove this theorem, let us present the following lemmas.
Lemma 3.3 (see ). Let and be two continuous adapted increasing processes on with , let be a real-valued continuous local martingale with ., and let be a nonnegative -measurable random variable such that . Denote for all . If is nonnegative, then
where . means . In particular, if ., then,
That is, all of the three processes , and converge to finite random variables with probability one.
Proof. Fix any initial data , , and let be the bound for . For each integer , define
where we set when . Define similarly. By (2.4), we can observe that and satisfy the global Lipschitz condition and the linear growth condition. By the well-known existence-and-uniqueness theorem, there exists a unique global solution on to the equation
with initial data and .
Define the stopping time where we set as usual. It is easy to show that if , which implies that is increasing in . Letting , the property above also enables us to define for as if .
It is clear that is a unique solution of (2.2) for . To complete the proof, we only need to show . By Lemma 3.2, we have that for any , where operator is defined similarly as was defined by (2.6). By the definitions of and , if , we hence observe that By the conditions of (3.1) and (3.2), we derive that
On the other hand,
Letting and using (3.3), we obtain . Since is arbitrary, we must have . The proof is therefore complete.
Let us now begin to prove our main result.
Proof. Let for all . Inequality (3.2) implies whenever . Fix any initial value and any initial state , and for simplicity write .
By Lemma 3.2 and condition (3.1), we have Since , applying Lemma 3.3 we obtain that Define as . Then, it is obvious to see from (3.17) that
On the other hand, by (3.3) we have a.s. It is easy to find an integer such that a.s. because of . Furthermore, for any integer , we can define the stopping time where as usual. Clearly, a.s. as . Moreover, for any given , there is such that for any .
It is straightforward to see from (3.16) that a.s.; then we claim that
The rest of the proof is carried out by contradiction. That is, assuming that (3.20) is false, we have
Furthermore, there exist and such that where is a set of natural numbers and is a sequence of stopping times defined by
By the local Lipschitz condition (2.4), for any given , there exists such that for all and .
For any , let ; by Hölder’s inequality and Doob’s martingale inequality, we compute where is the indicator of set .
Since is continuous in , it must be uniformly continuous in the closed ball . For any given , we can choose such that whenever and . Furthermore, let us choose
By inequality (3.25) and Chebyshev’s inequality, we have
Meanwhile, we can also choose sufficiently small for
And then, (3.27) and (3.28) yield where .
In the following, we can obtain from (3.16) and (3.29) that
This is a contradiction. So there is an with such that
Finally, for any fixed , is bounded in . By the Bolzano-Weierstrass theorem, there is an increasing sequence such that converges to some with . Since whenever , we must have if and only if . This implies that the solution of (2.2) is a.s. stable, and the proof is therefore complete.
Remark 3.5. The techniques proposed in Theorem 3.1 can be used to deal with the a.s. stability problem for other HSDSs, such as the ones in . In a very special case when for all and , it is easy to see that , and Theorem 3.1 is exactly Theorem 2.1 in . Similarly, Theorem 2.2 in  can be generalized to system (2.2) as a LaSalle-type theorem (see [24, 26]) for HSSs with multiple time-varying delays.
4. Almost Sure Stabilization of Nonlinear HSDSs
Consider the following nonlinear HSDSs: where are known constant matrices with appropriate dimensions and represents a scalar Brownian motion (Wiener process) on that is independent of the Markov chain and satisfies: and are both functions from to which satisfy the local Lipschitz condition and the following assumptions: where, for each , , are known constant matrices with appropriate dimensions, and , are positive definite matrices.
In the sequel, we denote the matrix associated with the th mode by where the matrix could be , or .
As the given HSDSs (4.1) are nonlinear, we consider stabilizing the resulting systems by a linear state feedback controller of the form where are controller parameters to be designed.
Under the control law (4.5), the closed-loop system can be given as follows: The stabilization problem is therefore to design matrices such that the closed-loop system (4.6) is a.s. stable. In order to guarantee the solvability of this problem, the following theorem is given.
Theorem 4.1. If there exist sequences of scalars , positive definite matrices , and matrices such that the following LMIs hold, where then the controlled system (4.6) is a.s. stable and the state feedback controller is determined by
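The LMIs of Theorem 4.1 are not reproduced in this extraction; as a lightweight numerical companion, the following sketch checks only the mode-wise Lyapunov condition for a candidate set of feedback gains, i.e. that each closed-loop mode matrix is Hurwitz. This is one ingredient of such LMI conditions; the delay terms and the generator coupling between modes are omitted, and all matrices and gains below are hypothetical:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def modewise_lyapunov_check(A_modes, B_modes, K_modes):
    """For each mode i, solve (A_i + B_i K_i)^T P_i + P_i (A_i + B_i K_i) = -I
    and report whether P_i is positive definite, which holds if and only if
    the closed-loop matrix A_i + B_i K_i is Hurwitz."""
    results = []
    for A, B, K in zip(A_modes, B_modes, K_modes):
        Acl = A + B @ K
        n = Acl.shape[0]
        # solve_continuous_lyapunov(a, q) solves a X + X a^T = q
        P = solve_continuous_lyapunov(Acl.T, -np.eye(n))
        P_sym = (P + P.T) / 2.0            # symmetrize against round-off
        results.append(bool(np.all(np.linalg.eigvalsh(P_sym) > 0)))
    return results

# Hypothetical two-mode example: gains chosen so both closed loops are stable
A = [np.array([[0.0, 1.0], [1.0, 0.5]]), np.array([[0.5, 0.0], [1.0, 1.0]])]
B = [np.eye(2), np.eye(2)]
K = [np.array([[-2.0, -1.0], [-1.0, -2.5]]), np.array([[-2.5, 0.0], [-1.0, -3.0]])]
ok = modewise_lyapunov_check(A, B, K)
print(ok)
```

In the paper's setting the full LMIs are solved for the Lyapunov matrices and the gains jointly (e.g. with YALMIP, as in Section 5), rather than verified for a given candidate as above.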
Proof. Let and .
The operator has the form So where
By assumption 1, it is easy to see that we can choose and such that for all .
Noting that and , we can pre- and postmultiply (4.7) by , and using Schur complements, we can obtain where This implies
Let , and .
Clearly Moreover, by (4.24) we further obtain
The required assertion now follows from Theorem 3.1.
When the nonlinear functions vanish, the system (4.6) reduces to the linear HSDSs of the form where , and are known constant matrices with appropriate dimensions.
Then, the following corollary follows directly from Theorem 4.1.
Corollary 4.2. If there exist sequences of scalars , positive definite matrices , and matrices such that the following LMIs hold, where then the controlled system (4.19) is a.s. stable and the state feedback controller is determined by
Proof. Let and .
The operator has the form So where
It is easy to see that we can choose and such that for all .
Noting that and , we can pre- and postmultiply (4.7) by , and using Schur complements, we can obtain where
Let , and .
Clearly Moreover, by (4.24) we further obtain
The required assertion now follows from Theorem 3.1.
5. Numerical Examples
In this section we provide two examples to illustrate our results. In the following examples we assume that is a scalar Brownian motion, is a right-continuous Markov chain independent of and taking values in , and the step size . By using the YALMIP toolbox, the simulation results are shown in Figures 1–3. Figure 1 gives a portion of the state of Example 5.1 for clear display, and Figure 2 simulates the numerical results for Example 5.1; these simulation results illustrate our theoretical analysis. Following from Theorem 4.1, the simulation results for Example 5.2 can be found in Figure 3, which verify the desired results.
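The discretization behind the figures is not reproduced here; a minimal Euler-Maruyama sketch for a scalar hybrid stochastic delay system in the spirit of (2.2), with different delays in the drift and diffusion parts, might look as follows. The constant delays, mode coefficients, and generator are hypothetical stand-ins, not the paper's examples:

```python
import numpy as np

def euler_maruyama_hsds(a, b, Q, x0, r0, T, h, tau1, tau2, rng=None):
    """Euler-Maruyama scheme for the scalar hybrid stochastic delay system
        dx(t) = a[r(t)] x(t - tau1) dt + b[r(t)] x(t - tau2) dW(t),
    with constant delays tau1 (drift) and tau2 (diffusion).  The Markov
    chain r(t) is advanced one step at a time with jump probability
    Q[i][j] * h.  The initial segment is held constant at x0."""
    rng = np.random.default_rng() if rng is None else rng
    n = int(round(T / h))
    d1, d2 = int(round(tau1 / h)), int(round(tau2 / h))
    x = np.empty(n + 1)
    x[0] = x0
    r = r0
    modes = [r0]
    for k in range(n):
        xd1 = x[max(k - d1, 0)]            # delayed state in the drift
        xd2 = x[max(k - d2, 0)]            # delayed state in the diffusion
        dW = rng.normal(0.0, np.sqrt(h))   # Brownian increment over one step
        x[k + 1] = x[k] + a[r] * xd1 * h + b[r] * xd2 * dW
        u = rng.random()                   # one-step mode transition
        cum = 0.0
        for j in range(len(a)):
            if j == r:
                continue
            cum += Q[r][j] * h
            if u < cum:
                r = j
                break
        modes.append(r)
    return x, np.array(modes)

# Hypothetical two-mode system with stabilizing drift in both modes
x, modes = euler_maruyama_hsds(
    a=[-2.0, -1.5], b=[0.3, 0.2], Q=[[-1.0, 1.0], [2.0, -2.0]],
    x0=1.0, r0=0, T=10.0, h=0.01, tau1=0.1, tau2=0.2,
    rng=np.random.default_rng(0))
```

Plotting `x` against the time grid would produce trajectories of the kind shown in Figures 1–3; the paper's examples instead use the time-varying delays and coefficients specified in Examples 5.1 and 5.2.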
Example 5.1. Let
Consider scalar nonlinear HSDSs: where .
To examine the stability of system (5.2), we consider a Lyapunov function candidate as for . Then we have
By the elementary inequality for all , and , we see that inequality holds for any , where .
From inequalities (5.4)–(5.5), we have for all and . By , it is easy to see that ; then, we choose constant such that , and hence conditions of Theorem 3.1 are satisfied.
Example 5.2. Let
Consider scalar nonlinear closed-loop HSDSs: with , , , , , , , , , , , .
By Theorem 4.1 we can find the feasible solution for the a.s. stability.
6. Conclusions
In this paper, we have investigated the a.s. stability analysis and stabilization synthesis problems for nonlinear HSDSs. Some sufficient conditions have been given to guarantee that the resulting systems are a.s. stable. Under these conditions, the a.s. stabilization problem for a class of nonlinear HSDSs has been solved in terms of the solutions to a set of LMIs. Finally, the results of this paper have been demonstrated by two numerical simulation examples.
Acknowledgment
This work was supported in part by the National Natural Science Foundation of P.R. China (no. 60974030).
References
- J. Hu, Z. Wang, H. Gao, and L. K. Stergioulas, “Robust sliding mode control for discrete stochastic systems with mixed time delays, randomly occurring uncertainties, and randomly occurring nonlinearities,” IEEE Transactions on Industrial Electronics, vol. 59, no. 7, pp. 3008–3015, 2012.
- L. Hu and X. Mao, “Almost sure exponential stabilisation of stochastic systems by state-feedback control,” Automatica, vol. 44, no. 2, pp. 465–471, 2008.
- X. Li and C. E. De Souza, “Criteria for robust stability and stabilization of uncertain linear systems with state delay,” Automatica, vol. 33, no. 9, pp. 1657–1662, 1997.
- X. Li and C. E. De Souza, “Delay-dependent robust stability and stabilization of uncertain linear delay systems: a linear matrix inequality approach,” IEEE Transactions on Automatic Control, vol. 42, no. 8, pp. 1144–1148, 1997.
- S. Ma, Z. Cheng, and C. Zhang, “Delay-dependent robust stability and stabilisation for uncertain discrete singular systems with time-varying delays,” Control Theory & Applications, vol. 1, no. 4, pp. 1086–1095, 2007.
- B. Shen, Z. Wang, Y. S. Hung, and G. Chesi, “Distributed H∞ filtering for polynomial nonlinear stochastic systems in sensor networks,” IEEE Transactions on Industrial Electronics, vol. 58, no. 5, pp. 1971–1979, 2011.
- D. Yue and Q. L. Han, “Delay-dependent exponential stability of stochastic systems with time-varying delay, nonlinearity, and Markovian switching,” IEEE Transactions on Automatic Control, vol. 50, no. 2, pp. 217–222, 2005.
- L. Liu, Y. Shen, and F. Jiang, “The almost sure asymptotic stability and th moment asymptotic stability of nonlinear stochastic differential systems with polynomial growth,” IEEE Transactions on Automatic Control, vol. 56, no. 8, pp. 1985–1990, 2011.
- G. Wei, Z. Wang, H. Shu, and J. Fang, “Robust H∞ control of stochastic time-delay jumping systems with nonlinear disturbances,” Optimal Control Applications and Methods, vol. 27, no. 5, pp. 255–271, 2006.
- G. Wei, Z. Wang, and H. Shu, “Nonlinear H∞ control of stochastic time-delay systems with Markovian switching,” Chaos, Solitons and Fractals, vol. 35, no. 3, pp. 442–451, 2008.
- X. Mao, “Stochastic versions of the LaSalle theorem,” Journal of Differential Equations, vol. 153, no. 1, pp. 175–195, 1999.
- X. Mao, Y. Shen, and C. Yuan, “Almost surely asymptotic stability of neutral stochastic differential delay equations with Markovian switching,” Stochastic Processes and their Applications, vol. 118, no. 8, pp. 1385–1406, 2008.
- B. Bercu, F. Dufour, and G. G. Yin, “Almost sure stabilization for feedback controls of regime-switching linear systems with a hidden Markov Chain,” IEEE Transactions on Automatic Control, vol. 54, no. 9, pp. 2114–2125, 2009.
- Z. Lin, Y. Lin, and W. Zhang, “H∞ filtering for non-linear stochastic Markovian jump systems,” Control Theory & Applications, vol. 4, no. 12, pp. 2743–2756, 2010.
- L. Wu, D. W. C. Ho, and C. W. Li, “Stabilisation and performance synthesis for switched stochastic systems,” Control Theory & Applications, vol. 4, no. 10, pp. 1877–1888, 2010.
- L. Wu, D. W. C. Ho, and C. W. Li, “Sliding mode control of switched hybrid systems with stochastic perturbation,” Systems and Control Letters, vol. 60, no. 8, pp. 531–539, 2011.
- J. Yao, F. Lin, and B. Liu, “H∞ control for stochastic stability and disturbance attenuation in a class of networked hybrid systems,” Control Theory & Applications, vol. 5, no. 15, pp. 1698–1708, 2011.
- N. Zeng, Z. Wang, Y. Li, M. Du, and X. Liu, “A hybrid EKF and switching PSO algorithm for joint state and parameter estimation of lateral flow immunoassay models,” IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 9, no. 2, pp. 321–329, 2012.
- X. Mao, “Stability of stochastic differential equations with Markovian switching,” Stochastic Processes and their Applications, vol. 79, no. 1, pp. 45–67, 1999.
- C. Yuan and J. Lygeros, “Stabilization of a class of stochastic differential equations with Markovian switching,” Systems and Control Letters, vol. 54, no. 9, pp. 819–833, 2005.
- X. Mao, J. Lam, and L. Huang, “Stabilisation of hybrid stochastic differential equations by delay feedback control,” Systems and Control Letters, vol. 57, no. 11, pp. 927–935, 2008.
- Z. Wang, H. Qiao, and K. J. Burnham, “On stabilization of bilinear uncertain time-delay stochastic systems with Markovian jumping parameters,” IEEE Transactions on Automatic Control, vol. 47, no. 4, pp. 640–646, 2002.
- Z. Wang, Y. Liu, and X. Liu, “Exponential stabilization of a class of stochastic system with markovian jump parameters and mode-dependent mixed time-delays,” IEEE Transactions on Automatic Control, vol. 55, no. 7, pp. 1656–1662, 2010.
- L. Huang and X. Mao, “On almost sure stability of hybrid stochastic systems with mode-dependent interval delays,” IEEE Transactions on Automatic Control, vol. 55, no. 8, pp. 1946–1952, 2010.
- C. Yuan and X. Mao, “Robust stability and controllability of stochastic differential delay equations with Markovian switching,” Automatica, vol. 40, no. 3, pp. 343–354, 2004.
- X. Mao, “A note on the LaSalle-type theorems for stochastic differential delay equations,” Journal of Mathematical Analysis and Applications, vol. 268, no. 1, pp. 125–142, 2002.
- R. S. Liptser and A. N. Shiryayev, Theory of Martingales, Kluwer Academic, Dordrecht, The Netherlands, 1989.