Abstract

We deal with the stabilization problem for discrete nonlinear systems. We construct a control Lyapunov function for such systems and then present a new method for constructing a continuous state feedback law.

1. Introduction

For nonlinear dynamical systems, Lyapunov function-based methods play a vital role in stability analysis, since Lyapunov functions have been used to design stabilizing feedback laws that render asymptotically controllable systems ISS with respect to actuator errors and small observation noise [1]. This is an important and challenging problem for the general class of nonlinear systems. Artstein [2] proved that a smooth control-affine system evolving on $\mathbb{R}^n$ with controls in $\mathbb{R}^m$ admits a globally stabilizing feedback law, continuous on $\mathbb{R}^n \setminus \{0\}$, if it has a smooth control Lyapunov function. Sontag [3] gave an explicit proof of Artstein's theorem by means of a "universal formula" for the stabilizing controller. Shahmansoorian et al. [4] presented a new stabilizing control law with respect to a control Lyapunov function. Many other researchers have studied control Lyapunov functions and obtained meaningful results; see, for example, [5–12].
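For the single-input continuous-time case, with $a(x) = \nabla V(x)\cdot f(x)$ and $b(x) = \nabla V(x)\cdot g(x)$ for a control-affine system $\dot{x} = f(x)+g(x)u$ and control Lyapunov function $V$, Sontag's universal formula is
\[
k(x) =
\begin{cases}
-\dfrac{a(x) + \sqrt{a(x)^{2} + b(x)^{4}}}{b(x)}, & b(x) \neq 0, \\[1ex]
0, & b(x) = 0,
\end{cases}
\]
and it is smooth away from the origin whenever $V$ is a smooth control Lyapunov function.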

In this paper, the stabilization of discrete systems is studied by means of Lyapunov functions. Xiushan et al. [13] also studied discrete systems, but the control Lyapunov function and control law they give are not easy to obtain. We present an explicit control Lyapunov function and a feedback law stabilizing a discrete-time control system. Furthermore, a new way to construct a continuous state feedback law is designed.

2. Main Results

The problem amounts to finding a feedback function , with , such that the closed-loop system is asymptotically stable at the origin. Of course, we assume that , but the origin is not an asymptotically stable equilibrium of the unforced system .

Equation (2.1) describes the behavior of the system in the sense that its evolution depends not only on the initial state but also on a sequence of input signals. In this paper, we address discrete-time nonlinear control systems of the form (2.2), which is said to be affine in the control, where and are continuous and is a sequence of input signals. The system (2.2) is globally asymptotically stabilizable at the origin if there exists a map such that the closed-loop system is globally asymptotically stable at .
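For concreteness, a standard way to write a general discrete-time system and its affine-in-control counterpart, using the assumed notation $F$, $f$, $g$, $x(k)$, $u(k)$, is
\[
x(k+1) = F\big(x(k), u(k)\big), \tag{2.1}
\]
\[
x(k+1) = f\big(x(k)\big) + g\big(x(k)\big)\,u(k), \qquad f(0) = 0, \tag{2.2}
\]
with state $x(k) \in \mathbb{R}^{n}$, input $u(k) \in \mathbb{R}^{m}$, and $f$, $g$ continuous.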

Definition 2.1. A smooth, proper, and positive definite function mapping $\mathbb{R}^n$ into $\mathbb{R}_{\ge 0}$ is said to be a control Lyapunov function for the discrete system (2.2) if and only if the associated decrease condition holds for all nonzero states.
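In the assumed notation above, and writing $V$ for the function in Definition 2.1, the usual form of the discrete-time control Lyapunov condition is
\[
\inf_{u \in \mathbb{R}^{m}} \Big[ V\big(f(x) + g(x)u\big) - V(x) \Big] < 0 \qquad \text{for all } x \neq 0,
\]
that is, at every nonzero state some admissible input strictly decreases $V$ at the next step.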

Remark 2.2. Suppose that ; then is a function of and .

Theorem 2.3. Suppose that , , and let . Assume that there exist constants such that . Then there exists a positive definite matrix P such that $V(x) = x^{\top} P x$ is a control Lyapunov function for system (2.2).

Proof. For , let . Obviously, .
(i) If does not depend on , then we only need . Suppose that are the eigenvalues of ; then . So it is easy to choose a suitable positive definite matrix satisfying the required relationship.
(ii) If , , then, for the same reason, there exists such that . Let . An easy computation shows that is equal to , and letting , a simple computation gives . Thus we obtain , so the requirement is met if we choose a suitable positive definite matrix.
Combining (i) and (ii), we can choose a suitable positive definite matrix P such that $V(x) = x^{\top} P x$ is a control Lyapunov function for system (2.2). This completes the proof.
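As a rough numerical illustration of this decrease requirement, the following sketch checks $\min_{u} V(f(x)+g(x)u) < V(x)$ by brute force for a hypothetical scalar system; the drift, input coefficient, and candidate $p$ are assumptions chosen only for the example, not the paper's data.

# Assumed illustration: check min_u V(f(x) + g(x)*u) < V(x) for x != 0,
# with a quadratic candidate V(x) = p*x**2 on a hypothetical scalar system.
import numpy as np

p = 1.0                                  # candidate positive definite "matrix" (scalar case)

def f(x):                                # assumed drift term
    return x + 0.5 * x**3

def g(x):                                # assumed input coefficient
    return 1.0

def V(x):
    return p * x**2

u_grid = np.linspace(-5.0, 5.0, 4001)    # brute-force search over the input
states = [x for x in np.linspace(-1.0, 1.0, 81) if abs(x) > 0.05]

ok = all(min(V(f(x) + g(x) * u) for u in u_grid) < V(x) for x in states)
print("decrease condition holds at every sampled state:", ok)

The brute-force minimization over a grid of inputs stands in for the explicit choice of input made in the proof.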

Corollary 2.4. If is a control Lyapunov function for the discrete-time control system (2.2), then the corresponding control law globally asymptotically stabilizes the equilibrium of system (2.2).
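For a quadratic function $V(x) = x^{\top}Px$, one natural candidate control (a sketch under the assumed affine form, not necessarily the paper's law (2.16)) is the pointwise minimizer of $V\big(f(x)+g(x)u\big)$,
\[
u(x) = -\big(g(x)^{\top} P\, g(x)\big)^{-1} g(x)^{\top} P f(x),
\]
which is well defined whenever $g(x)^{\top}Pg(x)$ is invertible and, by construction, minimizes $V$ at the next step.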

Proposition 2.5. Suppose that for any , and that is a control Lyapunov function for the discrete-time system (2.1). Then the control law (2.16) is smooth and globally asymptotically stabilizes the equilibrium of system (2.1).

Theorem 2.6. If there exists a quadratic control Lyapunov function for the discrete-time system (2.2), then there exists a continuous stabilizing feedback law such that and .

Proposition 2.7. Consider (2.17), where .
Assume that are the roots of (2.17). If the roots have only one discontinuity of the second kind, , then one can construct a continuous function such that and .

Proof. Since is the only discontinuity of the second kind of the roots of (2.17), for any there must exist such that for any ; thus, we can let . Define in the same way and let ; then it is easy to verify that the resulting function is the one we want.
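A minimal sketch of this kind of repair, assuming the roots are continuous except at a single point $x_0$: fix a small $\delta>0$ and replace the function on $(x_0-\delta, x_0+\delta)$ by linear interpolation,
\[
\tilde{u}(x) =
\begin{cases}
u(x), & |x - x_0| \ge \delta, \\[1ex]
\dfrac{(x_0 + \delta - x)\,u(x_0 - \delta) + (x - x_0 + \delta)\,u(x_0 + \delta)}{2\delta}, & |x - x_0| < \delta,
\end{cases}
\]
so that $\tilde{u}$ is continuous and agrees with $u$ outside an arbitrarily small neighbourhood of $x_0$.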

Proposition 2.8. Consider (2.21), where , , .
If is the only discontinuity of the second kind of the roots of (2.21), then one can construct a continuous function such that and .

Proof. For all , there must exist such that for any . Let , . Obviously, there exists satisfying , ; then we can let . In the same way, we obtain , and the desired function follows.
Theorem 2.6 now follows easily from Proposition 2.8.

3. Example

Consider a discrete-time nonlinear system where , . Let ; then it is easy to obtain . By Theorem 2.6, a continuous control law globally asymptotically stabilizes the equilibrium of this system.
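A minimal closed-loop sketch in the same spirit, with hypothetical data (the matrix $P$, drift, and input map below are assumptions, not the example's), simulates the pointwise minimizing control for $V(x)=x^{\top}Px$ and prints the values of $V$ along the trajectory:

# Assumed illustration: closed-loop simulation of x(k+1) = f(x(k)) + g(x(k)) u(k)
# with u(x) = -(g(x)^T P g(x))^{-1} g(x)^T P f(x), the pointwise minimizer of
# V(f(x) + g(x)u) for the quadratic V(x) = x^T P x.  All data are hypothetical.
import numpy as np

P = np.diag([2.0, 1.0])                   # assumed positive definite matrix

def f(x):                                 # assumed drift term
    return np.array([0.4 * x[0] + 0.1 * x[1]**2, 0.3 * x[0] * x[1]])

def g(x):                                 # assumed input column
    return np.array([[0.0], [1.0]])

def u_star(x):                            # minimizes (f + g u)^T P (f + g u) over u
    G = g(x)
    return -np.linalg.solve(G.T @ P @ G, G.T @ P @ f(x))

x = np.array([1.0, -2.0])
for k in range(10):
    x = f(x) + g(x) @ u_star(x)
    print(k, np.round(x, 6), "V =", float(x @ P @ x))

For this choice of data the printed values of V decrease monotonically toward zero, which is the behavior a quadratic control Lyapunov function is meant to guarantee.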