Abstract

We first propose the concept of almost periodic time scales and then give the definition of almost periodic functions on almost periodic time scales, then by using the theory of calculus on time scales and some mathematical methods, some basic results about almost periodic differential equations on almost periodic time scales are established. Based on these results, a class of high-order Hopfield neural networks with variable delays are studied on almost periodic time scales, and some sufficient conditions are established for the existence and global asymptotic stability of the almost periodic solution. Finally, two examples and numerical simulations are presented to illustrate the feasibility and effectiveness of the results.

1. Introduction

It is well known that in celestial mechanics, almost periodic solutions and stable solutions to differential equations or difference equations are intimately related. In the same way, stable electronic circuits, ecological systems, neural networks, and so forth exhibit almost periodic behavior. A vast amount of researches have been directed toward studying these phenomena (see [16]). Also, the theory of calculus on time scales (see [7] and references cited therein) was initiated by Stefan Hilger in his Ph.D. thesis in 1988 [8] in order to unify continuous and discrete analysis, and it has a tremendous potential for applications and has recently received much attention since his foundational work. Therefore, it is meaningful to study that on time scales which can unify the continuous and discrete situations. However, there are no concepts of almost periodic time scales and almost periodic functions on time scales, so that it is impossible for us to study almost periodic solutions to differential equations on time scales.

Motivated by the above, the main purpose of this paper is to propose the concept of almost periodic time scales and then give the definition of almost periodic functions on almost periodic time scales, then establish some basic results about almost periodic differential equations on almost periodic time scales by using the theory of calculus on time scales and some mathematical methods. Furthermore, based on these results, as an application, we consider the following high-order Hopfield neural networks with variable delays on time scales: where corresponds to the number of units in a neural network, corresponds to the state vector of the th unit at the time , represents the rate with which the th unit will reset its potential to the resting state in isolation when disconnected from the network and external inputs, and are the first- and second-order connection weights of the neural network, and correspond to the transmission delays, denote the external inputs at time , and and are the activation functions of signal transmission.

2. Almost Periodic Differential Equations on Time Scales

In this section, we will establish some basic results about almost periodic differential equations on almost periodic time scales.

Let be a nonempty closed subset (time scale) of . The forward and backward jump operators and the graininess are defined, respectively, by

A point is called left-dense if and , left-scattered if , right-dense if and , and right-scattered if . If has a left-scattered maximum , then , otherwise . If has a right-scattered minimum , then , otherwise .

A function is right-dense continuous provided it is continuous at right-dense point in and its left-side limits exist at left-dense points in . If is continuous at each right-dense point and each left-dense point, then is said to be a continuous function on .

For and , we define the delta derivative of , to be the number (if it exists) with the property that for a given , there exists a neighborhood of such that for all .

Let be right-dense continuous, if , then we define the delta integral by

A function is called regressive provided for all . The set of all regressive and rd-continuous functions will be denoted by . We define the set .

If is a regressive function, then the generalized exponential function is defined by for all , with the cylinder transformation

Definition 2.1 (see [7]). Let are two regressive functions, define

Lemma 2.2 (see [7]). Assume that be two regressive functions, then(i) and ;(ii);(iii);(iv);(v);(vi)if , then .

Definition 2.3. A subset of is called relatively dense if there exists a positive number such that for all . The number is called the inclusion length.

Definition 2.4. Let be a collection of sets which is constructed by subsets of . A time scale is called an almost periodic time scale with respect to , if and is called the smallest almost periodic set of .

Remark 2.5. If , then , is called the smallest almost periodic set of . If is a set which is constructed by absolute values of all the elements in , that is , obviously, is the smallest positive almost periodic set of the time scale . Let , is called the smallest positive period of a time scale with respect to . It is easy to see that this definition includes the concept of periodic time scale and is proper (see [9]).
Throughout this paper, we always restrict our discussion on almost periodic time scales. In this section, we use the notation to denote a norm of .

Definition 2.6. Let be an almost periodic time scale with respect to . A function is called almost periodic if for any given , the set is relatively dense in ; that is, for any given , there exists an such that each interval of length contains at least one satisfying The set is called -translation set of , is called -translation number of , and is called contain interval length of .
Obviously, . So if , then we can discuss almost periodic problems on an almost periodic time scale and it is meaningful. We denote as a set constructed by all almost periodic functions on an almost time scale .

Remark 2.7. If and , then , in this case, Definition 2.6 is equivalent to Definition 1.1 in [10]. If and , then , in this case, Definition 2.6 is equivalent to the definition of the almost periodic sequences in [11, 12].

Lemma 2.8. Let be an almost periodic function, then is bounded on .

Proof. For given , there exists a constant , such that in any interval of length , there exists , such that the inequality holds. And noticing that , then in the limited interval , there exists a number , such that . For any given , we can take , then we have . Hence, we can obtain and . So for all , we have . This completes the proof.

Similar to the case of , one can easily show the following theorems.

Theorem 2.9. If are almost periodic, then are almost periodic.

Theorem 2.10. If is almost periodic, then is almost periodic if and only if is bounded on , where .

Theorem 2.11. If is almost periodic, is uniformly continuous on the value field of , then is almost periodic.

Definition 2.12. Let , and be an rd-continuous matrix on , the linear system is said to admit an exponential dichotomy on if there exist positive constant , projection , and the fundamental solution matrix of (2.10), satisfying where is a matrix norm on , (say, e.g., , then we can take .
Consider the following almost periodic system where is an almost periodic matrix function, is an almost periodic vector function.

Lemma 2.13. If the linear system (2.10) admits exponential dichotomy, then system (2.12) has a bounded solution as follows: where is the fundamental solution matrix of (2.10).

Proof. In fact, where . So, by Lemma 2.8, is a bounded solution of system (2.12). The proof is complete.

Lemma 2.14 (see [7]). Let be a regressive -matrix-valued function on . Let and . Then the initial value problem has a unique solution . Moreover, the solution is given by

Lemma 2.15. Let be an almost periodic function on , where , , and then the linear system admits an exponential dichotomy on .

Proof. According to Lemma 2.14, one can see that where , is a fundamental solution matrix of (2.18).
Now, we prove that admits an exponential dichotomy on . In fact, noticing that , then for .
If , we have then therefore then, we can get
If , we can get Hence, set , then where . We can take , therefore, admits an exponential dichotomy on with . This completes the proof.

3. An Application

It is well known that high-order Hopfield neural networks (HHNNs) have been extensively applied in psychophysics, speech, perception, robotics, adaptive pattern recognition, vision, and image processing. There exist many results on the existence and stability of periodic and almost periodic solutions for the neural networks with delays. We refer the reader to [1327] and references therein.

In fact, both continuous and discrete systems are very important in implementing and applications. But it is troublesome to study the existence and stability of almost periodic solutions for continuous and discrete systems, respectively. Therefore, it is meaningful to study that on time scales which can unify the continuous and discrete situations (see [28, 29]). In this section, by using the concepts and results developed in previous sections, we will study the existence and global asymptotic stability of almost periodic solution of (1.1).

The system (1.1) is supplemented with initial values given by where .

For the sake of convenience, we introduce the following notations:

In this section, we assume the follwing. are almost periodic, and , for . There exist positive constants such that for . Functions satisfy the Lipschitz condition, that is, there exist constants such that .

Let with the norm . Clearly, is a Banach space.

Definition 3.1. The almost periodic solution of system (1.1) is said to be globally asymptotically stable if for any and , there exists and such that for implies for all .

Theorem 3.2. Assume that hold, and suppose that
then (1.1) has a unique almost periodic solution.

Proof. For any given , we consider the following almost periodic differential system:
Since , it follows from Lemma 2.15 that the linear system admits an exponential dichotomy on . Thus, by Lemmas 2.13 and 2.15, we obtain that system (1.1) has a bounded solution: and it follows from Theorems 2.92.11 and being almost periodic that is also almost periodic.
Denote and define a mapping . Set
Next, let us check that . For any given , it suffices to prove that ; which shows that . So is a self-mapping from to .
Next, we shall prove that is a contraction of .
For any , Because , so is a contraction of .
By the fixed point theorem of Banach space, has a unique fixed point in such that , is an almost periodic solution of system (1.1) in . The proof is complete.

Theorem 3.3. Assume that hold. Suppose further that . Let and for any , the following holds Then the almost periodic solution of system (1.1) is globally asymptotically stable.

Proof. According to Theorem 3.2, we know that (1.1) has an almost periodic solution . Suppose that is an arbitrary solution of (1.1). Then it follows from system (1.1) that for , the initial condition of (3.12) is Let , we use the Lyapunov function , where , from (3.12) we have
Then we can easily get where ,
Let and where , it is easy to see that (i); (ii).
Let be a domain in the space that contains the origin of coordinates. We choose a constant and set . Assume that statement has the following form: for any and , , there exists a constant such that .
Assume that statement is not true, that is, there exist and such that for any constant , one has . Since we conclude that, for , there exists a constant such that Every can be represented in the form . Hence, for all , we have . Integrating the inequality from , we get which is impossible. Thus, is true.
We choose an arbitrary and set . Since statement is true, for any and there exists a constant such that .
Since for all , we obtain the following inequality for the solution : Hence, for all .
Thus, for any and , there is a constant such that for all . Choosing a constant so that , we obtain . Therefore, the almost periodic solution of system (1.1) is globally asymptotically stable. This completes the proof.

4. Numerical Examples and Simulations

Consider the following neural networks system on time scales: where Obviously, satisfy and , and

Example 4.1. , We get that is satisfied, and so, we have Thus, it is easy to see that The conditions of Theorems 3.2 and 3.3 is satisfied. Hence, we know that system (4.1) has an almost periodic solution, which is asymptotically stable.
We take , and the initial condition , we can give the following numerical simulation figures to show our results are plausible and effective on time scales (see Figures 1, 2, and 3).
The numerical simulations of Figures 1, 2, and 3 in Example 4.1 show that the unique almost periodic solution is asymptotically stable, our results are effective on time scales.

Example 4.2. ,
We get that is satisfied, and so, we have Thus, it is easy to see that The conditions of Theorems 3.2 and 3.3 is satisfied. Hence, we know that system (4.1) has an almost periodic solution, which is asymptotically stable.
We take , and the initial condition ,, we can give the following numerical simulation figures to show our results are plausible and effective on time scales (see Figures 3, 4, and 5).
The numerical simulations of Figures 4, 5, and 6 in Example 4.2 show that the unique almost periodic solution is asymptotically stable, our results are effective on time scales.

5. Conclusion

In this paper, some basic results about almost periodic differential equations on almost periodic time scales are established, and the existence and global asymptotic stability of an almost periodic solution for a class of high-order Hopfield neural networks on almost periodic time scales is investigated. The results derived in this paper are meaningful.

Acknowledgment

This work is supported by the National Natural Sciences Foundation of People’s Republic of China under Grant 10971183.