Abstract

This paper is concerned with the dynamical stability analysis of the almost periodic solution of memristive neural networks with time-varying delays. Under the framework of Filippov solutions, by applying inequality analysis techniques, the existence and asymptotically almost periodic behavior of solutions are discussed. Based on differential inclusion theory and the Lyapunov functional approach, the stability of the almost periodic solution is investigated, and a sufficient condition for the existence, uniqueness, and global exponential stability of the almost periodic solution is established. Moreover, as a special case, a condition which ensures the global exponential stability of a unique periodic solution is also presented for the considered memristive neural networks. Two examples are given to illustrate the validity of the theoretical results.

1. Introduction

Memristor (resistor with memory), first postulated by Chua in [1], is the fourth fundamental electronic component alongside the resistor, inductor, and capacitor. On May 1, 2008, the Hewlett-Packard (HP) research team announced their realization of a memristor prototype, with an official publication in Nature [2, 3]. This new circuit element is a two-terminal element, either a charge-controlled memristor or a flux-controlled memristor, and shares many properties of resistors, including the same unit of measurement (ohm). Subsequently, the memristor has received a great deal of attention from many scientists because of its potential applications in next-generation computers and powerful brain-like "neural" computers [4–14].

Recently, various memristor-based networks have been established by means of memristive circuits, and many applications have been made in science and engineering fields [15–17]; see, for example, Cserey et al., who presented simulation measurements of a memristor crossbar device and designed a PCB memristor package and the appropriate measurement board [17]. It should be pointed out that, in many applications, the existing memristor-based networks that researchers have designed have been found to be computationally restrictive.

Neural networks, such as Hopfield neural networks, cellular neural networks, Cohen-Grossberg neural networks, and bidirectional associative memory neural networks, are very important nonlinear circuit networks and, in the past few decades, have been extensively studied due to their potential applications in classification, signal and image processing, parallel computing, associative memory, optimization, cryptography, and so forth [18–27]. Many results, which deal with the dynamics of various neural networks such as stability, periodic oscillation, bifurcation, and chaos, have been obtained by applying Lyapunov stability theory; see, for example, [28–45] and the references therein. Very recently, memristor-based neural networks (memristive neural networks) have been designed by replacing the resistors in the primitive neural networks with memristors in [46–52]. As is well known, the memristor exhibits the feature of pinched hysteresis, which means that a lag occurs between the application and the removal of a field and its subsequent effect, just as the neurons in the human brain have. Because of this feature, memristive neural networks can remember their past dynamical history, store a continuous set of states, and be "plastic" according to the presynaptic and postsynaptic neuronal activity. In [46], Itoh and Chua designed a memristor cellular automaton and a memristor discrete-time cellular neural network, which can perform a number of applications such as logical operations, image processing operations, complex behaviors, higher brain functions, and the RSA algorithm. In [47], Pershin and Di Ventra constructed a simple neural network consisting of three electronic neurons connected by two memristor-emulator synapses and demonstrated experimentally the formation of associative memory in these memristive neural networks.
This experimental demonstration opens up new possibilities for understanding neural processes using memory devices, an important step toward reproducing complex learning, adaptive, and spontaneous behavior with electronic neural networks.

It is well known that, in the design of practical neural networks, the qualitative analysis of neural network dynamics plays an important role. For example, to solve problems of optimization, neural control, and signal processing, neural networks have to be designed in such a way that, for a given external input, they exhibit only one globally asymptotically/exponentially stable equilibrium point. Hence, in practical applications, it is an essential issue to discuss the stability of memristive neural networks. In [48], Hu and Wang proposed a piecewise-linear mathematical model of the memristor to characterize the pinched hysteresis feature. Based on this model, a memristor-based recurrent neural network model with time delays was given, and two sufficient conditions for the global uniform asymptotic stability of the memristor-based recurrent neural networks were obtained. In [49, 50], Wu et al. investigated the synchronization control issue for a general class of memristor-based recurrent neural networks with time delays, and some sufficient conditions were obtained to guarantee the exponential synchronization of the coupled networks based on the drive-response concept. In [51], the dynamic behaviors of a class of memristor-based Hopfield networks were analyzed, and some sufficient conditions were obtained to ensure the essential boundedness of solutions and global exponential stability. In [52], stability was considered for the memristor-based recurrent network with bounded activation functions and bounded time-varying delays in the presence of strong external stimuli, and a sufficient condition on the bounds of the stimuli was derived for the global exponential stability of memristor-based recurrent networks.

It should be noted that very little attention has been paid to the periodicity issue; in particular, to the best of our knowledge, the almost periodic dynamics of memristive neural networks with time-varying delays have never been considered in the previous literature, which motivates the work of this paper.

In this paper, our aim is to study the exponential stability of the almost periodic solution for memristive neural networks with time-varying delays. By using the concept of Filippov solutions for differential equations with discontinuous right-hand sides and inequality analysis techniques, the existence and asymptotically almost periodic behavior of solutions will be discussed. Based on differential inclusion theory, the proof of the existence of the almost periodic solution will be given. By applying the Lyapunov functional approach, a sufficient condition will be established to ensure the uniqueness and global exponential stability of the almost periodic solution for the considered memristive neural networks. As a special case, the conditions for the global exponential stability of a unique periodic solution or equilibrium point are also presented.

The rest of this paper is organized as follows. In Section 2, the model formulation and some preliminaries are given. In Section 3, the existence and asymptotically almost periodic behavior of solutions are analyzed, the existence of the almost periodic solution is proved, and the uniqueness and global exponential stability of the almost periodic solution are investigated. In Section 4, two numerical examples are presented to demonstrate the validity of the proposed results. Some conclusions are made in Section 5.

Notations. Throughout this paper, $\mathbb{R}$ denotes the set of real numbers, $\mathbb{R}^n$ denotes the $n$-dimensional Euclidean space, and $\mathbb{R}^{n\times m}$ denotes the set of all $n\times m$ real matrices. For any matrix $A$, $A^{T}$ denotes the transpose of $A$. If $A$ is a real symmetric matrix, $A>0$ ($A<0$) means that $A$ is positive definite (negative definite). Given the column vectors $x=(x_1,x_2,\ldots,x_n)^{T}$ and $y=(y_1,y_2,\ldots,y_n)^{T}$, $x^{T}y=\sum_{i=1}^{n}x_i y_i$, $|x|=(|x_1|,|x_2|,\ldots,|x_n|)^{T}$, and $\|x\|=\sqrt{x^{T}x}$. $\|A\|=\sqrt{\lambda_{\max}(A^{T}A)}$ represents the norm of $A$, where $\lambda_{\max}(A^{T}A)$ is the maximum eigenvalue of $A^{T}A$. $C([-\tau,0];\mathbb{R}^n)$ denotes the family of continuous functions $\varphi$ from $[-\tau,0]$ to $\mathbb{R}^n$ with the norm $\|\varphi\|=\sup_{-\tau\le s\le 0}\|\varphi(s)\|$. $\dot{x}(t)$ denotes the derivative of $x(t)$. Matrices, if their dimensions are not explicitly stated, are assumed to have compatible dimensions for algebraic operations.

2. Model Description and Preliminaries

The KCL equation of the $i$th subsystem of a general class of neural networks with time-varying delays can be written as
$$C_i \frac{dx_i(t)}{dt} = \sum_{j=1}^{n} \frac{f_j(x_j(t)) - x_i(t)}{R_{ij}} + \sum_{j=1}^{n} \frac{f_j(x_j(t-\tau(t))) - x_i(t)}{F_{ij}} - \frac{x_i(t)}{R_i} + I_i(t), \quad i = 1, 2, \ldots, n, \quad (1)$$
where $x_i(t)$ is the voltage of the capacitor $C_i$; $R_{ij}$ denotes the resistor between the feedback function $f_j(x_j(t))$ and $x_i(t)$; $F_{ij}$ denotes the resistor between the feedback function $f_j(x_j(t-\tau(t)))$ and $x_i(t)$; $\tau(t)$ corresponds to the transmission delay; $R_i$ represents the parallel resistor corresponding to the capacitor $C_i$; and $I_i(t)$ is the external input or bias. Let
$$d_i = \frac{1}{C_i}\Bigl(\sum_{j=1}^{n}\frac{1}{R_{ij}} + \sum_{j=1}^{n}\frac{1}{F_{ij}} + \frac{1}{R_i}\Bigr), \quad a_{ij} = \frac{1}{C_i R_{ij}}, \quad b_{ij} = \frac{1}{C_i F_{ij}}, \quad u_i(t) = \frac{I_i(t)}{C_i}; \quad (2)$$
then (1) can be rewritten as
$$\frac{dx_i(t)}{dt} = -d_i x_i(t) + \sum_{j=1}^{n} a_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij} f_j(x_j(t-\tau(t))) + u_i(t). \quad (3)$$
By replacing the resistors $R_{ij}$ and $F_{ij}$ in the primitive neural networks (1) or (3) with memristors whose memductances are $W_{ij}$ and $M_{ij}$, respectively, memristive neural networks with time-varying delays can be designed as
$$\frac{dx_i(t)}{dt} = -d_i(x_i(t))\, x_i(t) + \sum_{j=1}^{n} a_{ij}(x_i(t)) f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij}(x_i(t)) f_j(x_j(t-\tau(t))) + u_i(t), \quad (4)$$
where $d_i(x_i(t)) > 0$, $a_{ij}(x_i(t)) = W_{ij}/C_i$, $b_{ij}(x_i(t)) = M_{ij}/C_i$, and $i, j = 1, 2, \ldots, n$.

Combining the typical current-voltage characteristics of the memristor (see Figure 1 in [48]), and similarly to the discussion in [49, 50], the coefficient parameters $d_i(x_i(t))$, $a_{ij}(x_i(t))$, and $b_{ij}(x_i(t))$ of the system (4) can be modeled as
$$d_i(x_i(t)) = \begin{cases}\hat{d}_i, & |x_i(t)| \le T_i,\\ \check{d}_i, & |x_i(t)| > T_i,\end{cases} \quad a_{ij}(x_i(t)) = \begin{cases}\hat{a}_{ij}, & |x_i(t)| \le T_i,\\ \check{a}_{ij}, & |x_i(t)| > T_i,\end{cases} \quad b_{ij}(x_i(t)) = \begin{cases}\hat{b}_{ij}, & |x_i(t)| \le T_i,\\ \check{b}_{ij}, & |x_i(t)| > T_i,\end{cases} \quad (5)$$
where the switching jumps $T_i > 0$, $\hat{d}_i > 0$, $\check{d}_i > 0$, $\hat{a}_{ij}$, $\check{a}_{ij}$, $\hat{b}_{ij}$, and $\check{b}_{ij}$, $i, j = 1, 2, \ldots, n$, are constant numbers.
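The two-valued switching law in (5) is straightforward to state in code. The sketch below implements one state-dependent coefficient; the numerical values of a_hat, a_check, and the switching jump T_i are hypothetical placeholders, not parameters from the paper.

```python
# Illustrative sketch of the switching law (5): a memductance-derived
# coefficient a_ij takes one of two constant values depending on whether
# the state |x_i| exceeds the switching jump T_i. The default values below
# are hypothetical, chosen only for demonstration.
def a_ij(x_i, a_hat=0.8, a_check=-0.5, T_i=1.0):
    """State-dependent connection weight a_ij(x_i(t))."""
    return a_hat if abs(x_i) <= T_i else a_check

print(a_ij(0.5))   # |x_i| <= T_i, so the "hat" value 0.8
print(a_ij(2.0))   # |x_i| >  T_i, so the "check" value -0.5
```

The weight is piecewise constant in the state and therefore discontinuous at |x_i| = T_i, which is exactly why the Filippov framework introduced below is needed.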

The initial value associated with the system (4) is $x(s) = \varphi(s)$, $s \in [-\tau, 0]$, where $\varphi \in C([-\tau, 0]; \mathbb{R}^n)$.

Let $\overline{d}_i = \max\{\hat{d}_i, \check{d}_i\}$, $\underline{d}_i = \min\{\hat{d}_i, \check{d}_i\}$, $\overline{a}_{ij} = \max\{\hat{a}_{ij}, \check{a}_{ij}\}$, $\underline{a}_{ij} = \min\{\hat{a}_{ij}, \check{a}_{ij}\}$, $\overline{b}_{ij} = \max\{\hat{b}_{ij}, \check{b}_{ij}\}$, and $\underline{b}_{ij} = \min\{\hat{b}_{ij}, \check{b}_{ij}\}$. Notice that the system (4) is a differential equation with discontinuous right-hand sides; based on the theory of differential inclusions [53], if $x(t)$ is a solution of (4) in the sense of Filippov [54], then
$$\frac{dx_i(t)}{dt} \in -\mathrm{co}[\underline{d}_i, \overline{d}_i]\, x_i(t) + \sum_{j=1}^{n} \mathrm{co}[\underline{a}_{ij}, \overline{a}_{ij}]\, f_j(x_j(t)) + \sum_{j=1}^{n} \mathrm{co}[\underline{b}_{ij}, \overline{b}_{ij}]\, f_j(x_j(t-\tau(t))) + u_i(t). \quad (6)$$

The differential inclusion system (6) can be transformed into the vector form
$$\frac{dx(t)}{dt} \in -\mathrm{co}[\underline{D}, \overline{D}]\, x(t) + \mathrm{co}[\underline{A}, \overline{A}]\, f(x(t)) + \mathrm{co}[\underline{B}, \overline{B}]\, f(x(t-\tau(t))) + u(t), \quad (7)$$
where $x(t) = (x_1(t), \ldots, x_n(t))^{T}$, $f(x(t)) = (f_1(x_1(t)), \ldots, f_n(x_n(t)))^{T}$, $u(t) = (u_1(t), \ldots, u_n(t))^{T}$, $\underline{D} = \mathrm{diag}(\underline{d}_1, \ldots, \underline{d}_n)$, $\overline{D} = \mathrm{diag}(\overline{d}_1, \ldots, \overline{d}_n)$, $\underline{A} = (\underline{a}_{ij})_{n\times n}$, $\overline{A} = (\overline{a}_{ij})_{n\times n}$, $\underline{B} = (\underline{b}_{ij})_{n\times n}$, and $\overline{B} = (\overline{b}_{ij})_{n\times n}$. Or equivalently, there exist measurable functions $D^{*}(t) \in \mathrm{co}[\underline{D}, \overline{D}]$, $A^{*}(t) \in \mathrm{co}[\underline{A}, \overline{A}]$, and $B^{*}(t) \in \mathrm{co}[\underline{B}, \overline{B}]$, such that
$$\frac{dx(t)}{dt} = -D^{*}(t)\, x(t) + A^{*}(t)\, f(x(t)) + B^{*}(t)\, f(x(t-\tau(t))) + u(t). \quad (8)$$
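The passage from the discontinuous system (4) to the inclusion (6)-(8) can be mirrored numerically: each pair of switching values is replaced by the closed interval co[min, max], and any measurable selection from that interval is an admissible right-hand side. A minimal sketch, with hypothetical coefficient values:

```python
# Filippov convexification of a two-valued coefficient: the discontinuous
# value set {a_hat, a_check} is replaced by its closed convex hull
# co[a_min, a_max], and a selection picks one admissible value from it.
def filippov_hull(a_hat, a_check):
    """Closed interval co{a_hat, a_check} = [min, max]."""
    return (min(a_hat, a_check), max(a_hat, a_check))

def selection(a_hat, a_check, theta):
    """A convex-combination selection; theta in [0, 1] sweeps the hull."""
    lo, hi = filippov_hull(a_hat, a_check)
    return (1.0 - theta) * lo + theta * hi

lo, hi = filippov_hull(0.8, -0.5)
print(lo, hi)                      # hull endpoints
print(selection(0.8, -0.5, 0.5))   # midpoint selection
```

Any theta-dependent (indeed, any measurable) choice of selection yields a single-valued right-hand side of the form (8).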

To obtain the main results of this paper, some definitions and lemmas are introduced as follows.

Definition 1 (see [55]). A continuous function $f: \mathbb{R} \to \mathbb{R}^n$ is said to be almost periodic on $\mathbb{R}$ if, for any $\varepsilon > 0$, there exists a scalar $l = l(\varepsilon) > 0$ and, for any interval with length $l$, there exists a scalar $\omega$ in this interval, such that $\|f(t+\omega) - f(t)\| < \varepsilon$ for all $t \in \mathbb{R}$.
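A classic concrete instance of Definition 1 (an illustration, not taken from the paper) is f(t) = sin(t) + sin(sqrt(2)·t), which is almost periodic but not periodic because its two frequencies are incommensurable. The sketch below searches a grid of translates for an eps-almost period:

```python
import math

# f is almost periodic but not periodic: frequencies 1 and sqrt(2) are
# incommensurable, so no exact period exists, yet eps-almost periods do.
def f(t):
    return math.sin(t) + math.sin(math.sqrt(2.0) * t)

def find_almost_period(eps, t_grid, omega_grid):
    """Return the first omega on the grid with sup_t |f(t+omega) - f(t)| < eps."""
    for omega in omega_grid:
        ok = True
        for t in t_grid:
            if abs(f(t + omega) - f(t)) >= eps:
                ok = False
                break
        if ok:
            return omega
    return None

t_grid = [0.05 * k for k in range(1000)]         # sampled window t in [0, 50)
omega_grid = [0.1 * k for k in range(100, 900)]  # candidate translates in [10, 90)
omega = find_almost_period(0.3, t_grid, omega_grid)
print(omega)  # an eps-almost period found on the grid
```

By Definition 1, every interval of sufficient length contains such an omega; the grid search merely exhibits one of them numerically.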

Definition 2. The almost periodic solution $x^{*}(t)$ of the system (4) is said to be globally exponentially stable if there exist scalars $M > 0$ and $\varepsilon > 0$, such that
$$\|x(t) - x^{*}(t)\| \le M \|\varphi - \varphi^{*}\| e^{-\varepsilon t}, \quad t \ge 0,$$
where $x(t)$ is the solution of the system (4) with the initial value $\varphi$, and $\varphi^{*}$ is the initial value of $x^{*}(t)$. $\varepsilon$ is called the exponential convergence rate.

Definition 3 (see [55]). The solution $x(t)$ of the system (4) with the initial value $\varphi$ is said to be asymptotically almost periodic if, for any $\varepsilon > 0$, there exist scalars $T > 0$, $l = l(\varepsilon) > 0$, and $\omega$ in any interval with length $l$, such that $\|x(t+\omega) - x(t)\| < \varepsilon$ for all $t > T$.

Lemma 4 (see [29]). For any , the following inequality holds: where ,  .

Lemma 5 (see [45]). Let scalars , , and ; then

Throughout this paper, the following assumptions are made on (4): is an almost periodic function. is a nondecreasing continuous function. is an almost periodic function, and , , and are constants.

3. Main Results

In this section, the main results concerned with the existence, uniqueness, and global exponential stability of the almost periodic solution are addressed for the memristive neural network in (4).

Theorem 6. Under the assumptions , if there exists a diagonal matrix such that where , is the identity matrix, , , , , and . Then one has the following.
(1) For any initial value , there exists a solution of the memristive neural network (4) on , and this solution is asymptotically almost periodic.
(2) The memristive neural network (4) has a unique almost periodic solution which is globally exponentially stable.

Proof. We prove this theorem in four steps.
Step  1. In this step, we will prove the existence of the solution, that is, prove that the system (4) has a global solution for any initial value .
Similar to the proof of Lemma  1 in [37], under the assumptions of Theorem 6, it is easy to obtain the existence of the local solution of (4) with initial value on , where or , and is the maximal right-side existence interval of the local solution.
Due to , , by (12) we can choose constants and , such that ,  , , and
Without loss of generality, we suppose that . Let scalars , . Consider a Lyapunov functional defined by where Calculate the time derivative of along the local solution of (4) on . By (8) and Lemma 5, we have By the assumptions and , for any constant , we have From (16) and (17) and by Lemma 4, one obtains where , . By (13), we can choose suitable constants , , , and , such that This implies that Moreover, by the assumption , we can obtain that is a bounded function. Hence, there exists a constant , such that By (20) and (21), it follows that From the definition of and (22), we have Thus, This shows that the local solution of (4) is bounded on and hence is defined on . That is, the system (4) has a global solution for any initial value .
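The displayed Lyapunov functional did not survive extraction. In comparable memristive-network proofs (e.g., [49, 50]) it typically has the weighted-state-plus-delay-integral shape sketched below; this is a hedged reconstruction of the general form, not necessarily the exact functional used here:

```latex
V(t) = e^{\varepsilon t} \sum_{i=1}^{n} q_i \,\bigl|x_i(t)\bigr|
     + \sum_{i=1}^{n} \sum_{j=1}^{n} q_i\, \overline{b}_{ij}
       \int_{t-\tau(t)}^{t} e^{\varepsilon (s+\tau)} \,\bigl|f_j(x_j(s))\bigr|\, ds
```

Here the $q_i > 0$ play the role of the entries of the diagonal matrix in the theorem's condition, and $\varepsilon$ is the exponential rate; differentiating such a functional along (8) and bounding the delayed terms with Lemma 4 is what produces an inequality chain of the kind referenced in (16)-(18).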
Step  2. In this step, the global solution of the system (4) will be proved to be asymptotically almost periodic.
Let , then where Similar to , define Lyapunov functional as Calculate the derivative of along the solution of the system (25). Arguing as in Step  1, we can choose the appropriate positive constants , , , and , such that By the assumption , is an almost periodic function. Thus, by Definition 1, for any , there exists , and for any interval with length , there exists a scalar in this interval, such that It follows from (28) and (29) that , , which implies Therefore, there exists a constant ; when , we have This shows that the solution of the system (4) is asymptotically almost periodic.
Step  3. In this step, we will prove that the system (4) has an almost periodic solution.
Let be the solution of the system (4) with the initial value , then satisfies (8). Take a sequence , , . It is easy to derive that the function sequence is equicontinuous and uniformly bounded. Hence, by the Arzelà-Ascoli theorem and the diagonal selection principle, a subsequence of (still denoted by ) can be selected such that uniformly converges to a continuous function on any compact set of .
By applying Lebesgue's dominated convergence theorem on (8), we can obtain that for any and . This implies that is a solution of (4). By the result obtained in Step  2, is asymptotically almost periodic. That is, for any , there exist , , and for any interval with length , there exists a scalar in this interval, such that for all . Thus, there exists a constant ; for all and , we can get that Let in (33), it follows that for all . This shows that is the almost periodic solution of (4). The proof of the existence of the almost periodic solution has been completed.
Step  4. In this step, we will prove the uniqueness and global exponential stability of the almost periodic solution for the system (4).
Let be any solution of (4), and let be an almost periodic solution of (4). Set , then where Similar to , define a Lyapunov functional as where Arguing as in Step  1, we have Thus, by (13), we can choose appropriate constants , , , and , such that This implies that . Therefore, combined with the definition of , it follows that This shows that the almost periodic solution of the system (4) is globally exponentially stable. Consequently, the almost periodic solution is unique. This completes the proof of Theorem 6.

Notice that a periodic function can be regarded as a special almost periodic function. Hence, when the external input in the system (4) is periodic, we can get the following corollary.

Corollary 7. Suppose that the assumptions hold. If there exists a diagonal matrix such that where , is the identity matrix, and , , and , then one has the following.
(1) For any initial value , there exists a solution of the memristive neural network (4) on , and this solution is asymptotically periodic.
(2) The memristive neural network (4) has a unique periodic solution which is globally exponentially stable.

When the external input is a constant, the system (4) reduces to the network (43). Since a constant can also be regarded as a special almost periodic function, by applying Theorem 6 to the neural network (43), we have the following.

Corollary 8. Suppose that the assumptions hold. If there exists a diagonal matrix such that where , is the identity matrix, and , , and , then one has the following.
(1) For any initial value , there exists a solution of the memristive neural network (43) on .
(2) The memristive neural network (43) has a unique equilibrium point which is globally exponentially stable.

4. Illustrative Examples

In this section, two examples will be given to illustrate the effectiveness of the results obtained in this paper.

Example 1. Consider the second-order memristive neural network with time-varying delays in (4) described by where It is obvious that , . Choose the positive definite diagonal matrix , then . It is easy to check that
All conditions of Theorem 6 hold; hence the memristive neural network in this example has a unique almost periodic solution which is globally exponentially stable.
Figure 1 displays the state trajectory of the network with initial condition . It can be seen that this trajectory converges to the unique almost periodic solution of the network. This is in accordance with the conclusion of Theorem 6.
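Since the concrete parameter matrices of Example 1 were lost in extraction, the following is a self-contained sketch of how a two-neuron memristive network of the form (4) can be simulated by forward Euler with a constant delay. Every numerical value (rates, weight matrices, delay, input) is a hypothetical stand-in, chosen so that the self-feedback dominates the coupling:

```python
import math

def w(x, w_hat, w_check, T=1.0):
    # Two-valued memristive weight, as in the switching law (5).
    return w_hat if abs(x) <= T else w_check

def simulate(x0, t_end=100.0, h=0.01):
    """Forward-Euler integration of a 2-neuron delayed network of form (4)."""
    d = [1.5, 1.5]                       # self-feedback rates (dominant)
    A_hat = [[0.2, -0.3], [0.1, 0.2]]    # weights used when |x_i| <= T
    A_chk = [[0.3, -0.2], [0.2, 0.1]]    # weights used when |x_i| > T
    B_hat = [[0.1, 0.1], [-0.1, 0.1]]    # delayed-weight counterparts
    B_chk = [[0.1, -0.1], [0.1, 0.1]]
    tau = 0.5                            # constant delay for simplicity
    f = math.tanh                        # bounded activation
    n_delay = int(tau / h)
    hist = [list(x0) for _ in range(n_delay + 1)]  # constant pre-history
    x = list(x0)
    for k in range(int(t_end / h)):
        t = k * h
        xd = hist[0]                     # delayed state x(t - tau)
        # almost periodic external input u(t)
        u = [0.5 * (math.sin(t) + math.sin(math.sqrt(2.0) * t)),
             0.5 * math.cos(t)]
        new = []
        for i in range(2):
            s = -d[i] * x[i] + u[i]
            for j in range(2):
                s += w(x[i], A_hat[i][j], A_chk[i][j]) * f(x[j])
                s += w(x[i], B_hat[i][j], B_chk[i][j]) * f(xd[j])
            new.append(x[i] + h * s)
        x = new
        hist.append(list(x))
        hist.pop(0)
    return x

print(simulate([2.0, -1.0]))
```

Running the same simulation from two different initial histories and watching their difference shrink illustrates numerically the kind of convergence that Theorem 6 guarantees under its conditions.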

Example 2. Consider the third-order memristive neural network with time-varying delays in (4) described by where It is obvious that . Choose the positive definite diagonal matrix , then . It is easy to check that
All conditions of Theorem 6 hold; hence the memristive neural network in this example has a unique almost periodic solution which is globally exponentially stable.
Figure 2 displays the state trajectory of the network with initial condition . It can be seen that this trajectory converges to the unique almost periodic solution of the network. This is in accordance with the conclusion of Theorem 6.

5. Conclusion

In this paper, the exponential stability issue of the almost periodic solution for memristive neural networks with time-varying delays has been investigated. A sufficient condition has been obtained to ensure the existence, uniqueness, and global exponential stability of the almost periodic solution. As special cases, when the external input of the network is a periodic or constant function, conditions which ensure the global exponential stability of a unique periodic solution or equilibrium point have been established for the considered memristive neural networks with time-varying delays. Two illustrative examples have also been given to demonstrate the effectiveness and validity of the proposed results.

In [56], the distributed filtering issue has been studied for a class of time-varying systems over sensor networks with quantization errors and successive packet dropouts. In [57], the authors considered the exponential stabilization of a class of stochastic systems with Markovian jump parameters and mode-dependent mixed time delays. In [58], the authors discussed fuzzy-model-based robust fault detection with stochastic mixed time delays and successive packet dropouts. However, the issues of distributed filtering, stochastic stabilization, and robust fault detection have not been investigated for memristive neural networks in the existing literature. These will be the topics of our future research on memristive neural networks with mode-dependent mixed time delays and Markovian jump parameters.

Acknowledgments

This work was supported by the Natural Science Foundation of Hebei Province of China (A2011203103) and the Hebei Province Education Foundation of China (2009157).