Abstract

The resistance of a memristor depends on the past history of the input current or voltage, so it can function as a synapse in neural networks. In this paper, a novel perceptron combined with the memristor is proposed to implement combinational logic classification. The relationship between the memristive conductance change and the synaptic weight update is deduced, and the memristive perceptron model and its synaptic weight update rule are explored. The feasibility of the novel memristive perceptron for implementing combinational logic classification (NAND, NOR, XOR, and NXOR) is confirmed by MATLAB simulation.

1. Introduction

An artificial neural network is a simplified system that simulates the organizational structure, processing methods, and system functions of a biological neural system [1]. It strongly attracts researchers because of properties such as parallel processing, distributed storage, self-learning, self-adaptation, and high fault tolerance [2]. The perceptron is the simplest kind of neural network that can implement basic learning and parallel processing. It has been widely studied for classification problems involving medical images [3], text [4], patterns [5, 6], and fingerprints [7]. A large number of biological synapses respond to received signals by storing a series of continuously valued weights, so the perceptron is hard to realize with hardware circuits. The memristor, a new circuit element, is well suited to serve as a synapse in neural networks because of its nanoscale size, automatic information storage, and nonvolatility over long periods of power-down. In recent years, more and more researchers have paid attention to memristive neural networks. A novel chaotic neural network with memristors was built and applied to associative memory [8]. Long-term potentiation and long-term depression of synapses were realized using memristors [9]. Memristors were utilized as the main modules of complex bionic information-processing networks [10], and memristive neural networks were gradually introduced to build memristive cellular automata and discrete-time memristive cellular neural networks [11]. A PID controller based on a memristive CMAC network was built [12]. In addition, memristive crossbar arrays have been used to realize pulse neural networks [13], and memristors have been used to build the synapse template of cellular neural networks [14]. These results show that memristor devices are capable of emulating biological synapses by building up a correspondence between memristive conductance and synaptic weight.
The memristor is therefore an ideal choice for the electric synapse of a new neuromorphic system. In this paper, a new perceptron is built to solve combinational logic classification problems. The effectiveness of the memristive perceptron for combinational logic classification is demonstrated through MATLAB simulations.

2. Memristive Perceptron

In 2008, the memristor was first fabricated by the Hewlett-Packard team [15]. The I-V curve of the memristor under a sine-type input voltage is shown in Figure 1.

The memristive conductance change dg/dt is a function of the voltage v in a voltage-controlled memristor [16]. The function is similar to a sinh curve, and the correspondence between the memristive conductance change and the applied voltage is set as

dg/dt = α sinh(βv),   (1)

where the values of α and β depend on the kind of memristor (material, size, and manufacturing process). The relationship curve between the memristive conductance change and the voltage is shown in Figure 2. It can be seen that the conductance change dg/dt increases with the applied voltage v. If the learning error e of the perceptron is regarded as the voltage v, the memristive conductance change can correspondingly describe the synaptic weight update. Thus, the correspondence between the memristive conductance change and the synaptic weight update can be built.
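The sinh-type relationship between conductance change and applied voltage can be sketched numerically. The parameter values below are illustrative assumptions; in a real device, α and β would be extracted from measurements of the specific memristor:

```python
import numpy as np

# ALPHA and BETA are illustrative assumptions; in a real device they depend
# on the memristor's material, size, and manufacturing process.
ALPHA = 1e-4
BETA = 3.0

def conductance_change_rate(v, alpha=ALPHA, beta=BETA):
    """dg/dt = alpha * sinh(beta * v): near v = 0 the conductance barely
    changes, while larger voltages change it increasingly fast."""
    return alpha * np.sinh(beta * v)

# The change rate grows monotonically and is odd-symmetric in v.
volts = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
rates = conductance_change_rate(volts)
```

The monotonic, near-zero-flat shape is what makes the mapping to a weight update attractive: a small error (voltage) barely disturbs the stored weight, while a large error drives a fast correction.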

The memristor model is used as the synapse of the perceptron to build the memristive perceptron model shown in Figure 3, whose mathematical description is as follows:

h_j = f( Σ_{i=1}^{n} w_{ij} x_i − θ_j ),  j = 1, 2, …, m,
y_k = f( Σ_{j=1}^{m} w_{jk} h_j − θ_k ),  k = 1, 2, …, l,   (2)

where x_i is the input of neuron i; h_j and y_k are the outputs of hidden-layer neuron j and output-layer neuron k; n, m, and l are the numbers of input-layer, hidden-layer, and output-layer neurons, respectively; w_{ij} represents the memristive synapse weight from input-layer neuron i to hidden-layer neuron j; w_{jk} represents the memristive synapse weight from hidden-layer neuron j to output-layer neuron k; θ_j and θ_k are the thresholds of hidden-layer neuron j and output-layer neuron k, respectively; and f is the step function, whose mathematical expression is

f(s) = 1 if s ≥ 0, and f(s) = 0 if s < 0.   (3)
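A minimal sketch of this forward pass, with the memristive conductances playing the role of the weight matrices. The hand-picked weights in the usage example are purely an illustration (decomposing XOR as AND of NAND and OR), not values produced by the training procedure described later:

```python
import numpy as np

def step(s):
    """Step activation f: 1 if s >= 0, else 0."""
    return np.where(s >= 0, 1, 0)

def forward(x, W_ih, theta_h, W_ho, theta_o):
    """Two-layer perceptron forward pass:
    h_j = f(sum_i w_ij * x_i - theta_j), y_k = f(sum_j w_jk * h_j - theta_k).
    The weight matrices stand in for memristive conductances."""
    h = step(W_ih @ x - theta_h)
    y = step(W_ho @ h - theta_o)
    return h, y

# Hand-picked illustrative weights: hidden neuron 1 computes NAND,
# hidden neuron 2 computes OR, and the output neuron ANDs them -> XOR.
W_ih = np.array([[-1.0, -1.0], [1.0, 1.0]])
theta_h = np.array([-1.5, 0.5])
W_ho = np.array([[1.0, 1.0]])
theta_o = np.array([1.5])

outs = [int(forward(np.array(x, dtype=float), W_ih, theta_h, W_ho, theta_o)[1][0])
        for x in [(0, 0), (0, 1), (1, 0), (1, 1)]]
```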

3. Weight Update Rule of Memristive Synapses

If the memristive perceptron is applied to combinational logic classification, a weight update rule should be set. In expression (1), when Δt is very small, dg can be replaced by the increment Δg:

Δg = α sinh(βv) Δt.   (4)

Let the memristive conductance correspond to the synaptic weight of the perceptron, and take the BP learning rule as the prototype. Then the weight update rule of the memristive perceptron, driven by a given monitoring signal, can be set as

Δw = η e x = η (d − y) x,   (5)

where η is the learning rate, d is the monitoring signal, y is the output of the memristive perceptron, e = d − y is the error of the memristive perceptron, and x is the input of the memristive perceptron; let v = e and Δw = Δg, so that the weight update is realized by the memristive conductance change in (4).
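A single application of this rule can be sketched as follows. The threshold adjustment mirrors the standard perceptron rule and is an assumption here, since the text states only the weight update; the sample values are likewise illustrative:

```python
import numpy as np

def step(s):
    return 1 if s >= 0 else 0

def update(w, theta, x, d, eta=0.2):
    """One application of the rule delta_w = eta * e * x with e = d - y.
    In hardware, delta_w corresponds to the conductance change produced by
    applying a voltage proportional to the error e across the memristor."""
    x = np.asarray(x, dtype=float)
    y = step(np.dot(w, x) - theta)
    e = d - y
    return w + eta * e * x, theta - eta * e, e

w = np.array([0.1, -0.3])
theta = 0.05
# A correctly classified sample produces e = 0 and leaves w unchanged,
# i.e. no voltage is applied and the memristive conductance stays put.
w2, th2, e = update(w, theta, [1.0, 0.0], d=1)
```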

4. Memristive Perceptron Implements Combinational Logic Classification

The combinational logic operators include the NAND, NOR, XOR, and NXOR operations. Their graphic symbols are shown in Figure 4 and the corresponding truth table is shown in Table 1.

4.1. Realization of Combinational Logic NAND and NOR Classification

It can be seen from Table 1 that the combinational logic NAND and NOR classifications are linearly separable problems, so they can be realized by a two-input single-layer memristive perceptron model. The threshold and the learning rate are set accordingly. When the number of cycles is set to 30 and the initial weight is set to a random nonzero number, the experimental result is shown in Figure 5(a), where "△" denotes that the output is logic "1" and the other marker denotes that the output is logic "0." The classification of the logic NAND operator is realized efficiently. Similarly, with a suitable threshold and learning rate and the number of cycles set to 30, the classification of the logic NOR operator is also realized efficiently; the experimental result is shown in Figure 5(b).
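The single-layer training loop can be sketched as below for NAND. The learning rate, initialization range, and random seed are illustrative assumptions (the paper's exact values are not reproduced here); only the number of cycles, 30, and the random nonzero initial weights follow the setup above:

```python
import numpy as np

def step(s):
    return 1 if s >= 0 else 0

def train(X, d, eta=0.2, epochs=30, seed=1):
    """Single-layer perceptron trained with delta_w = eta * e * x,
    starting from random nonzero weights, for a fixed number of cycles."""
    rng = np.random.default_rng(seed)
    w = rng.uniform(-0.5, 0.5, size=X.shape[1])
    theta = rng.uniform(-0.5, 0.5)
    for _ in range(epochs):
        for x, t in zip(X, d):
            e = t - step(np.dot(w, x) - theta)
            w = w + eta * e * x
            theta = theta - eta * e
    return w, theta

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
nand = [1, 1, 1, 0]
w, theta = train(X, nand)
# NAND is linearly separable, so by the perceptron convergence theorem the
# rule converges well within 30 cycles and reproduces the truth table.
preds = [step(np.dot(w, x) - theta) for x in X]
```

Training NOR is identical with targets [1, 0, 0, 0].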

4.2. Realization of Combinational Logic XOR and NXOR Classification

It can be seen from Table 1 that the combinational logic XOR and NXOR classifications are linearly inseparable problems. As a single-layer memristive perceptron is unable to solve nonlinear classification problems [17], a double-layer memristive perceptron model is needed to realize the logic XOR and NXOR classifications. It is equivalent to two parallel single-layer memristive perceptrons in the hidden layer feeding a single output perceptron, so it can implement the logic XOR and NXOR classification problems.

The truth table of XOR and NXOR with the monitoring signals is shown in Table 2. One monitoring signal is assigned to the first perceptron in the hidden layer and another to the second perceptron in the hidden layer; both of the resulting subproblems are linearly separable, so the memristive perceptrons in the hidden layer can be used to realize their classification. With suitable thresholds and learning rate, the number of cycles set to 30, and random nonzero initial weights, the experimental results are shown in Figures 6(a) and 6(b). The outputs of the two hidden-layer perceptrons are then taken as the inputs of the output-layer memristive perceptron, whose output is the final result. In the MATLAB program, with a suitable threshold and learning rate and the number of cycles set to 30, the experimental result is shown in Figure 6(c); the XOR classification is realized. The procedure for logic NXOR is the same as that for logic XOR: with the corresponding settings, the experimental result is shown in Figure 6(d), and the NXOR classification is realized.
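The double-layer procedure can be sketched by training each hidden perceptron on its own linearly separable monitoring signal and then training the output perceptron on the hidden outputs. The decomposition of XOR into NAND and OR monitoring signals below is an assumption for illustration; the exact signals of Table 2 are not reproduced here, and the hyperparameters are likewise illustrative:

```python
import numpy as np

def step(s):
    return 1 if s >= 0 else 0

def train(X, d, eta=0.2, epochs=30, seed=2):
    """Single-layer perceptron trained with delta_w = eta * e * x."""
    rng = np.random.default_rng(seed)
    w = rng.uniform(-0.5, 0.5, size=X.shape[1])
    theta = rng.uniform(-0.5, 0.5)
    for _ in range(epochs):
        for x, t in zip(X, d):
            e = t - step(np.dot(w, x) - theta)
            w = w + eta * e * x
            theta = theta - eta * e
    return w, theta

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
d1 = [1, 1, 1, 0]  # assumed monitoring signal for hidden neuron 1 (NAND)
d2 = [0, 1, 1, 1]  # assumed monitoring signal for hidden neuron 2 (OR)
w1, t1 = train(X, d1)
w2, t2 = train(X, d2)

# The hidden outputs become the inputs of the output-layer perceptron.
H = np.array([[step(np.dot(w1, x) - t1), step(np.dot(w2, x) - t2)] for x in X],
             dtype=float)
w3, t3 = train(H, [0, 1, 1, 0])  # XOR targets; use [1, 0, 0, 1] for NXOR
y = [step(np.dot(w3, h) - t3) for h in H]
```

Each stage is a linearly separable problem, so all three perceptrons converge within the 30 cycles.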

Through the above theoretical analysis and simulation experiments, the results of the whole combinational logic classification are summarized in Table 3. It can be seen that the NAND and NOR classifications are realized by the single-layer memristive perceptron after one learning pass, while the XOR and NXOR classifications are realized by the double-layer memristive perceptron after eight learning passes. The combinational logic classification can thus be realized effectively by the proposed memristive perceptron.

5. Test Validity of Memristive Perceptron in Combinational Logic Classification

The experiments above confirm that the memristive perceptrons can achieve the standard combinational logic classification. In practice, however, ideal inputs cannot be obtained, so test signals containing errors are introduced to verify the validity of the memristive perceptrons in combinational logic classification.

Let the two test input signals be as follows: the first input is [−0.122, 0.128, 1.135, 0.998, 0.102, 0.125, 1.023, 0.962, 0, 0.15, 1.02, 1.08] and the second input is [−0.181, 1.028, 0.126, 0.988, −0.021, 1.132, 0.036, 1.168, 0.111, 1.021, 0.041, 1.068]. The memristive perceptrons of Section 4 are tested with these signals. The classification results obtained by MATLAB simulation are shown in Figure 7.
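The noisy-input test can be sketched for the NAND perceptron: train on the ideal truth table, then classify the error-bearing test signals above. The hyperparameters are illustrative assumptions, and with an arbitrary seed the trained boundary may sit close to a corner, so this sketch does not guarantee that every noisy point is classified exactly as in Figure 7:

```python
import numpy as np

def step(s):
    return 1 if s >= 0 else 0

def train(X, d, eta=0.2, epochs=30, seed=3):
    """Single-layer perceptron trained with delta_w = eta * e * x."""
    rng = np.random.default_rng(seed)
    w = rng.uniform(-0.5, 0.5, size=X.shape[1])
    theta = rng.uniform(-0.5, 0.5)
    for _ in range(epochs):
        for x, t in zip(X, d):
            e = t - step(np.dot(w, x) - theta)
            w = w + eta * e * x
            theta = theta - eta * e
    return w, theta

# Train on the ideal NAND truth table.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
w, theta = train(X, [1, 1, 1, 0])

# The paper's noisy test signals: each pair lies near one of the four corners.
a = [-0.122, 0.128, 1.135, 0.998, 0.102, 0.125, 1.023, 0.962, 0, 0.15, 1.02, 1.08]
b = [-0.181, 1.028, 0.126, 0.988, -0.021, 1.132, 0.036, 1.168, 0.111, 1.021, 0.041, 1.068]
preds = [step(w[0] * u + w[1] * v - theta) for u, v in zip(a, b)]
```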

The above experiments show that even when test signals containing errors are used as inputs, the memristive perceptrons can still effectively realize the combinational logic classification.

6. Conclusions

The relationship between the synaptic weight update in the perceptron and the memristive conductance change has been established by theoretical deduction, and a new synaptic weight update rule has been proposed. A single-layer memristive perceptron is proposed to realize the classification of the linearly separable logic NAND and NOR, and a double-layer memristive perceptron is built to realize the classification of the linearly inseparable logic XOR and NXOR. The experiments show that the memristive perceptron can effectively implement combinational logic classification. The memristive perceptron has a simple structure and high integration, so it is easy to implement as a low-power circuit. With more complex memristive neural networks, more complex problems in the artificial intelligence field can be solved.

Acknowledgments

The work was supported by the National Natural Science Foundation of China (Grant nos. 60972155, 61101233), the Fundamental Research Funds for the Central Universities (Grant nos. XDJK2012A007, XDJK2013B011), the University Excellent Talents Supporting Foundations of Chongqing (Grant no. 2011-65), the University Key Teacher Supporting Foundations of Chongqing (Grant no. 2011-65), the Technology Foundation for Selected Overseas Chinese Scholars, Ministry of Personnel in China (Grant no. 2012-186), and the Spring Sunshine Plan Research Project of the Ministry of Education of China (Grant no. z2011148).