Mathematical Problems in Engineering
Volume 2013 (2013), Article ID 625790, 7 pages
http://dx.doi.org/10.1155/2013/625790
Research Article

Memristive Perceptron for Combinational Logic Classification

School of Electronic and Information Engineering, Southwest University, Chongqing 400715, China

Received 1 February 2013; Accepted 22 April 2013

Academic Editor: Chuandong Li

Copyright © 2013 Lidan Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The resistance of a memristor depends on the past history of its input current or voltage, so it can function as a synapse in neural networks. In this paper, a novel perceptron built from memristors is proposed to implement combinational logic classification. The relationship between the memristive conductance change and the synaptic weight update is deduced, and the memristive perceptron model and its synaptic weight update rule are derived. The feasibility of the novel memristive perceptron for implementing combinational logic classification (NAND, NOR, XOR, and NXOR) is confirmed by MATLAB simulation.

1. Introduction

An artificial neural network is a simplified system that emulates the organizational structure, processing method, and system function of a biological neural system [1]. It attracts many researchers because of properties such as parallel processing, distributed storage, self-learning, self-adaptation, and high fault tolerance [2]. The perceptron is the simplest kind of neural network that can implement basic learning and parallel processing. It has been widely studied for classification problems involving medical images [3], text [4], patterns [5, 6], and fingerprints [7]. Because biological synapses respond to received signals by storing a continuum of weights, it is hard to realize a perceptron in hardware with conventional circuit components. The memristor, a new circuit element, is well suited to serve as a synapse in neural networks because of its nanoscale size, automatic information storage, and nonvolatility over long periods of power-down. In recent years, more and more researchers have turned to memristive neural networks. A novel chaotic neural network with memristors was built and applied to associative memory [8]. Long-term potentiation and long-term depression of synapses were realized using memristors [9]. Memristors were utilized as the main modules of complex bionic information processing networks [10], and memristive neural networks were then extended to memristive cellular automata and discrete-time memristive cellular neural networks [11]. A PID controller based on a memristive CMAC network was built [12]. In addition, memristive crossbar arrays have been used to realize spiking neural networks [13], and memristors have been used to build the synapse template of cellular neural networks [14]. These results show that memristor devices can emulate biological synapses by establishing a correspondence between memristive conductance and synaptic weight.
The memristor is thus an ideal electric synapse for new neuromorphic systems. In this paper, a new perceptron is built to solve combinational logic classification problems, and its effectiveness for combinational logic classification is demonstrated through MATLAB simulations.

2. Memristive Perceptron

In 2008, the memristor was physically realized by a Hewlett-Packard team [15]. The characteristic I-V curve of a memristor under a sine-type input voltage is shown in Figure 1.

Figure 1: Characteristic I-V curve of memristor under sine-type input voltage.

The memristive conductance change dg/dt is a function of the voltage v in a voltage-controlled memristor [16]. The function resembles a sinh curve, and the correspondence between the memristive conductance change and the synaptic weight update is set as

dg/dt = α·sinh(β·v),  (1)

where the values of α and β depend on the kind of memristor, such as its material, size, and manufacturing process. The relationship between the memristive conductance change and the voltage is shown in Figure 2. It can be seen that the conductance change dg/dt increases with the applied voltage v. If the learning error e of the perceptron is regarded as the voltage v, the memristive conductance change can correspondingly be regarded as the synaptic weight update. The correspondence between memristive conductance change and synaptic weight update can thereby be built.

Figure 2: The relationship curve of memristive conductance change and voltage.
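The sinh-type relationship in (1) can be sketched numerically. A minimal Python sketch, where the parameters ALPHA and BETA are assumed illustrative values, not device constants from the paper:

```python
import math

# Assumed illustrative memristor parameters; real values depend on the
# device material, size, and manufacturing process.
ALPHA = 1e-4
BETA = 3.0

def conductance_change_rate(v):
    """Sinh-type rate of conductance change dg/dt of a
    voltage-controlled memristor as a function of voltage v."""
    return ALPHA * math.sinh(BETA * v)
```

The rate is odd in v and grows steeply with |v|, matching the shape of the curve in Figure 2.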

The memristor model is used as the synapse of the perceptron to build the memristive perceptron model shown in Figure 3, whose mathematical description is as follows:

y_j = f( Σ_{i=1}^{n} w_{ij}·x_i − θ_j ),  j = 1, …, p,
o_k = f( Σ_{j=1}^{p} w_{jk}·y_j − θ_k ),  k = 1, …, q,

where x_i is the input of neuron i; y_j is the output of neuron j; n, p, and q are the numbers of input-layer, hidden-layer, and output-layer neurons, respectively; w_{ij} is the memristive synaptic weight from input-layer neuron i to hidden-layer neuron j; w_{jk} is the memristive synaptic weight from hidden-layer neuron j to output-layer neuron k; θ_j and θ_k are the thresholds of hidden-layer neuron j and output-layer neuron k, respectively; and f is the step function, whose mathematical expression is

f(x) = 1 if x ≥ 0, and f(x) = 0 if x < 0.

Figure 3: Memristive perceptron model.
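The layered model just described (inputs → hidden layer → output, with step activations throughout) can be written compactly. A Python sketch of the forward pass, shown here with hand-chosen weights realizing XOR purely to illustrate the structure; these weights are assumed for illustration, not learned values from the paper:

```python
def step(x):
    """Step activation: 1 if x >= 0, else 0."""
    return 1 if x >= 0 else 0

def forward(x, w_ih, theta_h, w_ho, theta_o):
    """Two-layer perceptron forward pass.
    w_ih[j][i]: weight from input i to hidden neuron j;
    w_ho[j]: weight from hidden neuron j to the output neuron."""
    hidden = [step(sum(w[i] * x[i] for i in range(len(x))) - th)
              for w, th in zip(w_ih, theta_h)]
    return step(sum(w_ho[j] * hidden[j] for j in range(len(hidden))) - theta_o)

# Hand-chosen weights realizing XOR = (x1 OR x2) AND (x1 NAND x2).
W_IH = [[1.0, 1.0], [-1.0, -1.0]]   # hidden 1 computes OR, hidden 2 computes NAND
THETA_H = [0.5, -1.5]
W_HO = [1.0, 1.0]                   # output neuron computes AND
THETA_O = 1.5
```

For example, `forward([1, 0], W_IH, THETA_H, W_HO, THETA_O)` returns 1, while the inputs (0, 0) and (1, 1) map to 0.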

3. Weight Update Rule of Memristive Synapses

If the memristive perceptron is applied to combinational logic classification, a weight update rule must be set. In expression (1), when Δt is very small, dg/dt can be replaced by Δg/Δt, so that

Δg = α·sinh(β·v)·Δt.

Let the memristive conductance correspond to the synaptic weight of the perceptron, and take the BP learning rule as the prototype. The memristive perceptron weight update rule driven by a given monitoring signal can then be set as

Δw = η·sinh(β·e)·x,  e = d − y,

where η is the learning rate (playing the role of α·Δt in the relation above), d is the monitoring signal, y is the output of the memristive perceptron, e is the error of the memristive perceptron, and x is the input of the memristive perceptron.
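A one-step Python sketch of a sinh-type update rule of this form, where the learning rate ETA and the shape parameter BETA are assumed values:

```python
import math

ETA = 0.1   # learning rate (assumed), playing the role of alpha * dt
BETA = 3.0  # memristor shape parameter (assumed)

def update(w, x, d, y):
    """One application of the sinh-type weight update rule:
    delta_w_i = ETA * sinh(BETA * e) * x_i, with error e = d - y
    between monitoring signal d and perceptron output y."""
    e = d - y
    return [wi + ETA * math.sinh(BETA * e) * xi for wi, xi in zip(w, x)]
```

Note that the update vanishes when the output matches the monitoring signal (e = 0) and grows steeply with the error, mirroring the conductance curve of Figure 2.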

4. Memristive Perceptron Implements Combinational Logic Classification

The combinational logic operators include the NAND, NOR, XOR, and NXOR operations. Their graphic symbols are shown in Figure 4 and the corresponding truth table is shown in Table 1.

Table 1: Truth table of the combinational logic operators.
Figure 4: Graphic symbols of combinational logic.

4.1. Realization of Combinational Logic NAND and NOR Classification

It can be seen from Table 1 that the combinational logic NAND and NOR classifications are linearly separable problems, so they can be realized by a two-input single-layer memristive perceptron model. With the threshold and the learning rate fixed, the number of cycles set to 30, and the initial weight set to a random nonzero number, the experiment result is shown in Figure 5(a), where "△" marks outputs of logic "1" and the other marker outputs of logic "0." The classification of the logic NAND operator is realized efficiently. Similarly, with an appropriate threshold and learning rate and the number of cycles set to 30, the classification of the logic NOR operator is realized efficiently; its experiment result is shown in Figure 5(b).

Figure 5: The classification results of NAND and NOR.
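The NAND experiment can be reproduced in miniature. A Python sketch; since the threshold, learning rate, and random initial weights used in the paper are not given here, theta = -1.5, eta = 0.5, and a fixed nonzero initial weight vector are assumed:

```python
import math

def step(x):
    return 1 if x >= 0 else 0

def train(truth, theta, eta=0.5, cycles=30):
    """Train a two-input single-layer memristive perceptron with the
    sinh-type update rule on a logic truth table."""
    w = [0.3, -0.2]  # fixed nonzero initial weights (assumed)
    for _ in range(cycles):
        for (x1, x2), d in truth:
            e = d - step(w[0] * x1 + w[1] * x2 - theta)
            w = [wi + eta * math.sinh(e) * xi for wi, xi in zip(w, (x1, x2))]
    return w

NAND = [((0, 0), 1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
w = train(NAND, theta=-1.5)
```

After training, `step(w[0]*x1 + w[1]*x2 + 1.5)` reproduces the NAND truth table; NOR trains the same way with its own truth table and a suitable threshold.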
4.2. Realization of Combinational Logic XOR and NXOR Classification

It can be seen from Table 1 that the combinational logic XOR and NXOR classifications are linearly inseparable problems. Since a single-layer memristive perceptron cannot solve nonlinear classification problems [17], a double-layer memristive perceptron model is needed to realize the logic XOR and NXOR classification. This model is equivalent to two parallel single-layer memristive perceptrons feeding an output perceptron, so it can implement the logic XOR and NXOR classification problems.

The truth table of XOR and NXOR with the monitoring signals is shown in Table 2. One monitoring signal drives the first perceptron in the hidden layer, and the other drives the second. Both subproblems are linearly separable, so the memristive perceptrons in the hidden layer can realize their classification. With the thresholds and the learning rate fixed, the number of cycles set to 30, and the initial weights set to random nonzero numbers, the experiment results are shown in Figures 6(a) and 6(b). The hidden-layer outputs are then taken as the inputs of the output-layer memristive perceptron, whose output realizes the target function. In the MATLAB program, with the output-layer threshold and learning rate fixed and the number of cycles set to 30, the experiment result is shown in Figure 6(c): the XOR classification is realized. The procedure for logic NXOR is the same as for logic XOR; with the corresponding threshold, learning rate, and 30 cycles, the experiment result is shown in Figure 6(d), and the NXOR classification is realized.

Table 2: Truth table of XOR and NXOR with monitoring signal.
Figure 6: Classification results of the combinational logic XOR and NXOR with the monitoring signal.
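The two-stage procedure can be sketched end to end in Python. Since Table 2 is not reproduced here, the monitoring signals are assumed to decompose XOR as (x1 OR x2) AND (x1 NAND x2); the thresholds, learning rate, and initial weights are likewise assumed values:

```python
import math

def step(x):
    return 1 if x >= 0 else 0

def train_unit(samples, theta, eta=0.5, cycles=30):
    """One two-input memristive perceptron trained with the sinh-type rule."""
    w = [0.3, -0.2]  # fixed nonzero initial weights (assumed)
    for _ in range(cycles):
        for (x1, x2), d in samples:
            e = d - step(w[0] * x1 + w[1] * x2 - theta)
            w = [wi + eta * math.sinh(e) * xi for wi, xi in zip(w, (x1, x2))]
    return w

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
XOR = [0, 1, 1, 0]
D1 = [0, 1, 1, 1]  # assumed monitoring signal: x1 OR x2
D2 = [1, 1, 1, 0]  # assumed monitoring signal: x1 NAND x2

w1 = train_unit(list(zip(X, D1)), theta=0.5)    # first hidden perceptron
w2 = train_unit(list(zip(X, D2)), theta=-1.5)   # second hidden perceptron
hidden = [(step(w1[0] * a + w1[1] * b - 0.5),
           step(w2[0] * a + w2[1] * b + 1.5)) for a, b in X]
w3 = train_unit(list(zip(hidden, XOR)), theta=1.5)  # output perceptron (AND)
out = [step(w3[0] * h1 + w3[1] * h2 - 1.5) for h1, h2 in hidden]
```

With these assumptions, `out` reproduces the XOR column of the truth table; NXOR follows by training the output unit on the complemented targets.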

Through the above theoretical analysis and simulation experiments, the results for the whole combinational logic classification are summarized in Table 3. The NAND and NOR classifications are realized by the single-layer memristive perceptron after learning once, and the XOR and NXOR classifications are realized by the double-layer memristive perceptron after learning eight times. The combinational logic classification can be realized effectively by the proposed memristive perceptron.

Table 3: The results of whole combinational logic classification.

5. Test Validity of Memristive Perceptron in Combinational Logic Classification

The experiments above confirm that the memristive perceptrons can achieve the standard combinational logic classification. In practice, however, ideal inputs cannot be obtained, so test signals perturbed by error are introduced to test the validity of the memristive perceptrons in combinational logic classification.

Let the two test input signal sequences be [−0.122, 0.128, 1.135, 0.998, 0.102, 0.125, 1.023, 0.962, 0, 0.15, 1.02, 1.08] and [−0.181, 1.028, 0.126, 0.988, −0.021, 1.132, 0.036, 1.168, 0.111, 1.021, 0.041, 1.068], and test the memristive perceptrons of Section 4 with them. The classification results obtained by MATLAB simulation are shown in Figure 7.

Figure 7: Classification results of combinational logic with test input signals.
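The robustness test can be sketched for the NAND perceptron. The paper's training parameters are not given, so theta = -1.5, eta = 0.4, and fixed nonzero initial weights are assumed; the twelve noisy input pairs are the test signals listed above:

```python
import math

def step(x):
    return 1 if x >= 0 else 0

def train_nand(theta=-1.5, eta=0.4, cycles=30):
    """Train the single-layer NAND perceptron on the clean truth table
    with the sinh-type update rule (parameters assumed)."""
    w = [0.3, -0.2]  # fixed nonzero initial weights (assumed)
    for _ in range(cycles):
        for (x1, x2), d in [((0, 0), 1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]:
            e = d - step(w[0] * x1 + w[1] * x2 - theta)
            w = [wi + eta * math.sinh(e) * xi for wi, xi in zip(w, (x1, x2))]
    return w

# Noisy test signals from the text (values near the ideal 0/1 logic levels).
X1 = [-0.122, 0.128, 1.135, 0.998, 0.102, 0.125, 1.023, 0.962, 0, 0.15, 1.02, 1.08]
X2 = [-0.181, 1.028, 0.126, 0.988, -0.021, 1.132, 0.036, 1.168, 0.111, 1.021, 0.041, 1.068]

w = train_nand()
outputs = [step(w[0] * a + w[1] * b + 1.5) for a, b in zip(X1, X2)]
```

With these assumed parameters, the perceptron, trained only on the clean truth table, classifies all twelve noisy pairs as NAND of their rounded 0/1 values.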

These experiments show that even when the test signals contain error, the memristive perceptrons can still effectively realize the combinational logic classification.

6. Conclusions

The relationship between the synaptic weight update of a perceptron and the memristive conductance change has been established by theoretical deduction, and a new synaptic weight update rule is proposed in this paper. A single-layer memristive perceptron is proposed to realize the classification of the linearly separable logic NAND and NOR, and a double-layer memristive perceptron is built to realize the classification of the linearly inseparable logic XOR and NXOR. The experiments show that the memristive perceptron can effectively implement combinational logic classification. The memristive perceptron has a simple structure and high integration density, so it is comparatively easy to implement as a low-power circuit. The more complex the memristive neural networks used, the more complex the problems in the artificial intelligence field that can be solved.

Acknowledgments

The work was supported by the National Natural Science Foundation of China (Grant nos. 60972155, 61101233), the Fundamental Research Funds for the Central Universities (Grant nos. XDJK2012A007, XDJK2013B011), the University Excellent Talents Supporting Foundations of Chongqing (Grant no. 2011-65), the University Key Teacher Supporting Foundations of Chongqing (Grant no. 2011-65), the Technology Foundation for Selected Overseas Chinese Scholars, Ministry of Personnel in China (Grant no. 2012-186), and the Spring Sunshine Plan Research Project of the Ministry of Education of China (Grant no. z2011148).

References

  1. M.-Y. He, Principles, Language, Design and Application of Neural Computing, Press of Xidian University, Xi'an, China, 1992.
  2. Y. Zhang, J. Wang, and Y. Xia, “A dual neural network for redundancy resolution of kinematically redundant manipulators subject to joint limits and joint velocity limits,” IEEE Transactions on Neural Networks, vol. 14, no. 3, pp. 658–667, 2003.
  3. A. Datta and B. Biswas, “A fuzzy multilayer perceptron network based detection and classification of lobar intra-cerebral hemorrhage from computed tomography images of brain,” in Proceedings of the International Conference on Recent Trends in Information Systems, pp. 257–262, 2011.
  4. A. Gkanogiannis and T. Kalamboukis, “A perceptron-like linear supervised algorithm for text classification,” in Advanced Data Mining and Applications, vol. 6440 of Lecture Notes in Computer Science, pp. 86–97, 2010.
  5. Y. C. Hu, “Pattern classification by multi-layer perceptron using fuzzy integral-based activation function,” Applied Soft Computing Journal, vol. 10, no. 3, pp. 813–819, 2010.
  6. M. Fernández-Delgado, J. Ribeiro, E. Cernadas, and S. Barro, “Fast weight calculation for kernel-based perceptron in two-class classification problems,” in Proceedings of the International Joint Conference on Neural Networks (IJCNN '10), pp. 1–6, July 2010.
  7. I. El-Feghi, A. Tahar, and M. Ahmadi, “Efficient features extraction for fingerprint classification with multi layer perceptron neural network,” in Proceedings of International Symposium on Signals, Circuits and Systems, pp. 1–4, 2011.
  8. X.-F. Hu, S.-K. Duan, and L.-D. Wang, “A novel chaotic neural network using memristors with applications in associative memory,” Abstract and Applied Analysis, vol. 2012, Article ID 405739, 19 pages, 2012.
  9. S. Choi and G. Kim, “Synaptic behaviors and modelling of a metal oxide memristive device,” Applied Physics A: Materials Science & Processing, vol. 102, no. 4, pp. 1019–1025, 2011.
  10. A. Mittal and S. Swaminathan, “Emulating reflex actions through memristors,” in Proceedings of IEEE Conference and Exhibition (GCC '11), pp. 469–472, 2011.
  11. M. Itoh and L. O. Chua, “Memristor cellular automata and memristor discrete-time cellular neural networks,” International Journal of Bifurcation and Chaos, vol. 19, no. 11, pp. 3605–3656, 2009.
  12. L.-D. Wang, X.-Y. Fang, S.-K. Duan, et al., “PID controller based on memristive CMAC network,” Abstract and Applied Analysis, vol. 2013, Article ID 510238, 6 pages, 2013.
  13. A. Afifi, A. Ayatollahi, and F. Raissi, “Implementation of biologically plausible spiking neural network models on the memristor crossbar-based CMOS/nano circuits,” in Proceedings of European Conference on Circuit Theory and Design Conference Program (ECCTD '09), pp. 563–566, August 2009.
  14. S.-Y. Gao, S.-K. Duan, and L.-D. Wang, “On memristive cellular neural network and its applications in noise removal and edge extraction,” Journal of Southwest University, vol. 33, no. 11, pp. 63–70, 2011.
  15. D. B. Strukov, G. S. Snider, D. R. Stewart, and R. S. Williams, “The missing memristor found,” Nature, vol. 453, no. 7191, pp. 80–83, 2008.
  16. A. Afifi, A. Ayatollahi, and F. Raissi, “STDP implementation using memristive nanodevice in CMOS-Nano neuromorphic networks,” IEICE Electronics Express, vol. 6, no. 3, pp. 148–153, 2009.
  17. J.-G. Yang, Artificial Neural Network Practical Guide, Press of Zhejiang University, Hangzhou, China, 2001.