Computational Intelligence and Neuroscience
Volume 2018, Article ID 1275290, 5 pages
https://doi.org/10.1155/2018/1275290
Research Article

Storage Capacities of Twin-Multistate Quaternion Hopfield Neural Networks

Masaki Kobayashi

Mathematical Science Center, University of Yamanashi, Takeda 4-3-11, Kofu, Yamanashi 400-8511, Japan

Correspondence should be addressed to Masaki Kobayashi; k-masaki@yamanashi.ac.jp

Received 15 August 2018; Accepted 18 September 2018; Published 1 November 2018

Academic Editor: Reinoud Maex

Copyright © 2018 Masaki Kobayashi. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

A twin-multistate quaternion Hopfield neural network (TMQHNN) is a multistate Hopfield model that can store multilevel information, such as image data. Storage capacity is an important problem for Hopfield neural networks. Jankowski et al. approximated the crosstalk terms of complex-valued Hopfield neural networks (CHNNs) by 2-dimensional normal distributions and evaluated their storage capacities. In this work, we evaluate the storage capacities of TMQHNNs based on their idea.

1. Introduction

A complex-valued Hopfield neural network (CHNN) is a multistate model of the Hopfield neural network. CHNNs can deal with multilevel information and have been applied to the storage of image data [1–7]. They have also been extended using Clifford algebra, which includes the complex field, hyperbolic algebra, and the quaternion field. Several models of hyperbolic Hopfield neural networks have been proposed [8–11]. Isokawa et al. [12] proposed quaternion Hopfield neural networks (QHNNs) employing the split activation function. Minemoto et al. [13] studied QHNNs using the polar-represented activation function. Several other models of QHNNs have also been proposed [14–17]. In this work, we study the twin-multistate quaternion Hopfield neural networks (TMQHNNs) [18]. The neuron of a TMQHNN consists of a pair of complex-valued multistate neurons. The TMQHNN requires only half the connection weight parameters of the CHNN.

Storage capacity is an important issue for Hopfield neural networks. When a training pattern is given to a Hopfield neural network, the weighted sum input is decomposed into the main and crosstalk terms. The main term enables the Hopfield neural network to memorize the training patterns, whereas the crosstalk term interferes with their storage. The storage capacity of the conventional Hopfield neural network has been investigated by evaluating the crosstalk term [19]. Jankowski et al. [1] and Kobayashi [20] applied this technique to CHNNs and rotor Hopfield neural networks (RHNNs), respectively. The RHNN is an extension of the CHNN using vectors and matrices [21]. In this work, we provide the Hebbian learning rule for TMQHNNs and evaluate their storage capacity based on Jankowski's idea. In the case of TMQHNNs, the crosstalk term is decomposed into two complex parts. By evaluating both parts, we determine the storage capacity of TMQHNNs. The theory suggests that TMQHNNs have half the storage capacity of CHNNs. In addition, we compare the storage capacities of CHNNs and TMQHNNs by computer simulation.

The rest of this paper is organized as follows: Sections 2 and 3 introduce CHNNs and TMQHNNs, respectively. Section 4 provides the Hebbian learning rule for the TMQHNN and evaluates the storage capacity. It also contains descriptions of the computer simulations conducted to verify our analysis. Section 5 concludes this paper.

2. Complex-Valued Hopfield Neural Networks

The CHNNs are briefly described [1]. Let $z_a$ and $w_{ab}$ be the state of neuron $a$ and the connection weight from neuron $b$ to neuron $a$, respectively. The weighted sum input to neuron $a$ is given by

$$I_a = \sum_{b=1}^{N} w_{ab} z_b, \tag{1}$$

where $N$ is the number of neurons. For the resolution factor $K$, we define $\theta_K = \pi / K$. For the weighted sum input $I = |I| e^{i\theta}$, where $I \ne 0$ and $0 \le \theta < 2\pi$, the complex-valued multistate activation function is defined by

$$f(I) = e^{2ki\theta_K} \quad \left( (2k-1)\,\theta_K \le \theta < (2k+1)\,\theta_K \right), \tag{2}$$

where the phase is taken modulo $2\pi$ and $k = 0, 1, \ldots, K-1$.
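For illustration, the following is a minimal Python sketch of this activation function; the function name csign, the use of NumPy, and the example values are our own choices, not notation from the original text.

```python
import numpy as np

def csign(I, K):
    """Complex-valued multistate activation of equation (2): quantize the
    phase of a nonzero complex input to one of the K states exp(2ik*theta_K),
    where theta_K = pi/K."""
    theta_K = np.pi / K
    theta = np.angle(I) % (2 * np.pi)                  # phase in [0, 2*pi)
    k = np.floor((theta + theta_K) / (2 * theta_K)) % K
    return np.exp(2j * k * theta_K)

# Example with K = 4, whose states are 1, i, -1, -i:
print(csign(1.0 + 0.2j, 4))    # close to the positive real axis -> 1
print(csign(-0.3 + 1.0j, 4))   # nearest state is i
```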

We define the set of neuron states and denote it as follows: $\mathcal{S} = \{\, e^{2ki\theta_K} \mid k = 0, 1, \ldots, K-1 \,\}$. The connection weights must satisfy the following conditions:

$$w_{ab} = \overline{w_{ba}}, \tag{3}$$

$$w_{aa} \ge 0. \tag{4}$$

Then, the CHNN converges to a fixed point.

Let $\mathbf{z}^p = (z_1^p, z_2^p, \ldots, z_N^p)$ ($p = 1, 2, \ldots, P$) be the $p$th training pattern, where $P$ is the number of training patterns. The Hebbian learning rule is defined as

$$w_{ab} = \frac{1}{N} \sum_{p=1}^{P} z_a^p \overline{z_b^p}. \tag{5}$$

Then, the connection weights satisfy conditions (3) and (4). Giving the $q$th training pattern to the CHNN, the weighted sum input to neuron $a$ is

$$I_a = \sum_{b=1}^{N} w_{ab} z_b^q \tag{6}$$

$$= z_a^q + \frac{1}{N} \sum_{p \ne q} \sum_{b=1}^{N} z_a^p \overline{z_b^p} z_b^q. \tag{7}$$

The second term of (7) is referred to as the crosstalk term. The crosstalk term interferes with the storage of training patterns. We define

$$V = \frac{\overline{z_a^q}}{N} \sum_{p \ne q} \sum_{b=1}^{N} z_a^p \overline{z_b^p} z_b^q. \tag{8}$$

Then, we have

$$I_a = (1 + V)\, z_a^q. \tag{9}$$
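As a sanity check of (5)–(9), the following sketch, under the notational assumptions above and with arbitrarily chosen N, K, and P, stores P random patterns with the Hebbian rule and verifies that the weighted sum decomposes as $I_a = (1 + V) z_a^q$ with a small crosstalk part.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, P = 200, 4, 10
theta_K = np.pi / K

# P random training patterns of K-state complex neurons (rows are patterns).
Z = np.exp(2j * theta_K * rng.integers(0, K, size=(P, N)))

# Hebbian rule (5): w_ab = (1/N) sum_p z_a^p conj(z_b^p).
W = Z.T @ Z.conj() / N

q = 0
I = W @ Z[q]                   # weighted sum inputs, equations (6)-(7)
V = I * Z[q].conj() - 1.0      # normalized crosstalk, equations (8)-(9)

# The pattern is a fixed point when |arg(1 + V)| < theta_K for every neuron.
print("max |V| =", np.abs(V).max())
print("fixed point:", np.all(np.abs(np.angle(1.0 + V)) < theta_K))
```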

If $|\arg(1 + V)| < \theta_K$, then we have $f(I_a) = z_a^q$. Therefore, if $|\arg(1 + V)| < \theta_K$ for all $a$, the $q$th training pattern is a fixed point. We regard $V$ as the summation of $NP$ random variables of modulus $1/N$ for simplicity, although the summation consists of exactly $N(P-1)$ terms. The real and imaginary parts of each random variable have the equal variance $\sigma^2 = 1/(2N^2)$ and do not have correlations. Setting $\alpha = P/N$, $V$ is regarded as the summation of $NP$ random variables. Then, we have

$$NP\sigma^2 = \frac{P}{2N} = \frac{\alpha}{2}. \tag{10}$$

Let $V^R$ and $V^I$ be the real and imaginary parts of $V$, respectively. From the central limit theorem, we have

$$V^R, V^I \sim N\!\left(0, \frac{\alpha}{2}\right). \tag{11}$$

For the joint probability density of $V = V^R + iV^I$, we have

$$p(V) = \frac{1}{\pi\alpha} \exp\left( -\frac{|V|^2}{\alpha} \right). \tag{12}$$
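The variance estimate can also be checked empirically. This sketch (our own construction, with arbitrarily chosen N, K, P, and trial count) samples the crosstalk V of one neuron over many random pattern sets and compares the variances of its real and imaginary parts with $\alpha/2$.

```python
import numpy as np

rng = np.random.default_rng(1)
N, K, P, trials = 200, 8, 20, 2000
theta_K = np.pi / K
alpha = P / N

samples = []
for _ in range(trials):
    Z = np.exp(2j * theta_K * rng.integers(0, K, size=(P, N)))
    W = Z.T @ Z.conj() / N              # Hebbian rule (5)
    I = W @ Z[0]                        # recall the first pattern
    samples.append((I * Z[0].conj() - 1.0)[0])   # crosstalk V of neuron 0

samples = np.array(samples)
# Theory: V^R and V^I are approximately N(0, alpha/2).
print("empirical var(V^R):", samples.real.var())   # ~ alpha/2 = 0.05
print("empirical var(V^I):", samples.imag.var())   # ~ alpha/2 = 0.05
```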

3. Twin-Multistate Quaternion Hopfield Neural Networks

A quaternion is expressed as $q = q_0 + q_1 i + q_2 j + q_3 k$ by using real numbers $q_0, q_1, q_2$, and $q_3$. The imaginary units $i, j$, and $k$ satisfy the following properties:

$$i^2 = j^2 = k^2 = -1, \quad ij = -ji = k, \quad jk = -kj = i, \quad ki = -ik = j. \tag{13}$$

The quaternions satisfy the associative and distributive laws. For a complex number $c$, we have the important equality

$$jc = \bar{c}\, j. \tag{14}$$

Putting $c = q_0 + q_1 i$ and $d = q_2 + q_3 i$, the quaternion $q$ is described as $q = c + dj$. For quaternions $q = c + dj$ and $q' = c' + d'j$, the addition and multiplication are described as

$$q + q' = (c + c') + (d + d')\,j, \tag{15}$$

$$qq' = \left(cc' - d\overline{d'}\right) + \left(cd' + d\overline{c'}\right)j. \tag{16}$$

The conjugate of $q$ is defined as

$$\bar{q} = \bar{c} - dj. \tag{17}$$

Then, we have the equality

$$q\bar{q} = \bar{q}q = |c|^2 + |d|^2. \tag{18}$$
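These rules are easy to exercise numerically. In the sketch below (our own code, not from the paper), a quaternion is represented by the complex pair (c, d) of $q = c + dj$, and the multiplication and conjugation rules (16)–(17) are applied to confirm equality (18).

```python
import numpy as np

def qmul(q, r):
    """Quaternion product in complex-pair form, equation (16):
    (c + dj)(c' + d'j) = (cc' - d conj(d')) + (cd' + d conj(c')) j."""
    c, d = q
    e, f = r
    return (c * e - d * np.conj(f), c * f + d * np.conj(e))

def qconj(q):
    """Quaternion conjugate in complex-pair form, equation (17)."""
    c, d = q
    return (np.conj(c), -d)

q = (1 + 2j, 3 - 1j)        # q = c + dj with c = 1 + 2i, d = 3 - i
print(qmul(q, qconj(q)))    # -> ((15+0j), 0j): |c|^2 + |d|^2 = 15, equality (18)
```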

In the TMQHNNs, the neuron states and connection weights are represented by quaternions. These neuron states and connection weights are denoted in the same way as those of CHNNs. The number of neurons in a TMQHNN is denoted as $N/2$, so that a TMQHNN and a CHNN with $N$ neurons carry the same number of complex components. The weighted sum input to neuron $a$ is given by

$$I_a = \sum_{b=1}^{N/2} w_{ab} x_b. \tag{19}$$

For the weighted sum input $I = I^1 + I^2 j$, where $I^1$ and $I^2$ are complex numbers, the activation function is defined as

$$g(I) = f\!\left(I^1\right) + f\!\left(I^2\right) j. \tag{20}$$

Therefore, the set of neuron states is $\mathcal{S} + \mathcal{S}j = \{\, s + tj \mid s, t \in \mathcal{S} \,\}$. The connection weights must satisfy conditions (3) and (4). Then, the TMQHNN converges to a fixed point.
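As a small illustration (again our own sketch), the twin-multistate activation simply applies the CHNN activation to each complex component; csign below repeats the activation sketched in Section 2.

```python
import numpy as np

def csign(I, K):
    """Complex-valued multistate activation of equation (2)."""
    theta_K = np.pi / K
    k = np.floor((np.angle(I) % (2 * np.pi) + theta_K) / (2 * theta_K)) % K
    return np.exp(2j * k * theta_K)

def twin_activation(I, K):
    """Twin-multistate activation of equation (20): g(I) = f(I^1) + f(I^2) j,
    with the quaternion I given as the complex pair (I^1, I^2)."""
    I1, I2 = I
    return (csign(I1, K), csign(I2, K))

# Each complex part is quantized independently; with K = 4:
print(twin_activation((0.9 + 0.1j, -1.2 + 0.1j), 4))   # -> (1, -1) approximately
```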

4. Storage Capacity of Twin-Multistate Quaternion Hopfield Neural Networks

We provide the Hebbian learning rule for TMQHNNs. Let $\mathbf{x}^p = (x_1^p, x_2^p, \ldots, x_{N/2}^p)$ ($p = 1, 2, \ldots, P$) be the $p$th training pattern, where $P$ is the number of training patterns. The Hebbian learning rule is given by

$$w_{ab} = \frac{1}{N} \sum_{p=1}^{P} x_a^p \overline{x_b^p}. \tag{21}$$

Then, the connection weights satisfy conditions (3) and (4). Giving the $q$th training pattern to the TMQHNN, the weighted sum input to neuron $a$ is

$$I_a = \sum_{b=1}^{N/2} w_{ab} x_b^q = x_a^q + \frac{1}{N} \sum_{p \ne q} \sum_{b=1}^{N/2} x_a^p \overline{x_b^p} x_b^q. \tag{22}$$

The second term of (22) is also referred to as the crosstalk term and interferes with the storage of training patterns. We denote the crosstalk term by $C_a$ and decompose this quaternion into a pair of complex numbers by $C_a = C_a^1 + C_a^2 j$ to investigate the storage capacity. Then, writing $x_a^q = s_a^q + t_a^q j$, we have

$$I_a = \left( s_a^q + C_a^1 \right) + \left( t_a^q + C_a^2 \right) j. \tag{23}$$

We define

$$V^1 = \overline{s_a^q}\, C_a^1, \qquad V^2 = \overline{t_a^q}\, C_a^2. \tag{24}$$

Then, we have

$$I_a = \left( 1 + V^1 \right) s_a^q + \left( 1 + V^2 \right) t_a^q\, j. \tag{25}$$

If $|\arg(1 + V^1)| < \theta_K$ and $|\arg(1 + V^2)| < \theta_K$, then we have $g(I_a) = x_a^q$. We regard $V^1$ and $V^2$ as the summations of random variables, as in Section 2. Then, $V^1$ and $V^2$ follow the same distributions. Thus, we can discuss only $V^1$. If the TMQHNN is used instead of the CHNN, only $N/2$ neurons are required, since a twin-multistate quaternion neuron consists of two complex-valued multistate neurons. Each quaternionic random variable has modulus $2\sqrt{2}/N$, since each neuron state has modulus $\sqrt{2}$, and its first complex component contributes variance $2/N^2$ to each of the real and imaginary parts of $V^1$. Setting $\alpha = P/N$, $V^1$ is regarded as the summation of $NP/2$ random variables, and we have

$$\frac{NP}{2} \cdot \frac{2}{N^2} = \frac{P}{N} = \alpha. \tag{26}$$

Let $V^{1R}$ and $V^{1I}$ be the real and imaginary parts of $V^1$, respectively. From the central limit theorem, we have

$$V^{1R}, V^{1I} \sim N(0, \alpha). \tag{27}$$

Putting $V^1 = V^{1R} + iV^{1I}$, we have

$$p\!\left(V^1\right) = \frac{1}{2\pi\alpha} \exp\left( -\frac{|V^1|^2}{2\alpha} \right). \tag{28}$$

We require the same distributions for (12) and (28) and obtain $P_{\mathrm{TMQHNN}} = P_{\mathrm{CHNN}}/2$. Thus, the CHNN has double the storage capacity of the TMQHNN.
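The factor of two can be observed directly. The following sketch (our own construction; the network sizes and trial count are arbitrary) compares a CHNN with 2M complex neurons against a TMQHNN with M quaternionic neurons, both storing P random patterns, and estimates the variance of the real part of the crosstalk in each; the TMQHNN value should be roughly twice the CHNN value.

```python
import numpy as np

rng = np.random.default_rng(2)
M, K, P, trials = 100, 8, 10, 3000   # TMQHNN: M quaternion neurons; CHNN: 2M complex neurons
theta_K = np.pi / K

def states(shape):
    return np.exp(2j * theta_K * rng.integers(0, K, size=shape))

v_chnn, v_tmq = [], []
for _ in range(trials):
    # CHNN with 2M complex neurons, Hebbian rule (5).
    Z = states((P, 2 * M))
    I = (Z.T @ Z.conj() / (2 * M)) @ Z[0]
    v_chnn.append((I * Z[0].conj() - 1.0)[0])

    # TMQHNN with M neurons x = s + tj, Hebbian rule (21) in complex-pair form.
    S, T = states((P, M)), states((P, M))
    W1 = (S.T @ S.conj() + T.T @ T.conj()) / (2 * M)   # w = W1 + W2 j
    W2 = (T.T @ S - S.T @ T) / (2 * M)
    I1 = W1 @ S[0] - W2 @ T[0].conj()                  # first complex part of (22)
    v_tmq.append((I1 * S[0].conj() - 1.0)[0])          # V^1 of neuron 0, equation (24)

print("CHNN   var(V^R) :", np.real(v_chnn).var())   # ~ P/(4M)
print("TMQHNN var(V^1R):", np.real(v_tmq).var())    # ~ P/(2M), about twice as large
```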

Computer simulations were conducted to verify our analysis. $K$ was varied in steps of 2, and $P$ was varied over a range of values. For each pair of $K$ and $P$, 100 sets of training patterns were generated randomly; that is, the number of trials was 100. The CHNN and the TMQHNN attempted to store the training patterns by the Hebbian learning rule. If all the training patterns were fixed points, the trial was regarded as successful; otherwise, it was regarded as failed. Figure 1 shows the simulation results. The horizontal and vertical axes indicate the number of training patterns and the success rate, respectively. The simulation results showed that the storage capacity of the TMQHNN was slightly larger than half that of the CHNN.

Figure 1: Simulation results for the storage capacities.
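For reference, a compact sketch of the CHNN half of such an experiment is given below; the concrete values of N and K and the range of P are our own choices, since the original settings are not fully specified here.

```python
import numpy as np

rng = np.random.default_rng(3)
N, K = 100, 8
theta_K = np.pi / K

def csign(I):
    """Complex-valued multistate activation of equation (2)."""
    k = np.floor((np.angle(I) % (2 * np.pi) + theta_K) / (2 * theta_K)) % K
    return np.exp(2j * k * theta_K)

def trial_succeeds(P):
    """Store P random patterns with the Hebbian rule (5) and report whether
    every pattern is a fixed point of the CHNN."""
    Z = np.exp(2j * theta_K * rng.integers(0, K, size=(P, N)))
    W = Z.T @ Z.conj() / N
    return all(np.allclose(csign(W @ z), z) for z in Z)

for P in range(2, 17, 2):
    rate = np.mean([trial_succeeds(P) for _ in range(100)])
    print(f"P = {P:2d}: success rate = {rate:.2f}")
```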

5. Conclusions

A TMQHNN needs only half the connection weight parameters of a CHNN and is therefore expected to have a smaller storage capacity. However, the storage capacity had not yet been analyzed. In this work, we defined the Hebbian learning rule for TMQHNNs and analyzed their storage capacity. The analysis demonstrated that a TMQHNN has half the storage capacity of a CHNN. In addition, computer simulations were conducted to verify our analysis, and the results confirmed it. In the future, we intend to study the storage capacity using different methods [19, 22, 23].

Data Availability

No data were used to support this study.

Conflicts of Interest

The author declares that there are no conflicts of interest regarding the publication of this article.

References

  1. S. Jankowski, A. Lozowski, and J. M. Zurada, “Complex-valued multistate neural associative memory,” IEEE Transactions on Neural Networks, vol. 7, no. 6, pp. 1491–1496, 1996.
  2. H. Aoki and Y. Kosugi, “An image storage system using complex-valued associative memories,” in Proceedings of International Conference on Pattern Recognition, vol. 2, pp. 626–629, Barcelona, Spain, September 2000.
  3. H. Aoki, “A complex-valued neuron to transform gray level images to phase information,” in Proceedings of the International Conference on Neural Information Processing, vol. 3, pp. 1084–1088, Singapore, November 2002.
  4. G. Tanaka and K. Aihara, “Complex-valued multistate associative memory with nonlinear multilevel functions for gray-level image reconstruction,” IEEE Transactions on Neural Networks, vol. 20, no. 9, pp. 1463–1473, 2009.
  5. M. K. Muezzinoglu, C. Guzelis, and J. M. Zurada, “A new design method for the complex-valued multistate Hopfield associative memory,” IEEE Transactions on Neural Networks, vol. 14, no. 4, pp. 891–899, 2003.
  6. P. Zheng, “Threshold complex-valued neural associative memory,” IEEE Transactions on Neural Networks and Learning Systems, vol. 25, no. 9, pp. 1714–1718, 2014.
  7. T. Isokawa, H. Yamamoto, H. Nishimura, T. Yumoto, N. Kamiura, and N. Matsui, “Complex-valued associative memories with projection and iterative learning rules,” Journal of Artificial Intelligence and Soft Computing Research, vol. 8, no. 3, pp. 237–249, 2018.
  8. Y. Kuroe, S. Tanigawa, and H. Iima, “Models of Hopfield-type Clifford neural networks and their energy functions-hyperbolic and dual valued networks,” in Proceedings of International Conference on Neural Information Processing, pp. 560–569, Doha, Qatar, November 2011.
  9. M. Kobayashi, “Hyperbolic Hopfield neural networks,” IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 2, pp. 335–341, 2013.
  10. M. Kobayashi, “Global hyperbolic Hopfield neural networks,” IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, vol. E99-A, no. 12, pp. 2511–2516, 2016.
  11. M. Kobayashi, “Hyperbolic Hopfield neural networks with directional multistate activation function,” Neurocomputing, vol. 275, pp. 2217–2226, 2018.
  12. T. Isokawa, H. Nishimura, N. Kamiura, and N. Matsui, “Associative memory in quaternionic Hopfield neural network,” International Journal of Neural Systems, vol. 18, no. 2, pp. 135–145, 2008.
  13. T. Minemoto, T. Isokawa, H. Nishimura, and N. Matsui, “Quaternionic multistate Hopfield neural network with extended projection rule,” Artificial Life and Robotics, vol. 21, no. 1, pp. 106–111, 2016.
  14. T. Minemoto, T. Isokawa, H. Nishimura, and N. Matsui, “Pseudo-orthogonalization of memory patterns for complex-valued and quaternionic associative memories,” Journal of Artificial Intelligence and Soft Computing Research, vol. 7, no. 4, pp. 257–264, 2017.
  15. M. Kobayashi, “Hybrid quaternionic Hopfield neural network,” IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, vol. E98-A, no. 7, pp. 1512–1518, 2015.
  16. M. Kobayashi, “Symmetric quaternionic Hopfield neural networks,” Neurocomputing, vol. 240, pp. 110–114, 2017.
  17. M. Kobayashi, “Three-dimensional quaternionic Hopfield neural networks,” IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, vol. E100-A, no. 7, pp. 1575–1577, 2017.
  18. M. Kobayashi, “Quaternionic Hopfield neural networks with twin-multistate activation function,” Neurocomputing, vol. 267, pp. 304–310, 2017.
  19. J. Hertz, A. Krogh, and R. G. Palmer, Introduction to the Theory of Neural Computation, Addison-Wesley, Redwood City, CA, USA, 1991.
  20. M. Kobayashi, “Storage capacity of rotor Hopfield neural networks,” Neurocomputing, vol. 316, pp. 30–33, 2018.
  21. M. Kitahara and M. Kobayashi, “Projection rule for rotor Hopfield neural networks,” IEEE Transactions on Neural Networks and Learning Systems, vol. 25, no. 7, pp. 1298–1307, 2014.
  22. J. Cook, “The mean-field theory of a Q-state neural network model,” Journal of Physics A: Mathematical and General, vol. 22, no. 12, pp. 2057–2067, 1989.
  23. Y. Nakamura, K. Torii, and T. Munakata, “Neural-network model composed of multidimensional spin neurons,” Physical Review E, vol. 51, no. 2, pp. 1538–1546, 1995.