Abstract

A twin-multistate quaternion Hopfield neural network (TMQHNN) is a multistate Hopfield model that can store multilevel information, such as image data. Storage capacity is an important problem for Hopfield neural networks. Jankowski et al. approximated the crosstalk terms of complex-valued Hopfield neural networks (CHNNs) by 2-dimensional normal distributions and evaluated their storage capacities. In this work, we evaluate the storage capacities of TMQHNNs following their approach.

1. Introduction

A complex-valued Hopfield neural network (CHNN) is a multistate model of the Hopfield neural network. CHNNs can deal with multilevel information and have been applied to the storage of image data [1–7]. They have also been extended using Clifford algebra, which includes the complex field, hyperbolic algebra, and the quaternion field. Several models of hyperbolic Hopfield neural networks have been proposed [8–11]. Isokawa et al. [12] proposed quaternion Hopfield neural networks (QHNNs) employing the split activation function. Minemoto et al. [13] studied QHNNs using the polar-represented activation function. Several other models of QHNNs have also been proposed [14–17]. In this work, we study the twin-multistate quaternion Hopfield neural networks (TMQHNNs) [18]. The neuron of a TMQHNN consists of a pair of complex-valued multistate neurons. The TMQHNN requires only half the connection weight parameters of the CHNN.

Storage capacity is an important issue in Hopfield neural networks. When a Hopfield neural network is given a training pattern, the weighted sum input is decomposed into main and crosstalk terms. The main term enables the Hopfield neural network to memorize the training patterns. The crosstalk term interferes with the storage of training patterns. The storage capacity of the conventional Hopfield neural network has been investigated by evaluating the crosstalk term [19]. Jankowski et al. [1] and Kobayashi [20] applied this technique to CHNNs and rotor Hopfield neural networks (RHNNs), respectively. The RHNN is an extension of the CHNN using vectors and matrices [21]. In this work, we provide the Hebbian learning rule for TMQHNNs and evaluate their storage capacity based on Jankowski's concept. In the case of TMQHNNs, the crosstalk term is decomposed into two complex parts. By evaluating both parts, we determine the storage capacity of TMQHNNs. The theory suggests that TMQHNNs have half the storage capacities of CHNNs. In addition, we compare the storage capacities of CHNNs and TMQHNNs by computer simulation.

The rest of this paper is organized as follows: Sections 2 and 3 introduce CHNNs and TMQHNNs, respectively. Section 4 provides the Hebbian learning rule for the TMQHNN and evaluates the storage capacity. It also contains descriptions of the computer simulations conducted to verify our analysis. Section 5 concludes this paper.

2. Complex-Valued Hopfield Neural Networks

CHNNs are briefly described here, following [1]. Let $z_a$ and $w_{ab}$ be the state of neuron $a$ and the connection weight from neuron $b$ to neuron $a$, respectively. The weighted sum input to neuron $a$ is given by

$$S_a = \sum_{b=1}^{N} w_{ab} z_b, \tag{1}$$

where $N$ is the number of neurons. For the resolution factor $K$, we define $\theta_K = \pi / K$. For the weighted sum input $S = |S| e^{i\theta}$, where $i$ is the imaginary unit and $-\theta_K \le \theta < 2\pi - \theta_K$, the complex-valued multistate activation function is defined by

$$f(S) = e^{2ik\theta_K} \quad \left( (2k-1)\theta_K \le \theta < (2k+1)\theta_K \right), \quad k = 0, 1, \ldots, K-1. \tag{2}$$

We define the set of neuron states and denote it as follows: $\mathcal{S} = \left\{ e^{2ik\theta_K} \mid k = 0, 1, \ldots, K-1 \right\}$. The connection weights must satisfy the following conditions:

$$w_{ab} = \overline{w_{ba}}, \tag{3}$$

$$w_{aa} \ge 0. \tag{4}$$

Then, the CHNN converges to a fixed point.
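For concreteness, the activation function (2) can be written as a short program. The following is a minimal Python sketch; the function name csign, the use of NumPy, and the rounding-based implementation are our choices, not part of the original model:

```python
import numpy as np

def csign(S, K):
    """Complex-valued multistate activation (2): map S to the nearest state
    exp(2i*k*theta_K), where theta_K = pi/K and k = 0, 1, ..., K - 1."""
    theta_K = np.pi / K
    k = np.round(np.angle(S) / (2 * theta_K)) % K  # index of the nearest state
    return np.exp(2j * k * theta_K)
```

Since each state $e^{2ik\theta_K}$ lies at the center of its decision sector in (2), csign leaves every element of $\mathcal{S}$ unchanged.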

Let

$$\mathbf{z}^p = \left( z_1^p, z_2^p, \ldots, z_N^p \right)^T \tag{5}$$

be the $p$th training pattern $(p = 1, 2, \ldots, P)$, where $P$ is the number of training patterns. The Hebbian learning rule is defined as

$$w_{ab} = \frac{1}{N} \sum_{p=1}^{P} z_a^p \overline{z_b^p}. \tag{6}$$

Then, the connection weights satisfy $w_{ab} = \overline{w_{ba}}$. Giving the $q$th training pattern to the CHNN, the weighted sum input to neuron $a$ is

$$S_a = \sum_{b=1}^{N} w_{ab} z_b^q = z_a^q + \frac{1}{N} \sum_{p \ne q} \sum_{b=1}^{N} z_a^p \overline{z_b^p} z_b^q. \tag{7}$$
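As an illustration, the Hebbian rule (6) and a recall step combine as follows. This is our sketch, not code from [1], and it reuses csign from the previous sketch:

```python
import numpy as np

def hebbian(Z):
    """Hebbian rule (6) for a P x N array Z of training patterns over S."""
    P, N = Z.shape
    return (Z.T @ Z.conj()) / N  # w_ab = (1/N) sum_p z_a^p conj(z_b^p)

# The q-th pattern is a fixed point when csign(W @ Z[q], K) equals Z[q].
```

The resulting matrix is Hermitian with $w_{aa} = P/N$, so conditions (3) and (4) hold.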

The second term of (7) is referred to as the crosstalk term. The crosstalk term interferes with the storage of training patterns. We define

$$V_a = \frac{1}{N} \sum_{p \ne q} \sum_{b=1}^{N} \overline{z_a^q} z_a^p \overline{z_b^p} z_b^q. \tag{8}$$

Then, we have

$$S_a = \left( 1 + V_a \right) z_a^q. \tag{9}$$

If $\left| \arg\left(1 + V_a\right) \right| < \theta_K$, then we have $f(S_a) = z_a^q$. Therefore, if $\left| \arg\left(1 + V_a\right) \right| < \theta_K$ holds for all $a$, the $q$th training pattern is a fixed point. For simplicity, we regard $V_a$ as the summation of $(P-1)N$ random variables of the form $V = \frac{1}{N} \overline{z_a^q} z_a^p \overline{z_b^p} z_b^q$, although the summation consists of exactly $(P-1)(N-1)$ random terms. The real and imaginary parts of each random variable have the equal variance $\sigma^2$ and do not have correlations. Since $|V| = 1/N$ and the phase of $V$ is uniformly distributed, we have

$$\sigma^2 = \frac{1}{2N^2}. \tag{10}$$

Setting $M = (P-1)N$, $V_a$ is regarded as the summation of $M$ such random variables.

Let $V_R$ and $V_I$ be the real and imaginary parts of $V_a$, respectively. From the central limit theorem, we have

$$V_R, V_I \sim N\left( 0, M\sigma^2 \right). \tag{11}$$

For $M = (P-1)N$ and $\sigma^2 = 1/(2N^2)$, we have

$$V_R, V_I \sim N\left( 0, \frac{P-1}{2N} \right). \tag{12}$$
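The variance in (12) can be checked numerically. Below is an illustrative Monte Carlo sketch under the same simplifications; all parameter values (K = 8, N = 100, P = 10) are our own choices:

```python
import numpy as np

rng = np.random.default_rng(0)
K, N, P, TRIALS = 8, 100, 10, 2000
states = np.exp(2j * np.pi * np.arange(K) / K)  # the state set S

samples = []
for _ in range(TRIALS):
    Z = rng.choice(states, size=(P, N))  # random training patterns
    W = (Z.T @ Z.conj()) / N             # Hebbian rule (6)
    S0 = W[0] @ Z[0]                     # weighted sum (7) for a = 0, q = 0
    samples.append((np.conj(Z[0, 0]) * S0 - 1).real)  # V_R, cf. (8) and (9)
print(np.var(samples), (P - 1) / (2 * N))  # both are approximately 0.045
```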

3. Twin-Multistate Quaternion Hopfield Neural Networks

A quaternion is expressed as $q = q_0 + q_1 i + q_2 j + q_3 k$ using real numbers $q_0$, $q_1$, $q_2$, and $q_3$. The imaginary units $i$, $j$, and $k$ satisfy the following properties:

$$i^2 = j^2 = k^2 = -1, \quad ij = -ji = k, \quad jk = -kj = i, \quad ki = -ik = j. \tag{13}$$

The quaternions satisfy the associative and distributive laws, though multiplication is noncommutative. For a complex number $c$, we have the important equality

$$cj = j\overline{c}. \tag{14}$$

Putting $c_1 = q_0 + q_1 i$ and $c_2 = q_2 + q_3 i$, the quaternion $q$ is described as $q = c_1 + c_2 j$. For quaternions $q = c_1 + c_2 j$ and $q' = c_1' + c_2' j$, the addition and multiplication are described as

$$q + q' = \left( c_1 + c_1' \right) + \left( c_2 + c_2' \right) j, \tag{15}$$

$$qq' = \left( c_1 c_1' - c_2 \overline{c_2'} \right) + \left( c_1 c_2' + c_2 \overline{c_1'} \right) j. \tag{16}$$

The conjugate of $q$ is defined as

$$\overline{q} = \overline{c_1} - c_2 j. \tag{17}$$

Then, we have the equality

$$\overline{qq'} = \overline{q'}\, \overline{q}. \tag{18}$$
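The pair representation above maps directly onto code. The following sketch is ours, with tuples of Python complex numbers standing for quaternions; it implements (15)–(17) and spot-checks (18):

```python
def qadd(p, q):
    """Addition (15): componentwise on the complex parts."""
    return (p[0] + q[0], p[1] + q[1])

def qmul(p, q):
    """Multiplication (16) of quaternions p = c1 + c2 j and q = d1 + d2 j."""
    (c1, c2), (d1, d2) = p, q
    return (c1 * d1 - c2 * d2.conjugate(), c1 * d2 + c2 * d1.conjugate())

def qconj(q):
    """Conjugate (17): conj(c1 + c2 j) = conj(c1) - c2 j."""
    return (q[0].conjugate(), -q[1])

# Spot check of (18): conj(q q') = conj(q') conj(q).
p, q = (1 + 2j, 3 - 1j), (0.5 - 1j, -2 + 1j)
lhs, rhs = qconj(qmul(p, q)), qmul(qconj(q), qconj(p))
assert all(abs(u - v) < 1e-12 for u, v in zip(lhs, rhs))
```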

In the TMQHNNs, the neuron states and connection weights are represented by quaternions. These neuron states and connection weights are denoted in the same way as those of CHNNs. The number of neurons in a TMQHNN is denoted as $N$. The weighted sum input to neuron $a$ is given by

$$S_a = \sum_{b=1}^{N} w_{ab} z_b. \tag{19}$$

For the weighted sum input $S = S^x + S^y j$, where $S^x$ and $S^y$ are complex numbers, the activation function is defined by applying the complex-valued multistate activation function (2) to each complex part:

$$f(S) = f\left( S^x \right) + f\left( S^y \right) j. \tag{20}$$

Therefore, the set of neuron states is $\left\{ x + yj \mid x, y \in \mathcal{S} \right\}$. The connection weights must satisfy conditions (3) and (4). Then, the TMQHNN converges to a fixed point.
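A sketch of the twin-multistate activation (20), with a quaternion again represented as a pair of complex parts and csign taken from the Section 2 sketch; the name tsign is ours:

```python
def tsign(S, K):
    """Twin-multistate activation (20): apply csign to each complex part."""
    return (csign(S[0], K), csign(S[1], K))
```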

4. Storage Capacity of Twin-Multistate Quaternion Hopfield Neural Networks

We provide the Hebbian learning rule for TMQHNNs. Let $\mathbf{z}^p = \left( z_1^p, z_2^p, \ldots, z_N^p \right)^T$ $(p = 1, 2, \ldots, P)$ be the $p$th training pattern, where $P$ is the number of training patterns. The Hebbian learning rule is given by

$$w_{ab} = \frac{1}{2N} \sum_{p=1}^{P} z_a^p \overline{z_b^p}. \tag{21}$$

The factor $1/(2N)$ reflects $z\overline{z} = |x|^2 + |y|^2 = 2$ for a twin-multistate state $z = x + yj$.

Then, the connection weights satisfy $w_{ab} = \overline{w_{ba}}$. Giving the $q$th training pattern to the TMQHNN, the weighted sum input to neuron $a$ is

$$S_a = \sum_{b=1}^{N} w_{ab} z_b^q = z_a^q + \frac{1}{2N} \sum_{p \ne q} \sum_{b=1}^{N} z_a^p \overline{z_b^p} z_b^q. \tag{22}$$

The second term of (22) is also referred to as the crosstalk term and interferes with the storage of training patterns. To investigate the storage capacity, we decompose each quaternion into a pair of complex numbers by $z_a^p = x_a^p + y_a^p j$. Then, we have

$$S_a = S_a^x + S_a^y j, \tag{23}$$

where

$$S_a^x = x_a^q + \frac{1}{2N} \sum_{p \ne q} \sum_{b=1}^{N} \left[ \left( x_a^p \overline{x_b^p} + y_a^p \overline{y_b^p} \right) x_b^q - \left( y_a^p x_b^p - x_a^p y_b^p \right) \overline{y_b^q} \right], \tag{24}$$

$$S_a^y = y_a^q + \frac{1}{2N} \sum_{p \ne q} \sum_{b=1}^{N} \left[ \left( x_a^p \overline{x_b^p} + y_a^p \overline{y_b^p} \right) y_b^q + \left( y_a^p x_b^p - x_a^p y_b^p \right) \overline{x_b^q} \right]. \tag{25}$$

We define

$$V_a^x = \overline{x_a^q} \left( S_a^x - x_a^q \right), \qquad V_a^y = \overline{y_a^q} \left( S_a^y - y_a^q \right). \tag{26}$$

Then, we have

$$S_a = \left( 1 + V_a^x \right) x_a^q + \left( 1 + V_a^y \right) y_a^q j. \tag{27}$$

If $\left| \arg\left(1 + V_a^x\right) \right| < \theta_K$ and $\left| \arg\left(1 + V_a^y\right) \right| < \theta_K$, then we have $f(S_a) = z_a^q$. We regard $V_a^x$ and $V_a^y$ as summations of random variables in the same manner as $V_a$. Then, $V_a^x$ and $V_a^y$ follow the same distribution. Thus, we can discuss only $V_a^x$. If the TMQHNN is used instead of the CHNN, only $N/2$ neurons are required, since a twin-multistate quaternion neuron consists of two complex-valued multistate neurons. For $N/2$ neurons, each pair $(p, b)$ in (24) contributes four complex random variables of modulus $1/(2 \cdot N/2) = 1/N$, whose real and imaginary parts have the variance $\sigma^2 = 1/(2N^2)$, as in (10). Setting $M' = 4(P-1)(N/2) = 2(P-1)N$, $V_a^x$ is regarded as the summation of $M'$ random variables, and we have

$$M' \sigma^2 = \frac{P-1}{N}. \tag{28}$$

Let $V_R^x$ and $V_I^x$ be the real and imaginary parts of $V_a^x$, respectively. From the central limit theorem, we have

$$V_R^x, V_I^x \sim N\left( 0, M'\sigma^2 \right). \tag{29}$$

Putting (28) into (29), we have

$$V_R^x, V_I^x \sim N\left( 0, \frac{P-1}{N} \right). \tag{30}$$

The variance in (30) is double that in (12). Requiring the same crosstalk distributions for (12) and (30), the TMQHNN can store only half as many training patterns as the CHNN. Thus, the CHNN has double the storage capacity of the TMQHNN.
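Under the same simplifications as the Section 2 sketch, the doubled variance in (30) can be illustrated numerically. Here N = 100 complex neurons are replaced by N/2 = 50 twin-multistate neurons, and the Hebbian rule (21) and the x part (24) are written directly in complex parts; all parameter values are our own choices:

```python
import numpy as np

rng = np.random.default_rng(1)
K, N, P, TRIALS = 8, 100, 10, 2000
states = np.exp(2j * np.pi * np.arange(K) / K)
Nq = N // 2  # N/2 twin-multistate neurons replace N complex neurons

samples = []
for _ in range(TRIALS):
    X = rng.choice(states, size=(P, Nq))  # x parts of the patterns
    Y = rng.choice(states, size=(P, Nq))  # y parts of the patterns
    # Hebbian rule (21) in complex parts: w_ab = W1_ab + W2_ab j, cf. (16)
    W1 = (X.T @ X.conj() + Y.T @ Y.conj()) / (2 * Nq)
    W2 = (Y.T @ X - X.T @ Y) / (2 * Nq)
    Sx = W1[0] @ X[0] - W2[0] @ Y[0].conj()  # x part (24) for a = 0, q = 0
    samples.append((np.conj(X[0, 0]) * Sx - 1).real)  # V_R^x, cf. (26)
print(np.var(samples), (P - 1) / N)  # both are approximately 0.09
```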

Computer simulations were conducted to verify our analysis. The resolution factor K was varied in steps of 2, and the number of training patterns P was varied. For each pair (K, P), 100 sets of training patterns were generated randomly; that is, the number of trials was 100. The CHNN and TMQHNN attempted to store the training patterns by the Hebbian learning rule. If all the training patterns were fixed points, the trial was regarded as successful; otherwise, it was regarded as failed. Figure 1 shows the simulation results. The horizontal and vertical axes indicate the number of training patterns and the success rate, respectively. The simulation results showed that the storage capacity of the TMQHNN was slightly larger than half that of the CHNN.
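A skeleton of such an experiment for the CHNN is given below. This is our reconstruction: the parameter values and ranges are assumptions, not the paper's exact settings, and csign from the Section 2 sketch is inlined for completeness. The TMQHNN case is analogous, with the Hebbian rule (21) and activation (20):

```python
import numpy as np

def csign(S, K):  # complex multistate activation, as in the Section 2 sketch
    return np.exp(2j * np.pi * (np.round(np.angle(S) * K / (2 * np.pi)) % K) / K)

def trial_succeeds(rng, K, N, P):
    """One trial: are all P random patterns fixed points of the Hebbian CHNN?"""
    states = np.exp(2j * np.pi * np.arange(K) / K)
    Z = rng.choice(states, size=(P, N))
    W = (Z.T @ Z.conj()) / N  # Hebbian rule (6)
    return all(np.allclose(csign(W @ z, K), z) for z in Z)

rng = np.random.default_rng(2)
K, N, TRIALS = 8, 100, 100  # assumed parameters
for P in range(1, 11):
    rate = np.mean([trial_succeeds(rng, K, N, P) for _ in range(TRIALS)])
    print(f"P = {P}: success rate = {rate:.2f}")
```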

5. Conclusions

A TMQHNN needs only half the connection weight parameters of a CHNN and is therefore expected to have a smaller storage capacity. However, its storage capacity had not yet been analyzed. In this work, we defined the Hebbian learning rule for TMQHNNs and analyzed their storage capacity. The analysis demonstrated that a TMQHNN has half the storage capacity of a CHNN. In addition, computer simulations were conducted and confirmed our analysis. In future work, we intend to study the storage capacity using different methods [19, 22, 23].

Data Availability

No data were used to support this study.

Conflicts of Interest

The author declares that there are no conflicts of interest regarding the publication of this article.