Neurocomputing

Volume 320, 3 December 2018, Pages 150-156

Twin-multistate commutative quaternion Hopfield neural networks

https://doi.org/10.1016/j.neucom.2018.09.023

Abstract

The complex-valued Hopfield neural network (CHNN) can deal with multi-level information, and has often been applied to the storage of image data. The quaternion Hopfield neural network (QHNN) is a multistate model of a Hopfield neural network, and requires half the connection weight parameters of a CHNN. In this study, we propose a commutative quaternion Hopfield neural network (CQHNN) as an analogue of the QHNN. The multiplication of commutative quaternions is commutative and convenient, unlike that of quaternions. We compare the noise tolerance of CQHNNs and QHNNs by computer simulation, and discuss the simulation results from the perspective of rotational invariance and self-loops.

Introduction

Complex-valued Hopfield neural networks (CHNNs) employing multistate activation functions can deal with multi-level information, and have often been applied to the storage of image data [1], [2], [3], [4], [5], [6]. Many extensions of CHNNs have been proposed. Hyperbolic algebra is a 2-dimensional (2-D) number system, like the complex number field. Several models of hyperbolic Hopfield neural networks (HHNNs) have been proposed [7], [8], [9]. HHNNs that employ a directional multistate activation function improve noise tolerance [10]. Dual-number Hopfield neural networks have also been proposed [11]. Quaternion algebra is a 4-D number system that includes the complex number field. Several models of quaternion Hopfield neural networks (QHNNs) have also been proposed [12], [13], [14], [15], [16]. Twin-multistate QHNNs can deal with multi-level information, and require only half the connection weight parameters of CHNNs [17]. The commutative quaternions are another 4-D number system [18]. Unlike quaternion multiplication, commutative quaternion multiplication is commutative. Commutative quaternions include both the complex and hyperbolic number systems, and are helpful for processing non-Euclidean spaces. Arithmetic with commutative quaternions is straightforward. However, it is difficult to define Hopfield neural networks over commutative quaternions, since commutative quaternions, like hyperbolic numbers, include zero divisors. Isokawa et al. attempted to define commutative quaternion Hopfield neural networks (CQHNNs) [19]. Because they used a polar representation, their models were complicated to handle; in fact, no learning algorithms or computer simulations were provided. Here, we define CQHNNs as an analogue of twin-multistate QHNNs.

The projection rule is an excellent learning algorithm: it is a one-shot learning algorithm with large storage capacity [20]. However, it requires fully connected networks, which is a severe problem when memory resources are restricted. We provide the projection rule for CQHNNs; like that for QHNNs, it requires only half the connection-weight parameters of CHNNs. Thus, the storage capacities of QHNNs and CQHNNs are half that of CHNNs. Computer simulations were performed to compare the noise tolerance of QHNNs and CQHNNs. The CQHNNs underperformed the QHNNs; we discuss the reasons from the perspective of rotational invariance and self-loops.
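For reference, the following is a minimal sketch of the projection rule for the complex-valued case. The pattern matrix Z, the helper name projection_weights, and the use of NumPy are illustrative assumptions, not the paper's code.

```python
import numpy as np

def projection_weights(Z):
    """Projection rule (one-shot learning) for a CHNN.

    Z is an m x P complex matrix whose columns are the training patterns.
    W = Z (Z^H Z)^{-1} Z^H is the orthogonal projector onto the span of
    the stored patterns, so W z = z for every training pattern z.
    """
    G = Z.conj().T @ Z                       # P x P Gram matrix (assumed invertible)
    return Z @ np.linalg.inv(G) @ Z.conj().T
```

By construction W is Hermitian, i.e. $w_{ab} = \overline{w_{ba}}$, but its diagonal is generally nonzero; these self-loop terms $w_{aa}$ are what the later discussion of self-loops refers to.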

The rest of this paper is organized as follows. Section 2 describes quaternions and commutative quaternions. CHNNs and QHNNs are defined in Sections 3 and 4, respectively. In Section 5, we propose the CQHNNs. Rotational invariance is described in Section 6, and computer simulations are conducted in Section 7. We discuss the simulation results in Section 8, and Section 9 concludes this paper.

Section snippets

Quaternions and commutative quaternions

We briefly describe quaternions. A quaternion consists of one real part and three imaginary parts. The three imaginary units are denoted as $i$, $j$, and $k$, and satisfy the relations:

  • 1.

    $i^2 = j^2 = k^2 = -1$,

  • 2.

    $ij = -ji = k$,

  • 3.

    $jk = -kj = i$,

  • 4.

    $ki = -ik = j$.

A quaternion is represented as $q = q_0 + q_1 i + q_2 j + q_3 k$, using real numbers $q_0$, $q_1$, $q_2$, and $q_3$. For the quaternion $q = q_0 + q_1 i + q_2 j + q_3 k$, we can uniquely describe $q = \alpha + \beta j$ using complex numbers $\alpha = q_0 + q_1 i$ and $\beta = q_2 + q_3 i$. For two quaternions $q = \alpha + \beta j$ and $q' = \alpha' + \beta' j$, the addition and multiplication are
$$q + q' = (\alpha + \alpha') + (\beta + \beta')j,$$
$$q q' = (\alpha \alpha' - \beta \overline{\beta'}) + (\alpha \beta' + \beta \overline{\alpha'})j.$$
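To make the two algebras concrete, here is a minimal Python sketch of both products in the pair-of-complex form $q = \alpha + \beta j$. The quaternion rule follows from the relations above; the commutative quaternion rule assumes the Segre-type relations $i^2 = -1$, $j^2 = +1$, $ij = ji = k$ (the snippet cuts off before the paper states them), so treat that branch as an assumption.

```python
def quat_mul(q, qp):
    """Quaternion product in the pair-of-complex form q = alpha + beta j.
    From j z = conj(z) j and j^2 = -1 (the relations above):
        (a + b j)(a' + b' j) = (a a' - b conj(b')) + (a b' + b conj(a')) j.
    """
    a, b = q
    ap, bp = qp
    return (a * ap - b * bp.conjugate(), a * bp + b * ap.conjugate())

def cquat_mul(q, qp):
    """Commutative quaternion product, ASSUMING the Segre-type relations
    i^2 = -1, j^2 = +1, ij = ji = k (not shown in the snippet above).
    Then j commutes with complex numbers, so:
        (a + b j)(a' + b' j) = (a a' + b b') + (a b' + b a') j.
    """
    a, b = q
    ap, bp = qp
    return (a * ap + b * bp, a * bp + b * ap)

q1 = (1 + 2j, 3 - 1j)   # alpha = 1 + 2i, beta = 3 - i
q2 = (0 + 1j, 2 + 0j)
print(quat_mul(q1, q2) == quat_mul(q2, q1))    # False: quaternions do not commute
print(cquat_mul(q1, q2) == cquat_mul(q2, q1))  # True: commutative quaternions do
```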

Complex-valued Hopfield neural networks

Our proposed model is defined based on a CHNN [1]. Let $K$ be the resolution factor, an integer greater than 2. We denote $\theta_K = \pi / K$, and define the set of neuron states as $S_C = \{ e^{2 l i \theta_K} \mid l = 0, 1, \dots, K-1 \}$. Let $m$ be the number of neurons. To store $n$-tuple $K$-level information, a CHNN requires $n$ neurons; we have $m = n$. The state of neuron $a$ and the connection weight from neuron $b$ to neuron $a$ are denoted as $z_a$ and $w_{ab}$, respectively. The connection weights must satisfy the stability conditions $w_{ab} = \overline{w_{ba}}$ and $w_{aa} = 0$.
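A minimal sketch of the neuron state set and the multistate activation implied by the argmax form used in Section 5 (the function names are illustrative):

```python
import numpy as np

def chnn_states(K):
    """S_C = { exp(2 l i theta_K) : l = 0,...,K-1 } with theta_K = pi/K,
    i.e. the K-th roots of unity."""
    theta_K = np.pi / K
    return np.exp(2j * theta_K * np.arange(K))

def f_c(I, states):
    """Multistate activation: the state s in S_C maximizing Re(conj(s) * I),
    i.e. the admissible state closest in angle to the weighted-sum input I."""
    return states[np.argmax((np.conj(states) * I).real)]

states = chnn_states(K=4)         # S_C = {1, i, -1, -i}
print(f_c(0.9 + 1.2j, states))    # -> approximately 1j, the closest state
```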

Quaternion Hopfield neural networks

Our proposed model is defined as an analogue of the QHNN. The set of neuron states is defined as $S_Q = \{ \alpha + \beta j \mid \alpha, \beta \in S_C \}$. A twin-multistate quaternionic neuron consists of two complex-valued multistate neurons. Therefore, for $n$-tuple multi-level information, a QHNN requires $n/2$ neurons, and we have $m = n/2$. QHNNs require approximately $n^2/2$ connection weight parameters, whereas CHNNs require approximately $n^2$. The connection weights have to satisfy stability conditions (9) and (10).
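The twin-multistate encoding can be sketched as follows; the packing convention (adjacent components paired into one neuron) is an assumption for illustration.

```python
import numpy as np

def pack_twin(pattern):
    """Split an n-component pattern over S_C into m = n/2 twin-multistate
    neurons; neuron a holds z_a = x_a + y_a j, stored here as the pair (x, y)."""
    assert pattern.size % 2 == 0
    return pattern[0::2], pattern[1::2]

# Parameter count: a CHNN uses ~n^2 complex weights (2 n^2 reals), while a
# QHNN/CQHNN uses ~(n/2)^2 quaternionic weights, i.e. (n^2/4) * 4 = n^2 reals:
# half the parameters, matching the claim above.
```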

Commutative quaternion Hopfield neural networks

We propose CQHNNs as an analogue of QHNNs. The set of neuron states is defined as $S_{CQ} = \{ \alpha + \beta j \mid \alpha, \beta \in S_C \}$. A commutative quaternion neuron also consists of two complex-valued multistate neurons. The connection weights have to satisfy the stability conditions (9) and (10): for $w_{ab} = u_{ab} + v_{ab} j$, we have $u_{ab} = \overline{u_{ba}}$ and $v_{ab} = \overline{v_{ba}}$. For a weighted sum input $I = A + Bj$ $(A, B \in \mathbb{C})$, the activation function $f_{CQ}(I)$ is defined as $f_{CQ}(I) = f_C(A) + f_C(B) j$. The activation function can also be described as $f_{CQ}(I) = \left( \arg\max_{s \in S_C} \mathrm{Re}(\bar{s} A) \right) + \left( \arg\max_{s \in S_C} \mathrm{Re}(\bar{s} B) \right) j$.
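A minimal sketch of the componentwise activation $f_{CQ}$ and the stability check, reusing the complex multistate activation from the CHNN sketch above (function names are illustrative):

```python
import numpy as np

def f_cq(A, B, states):
    """CQHNN activation on a weighted-sum input I = A + B j, applied
    componentwise: f_CQ(I) = f_C(A) + f_C(B) j, returned as the pair.
    Each component is the argmax of Re(conj(s) * input) over s in S_C."""
    fA = states[np.argmax((np.conj(states) * A).real)]
    fB = states[np.argmax((np.conj(states) * B).real)]
    return fA, fB

def satisfies_stability(U, V):
    """Stability conditions for W = U + V j:
    u_ab = conj(u_ba) and v_ab = conj(v_ba), i.e. U and V are Hermitian."""
    return np.allclose(U, U.conj().T) and np.allclose(V, V.conj().T)
```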

Rotational invariance

Rotational invariance is important for explaining noise tolerance, since it generates many pseudomemories. Suppose that the pattern $z = (z_1, z_2, \dots, z_m)^T$ is a fixed point of a CHNN [22]. For $\gamma \in S_C$, $\gamma z$ is also a fixed point of the CHNN, and is referred to as a rotated pattern of $z$. There exist $K$ rotated patterns of $z$. They are pseudomemories, and deteriorate noise tolerance. In QHNNs, for a fixed point $z = x + yj$, $z\gamma = \gamma x + \bar{\gamma} y j$ and $z(\gamma j) = \bar{\gamma} y + \gamma x j$ are fixed points, and there exist $2K$ rotated patterns of $z$ [17]. QHNNs
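For concreteness, the $2K$ rotated patterns of a QHNN fixed point $z = x + yj$ can be enumerated as follows, using the two formulas above (the pair representation is an assumption carried over from the earlier sketches):

```python
import numpy as np

def rotated_patterns(x, y, states):
    """Enumerate the 2K rotated patterns of a fixed point z = x + y j:
        z * gamma     -> gamma * x + conj(gamma) * y j
        z * (gamma j) -> conj(gamma) * y + gamma * x j
    for every gamma in S_C (K choices), giving 2K patterns in total."""
    pats = []
    for gamma in states:
        pats.append((gamma * x, np.conj(gamma) * y))
        pats.append((np.conj(gamma) * y, gamma * x))
    return pats
```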

Computer simulations

Computer simulations were performed to investigate the noise tolerance. Randomly generated training patterns and impulsive noise were first employed for the computer simulations. To add impulsive noise, each neuron state was replaced with a new state at rate $r$; the new state was randomly selected from $S_Q$ or $S_{CQ}$ for the QHNN or CQHNN, respectively. The computer simulation procedure is described below; a sketch of the noise-injection step follows the list.

  • 1.

    A training pattern was selected randomly from P training patterns, and noise was added.

  • 2.

    If the
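As referenced above, here is a minimal sketch of the noise-injection step. The pair representation, the rng argument, and the function name are illustrative assumptions.

```python
import numpy as np

def add_impulsive_noise(x, y, states, r, rng=None):
    """Replace each twin-multistate neuron's state with a fresh random state
    at rate r.  Drawing both complex components uniformly from S_C amounts to
    drawing the new quaternionic state uniformly from S_Q (or S_CQ)."""
    rng = rng or np.random.default_rng()
    x, y = x.copy(), y.copy()
    hit = rng.random(x.size) < r          # which neurons receive impulsive noise
    x[hit] = rng.choice(states, size=hit.sum())
    y[hit] = rng.choice(states, size=hit.sum())
    return x, y
```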

Discussion

Figs. 1 and 4 reveal that the CQHNNs underperformed the QHNNs in the computer simulations. The CQHNNs have low noise tolerance due to rotational invariance, like CHNNs and QHNNs. The computer simulations suggested that local minima existed. In the case of one training pattern, the CHNNs theoretically have no pseudomemories except for the rotated patterns. Computer simulations showed that the QHNNs had local minima. Fig. 6 suggests that the CQHNNs also have local minima, similar to QHNNs. Here,

Conclusion

In this study, we defined CQHNNs as an analogue of twin-multistate QHNNs. Fig. 7 presents a flowchart of the construction of CQHNNs. The current version of CQHNNs underperformed QHNNs in our computer simulations. We discussed the reasons for the low noise tolerance of CQHNNs from the viewpoint of rotational invariance and self-loops. We analyzed the rotational invariance of CQHNNs, and concluded that 2K rotated patterns exist for each training pattern, as in QHNNs. In addition, we

Masaki Kobayashi is a professor at the University of Yamanashi. He received the B.S., M.S., and D.S. degrees in mathematics from Nagoya University in 1989, 1991, and 1996, respectively. He became a research associate and an associate professor at the University of Yamanashi in 1993 and 2006, respectively. He has been a professor since 2014.

References (26)

  • P. Zheng

    Threshold complex-valued neural associative memory

    IEEE Trans. Neural Netw. Learn. Syst.

    (2014)
  • Y. Kuroe et al.

    Models of Hopfield-type Clifford neural networks and their energy functions – hyperbolic and dual valued networks

    Proceedings of the International Conference on Neural Information Processing

    (2011)
  • M. Kobayashi

    Hyperbolic Hopfield neural networks

    IEEE Trans. Neural Netw. Learn. Syst.

    (2013)