Twin-multistate commutative quaternion Hopfield neural networks
Introduction
Complex-valued Hopfield neural networks (CHNNs) employing multistate activation functions can deal with multi-level information, and have often been applied to the storage of image data [1], [2], [3], [4], [5], [6]. Many extensions of CHNNs have been proposed. Hyperbolic algebra is a 2-dimensional (2-D) number system, like the complex number field, and several models of hyperbolic Hopfield neural networks (HHNNs) have been proposed [7], [8], [9]. HHNNs employing a directional multistate activation function improve noise tolerance [10]. Dual-numbered Hopfield neural networks have also been proposed [11]. Quaternion algebra is a 4-D number system that includes the complex number field, and several models of quaternion Hopfield neural networks (QHNNs) have been proposed [12], [13], [14], [15], [16]. Twin-multistate QHNNs can deal with multi-level information and require only half the connection-weight parameters of CHNNs [17]. The commutative quaternion is another 4-D number system [18]. Unlike that of commutative quaternions, the multiplication of ordinary quaternions is not commutative. Commutative quaternions include both the complex and hyperbolic number systems, are helpful for processing non-Euclidean spaces, and are easy to calculate with. However, it is difficult to define Hopfield neural networks using commutative quaternions, since commutative quaternions include zero divisors, like hyperbolic numbers. Isokawa et al. attempted to define commutative quaternion Hopfield neural networks (CQHNNs) [19], but because they used the polar representation, their models were complicated to handle; in fact, neither learning algorithms nor computer simulations were provided. Here, we define CQHNNs as an analogy of twin-multistate QHNNs.
The projection rule is an excellent learning algorithm: it is a one-shot learning algorithm with large storage capacity [20]. However, it requires fully connected networks, which is a severe problem under restricted memory resources. The projection rule for CQHNNs has been provided; like that for QHNNs, it requires only half the connection-weight parameters of CHNNs. Thus, the storage capacities of QHNNs and CQHNNs are half that of CHNNs. Computer simulations were performed to compare the noise tolerance of QHNNs and CQHNNs, and the CQHNNs underperformed the QHNNs. We discuss the reasons from the perspectives of rotational invariance and self-loops.
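The projection rule mentioned above can be sketched in a few lines for a complex-valued network. This is a minimal illustration under our own assumptions (patterns stored as columns of X, invertible Gram matrix), not the paper's exact formulation:

```python
import numpy as np

def projection_weights(X):
    """Projection rule: W = X (X^H X)^{-1} X^H.

    X is an m x P complex matrix whose columns are the stored patterns.
    Every stored pattern is an eigenvector of W with eigenvalue 1, so it
    is stable under the network dynamics before thresholding.
    """
    G = X.conj().T @ X                        # P x P Gram matrix
    return X @ np.linalg.solve(G, X.conj().T)
```

Note that W generally has a nonzero diagonal, i.e., the projection rule introduces self-loops; the Discussion section returns to this point.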
The rest of this paper is organized as follows. Section 2 describes quaternions and commutative quaternions. CHNNs and QHNNs are defined in Sections 3 and 4, respectively. In Section 5, we propose the CQHNNs. Rotational invariance is described in Section 6, and computer simulations are conducted in Section 7. We discuss the simulation results in Section 8, and Section 9 concludes this paper.
Quaternions and commutative quaternions
We briefly describe quaternions. A quaternion consists of one real part and three imaginary parts. The three imaginary units are denoted as i, j and k, and satisfy the relations:
1. $i^2 = j^2 = k^2 = -1$,
2. $ij = -ji = k$,
3. $jk = -kj = i$,
4. $ki = -ik = j$.

A quaternion is represented as $q = q_0 + q_1 i + q_2 j + q_3 k$ using real numbers $q_0$, $q_1$, $q_2$, and $q_3$. For the quaternion $q$, we can uniquely describe $q = z_1 + z_2 j$ using complex numbers $z_1 = q_0 + q_1 i$ and $z_2 = q_2 + q_3 i$. For two quaternions $p = p_1 + p_2 j$ and $q = q_1 + q_2 j$, the addition and multiplication are
$$p + q = (p_1 + q_1) + (p_2 + q_2) j,$$
$$pq = (p_1 q_1 - p_2 \bar{q}_2) + (p_1 q_2 + p_2 \bar{q}_1) j.$$
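The complex-pair (Cayley–Dickson) multiplication can be sketched in code; the helper name is ours:

```python
import numpy as np

def q_mul(p, q):
    """Hamilton product of quaternions given as complex pairs.

    p = (p1, p2) represents p1 + p2*j, where the Python complex
    unit 1j plays the role of the quaternion unit i.
    """
    p1, p2 = p
    q1, q2 = q
    # (p1 + p2 j)(q1 + q2 j) = (p1 q1 - p2 conj(q2)) + (p1 q2 + p2 conj(q1)) j
    return (p1 * q1 - p2 * np.conj(q2), p1 * q2 + p2 * np.conj(q1))
```

With i = (1j, 0), j = (0, 1), and k = (0, 1j), one can check ij = k while ji = -k, exhibiting the noncommutativity.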
Complex-valued Hopfield neural networks
Our proposed model is defined based on a CHNN [1]. Let K, the resolution factor, be an integer greater than 2. We denote $\theta_K = 2\pi / K$ and define the set of neuron states as $S = \{ e^{i \theta_K l} \mid l = 0, 1, \dots, K-1 \}$. Let m be the number of neurons. To store n-tuple K-level information, a CHNN requires n neurons; we have $m = n$. The state of neuron a and the connection weight from neuron b to neuron a are denoted as $z_a$ and $w_{ab}$, respectively. The connection weights must satisfy the stability conditions:
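Concretely, the neuron states are the K-th roots of unity, and a standard multistate activation quantizes a weighted-sum input to the nearest state. This is a sketch of the usual construction, not necessarily the paper's exact definition:

```python
import numpy as np

K = 4                                        # resolution factor (example value)
S = np.exp(2j * np.pi * np.arange(K) / K)    # neuron states: K-th roots of unity

def f_c(I):
    """Multistate activation: the state s maximizing Re(conj(s) * I)."""
    return S[np.argmax((np.conj(S) * I).real)]
```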
Quaternion Hopfield neural networks
Our proposed model is defined as an analogy of a QHNN. The set of neuron states is defined as $S_Q = \{ z_1 + z_2 j \mid z_1, z_2 \in S \}$. A twin-multistate quaternionic neuron consists of two complex-valued multistate neurons. Therefore, for n-tuple multi-level information, a QHNN requires n/2 neurons, and we have $m = n/2$. QHNNs require approximately $n^2/2$ connection-weight parameters, whereas CHNNs require approximately $n^2$. The connection weights have to satisfy stability conditions (9) and (10).
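The twin-multistate idea can be sketched as follows: a quaternionic neuron holds a pair of complex multistate values, and the activation applies the complex multistate activation to each component of the weighted sum independently (function names are ours; an illustrative sketch of the construction in [17]):

```python
import numpy as np

K = 4
S = np.exp(2j * np.pi * np.arange(K) / K)    # complex multistate alphabet

def f_c(I):
    """Complex multistate activation."""
    return S[np.argmax((np.conj(S) * I).real)]

def f_q(I):
    """Twin-multistate activation on I = I1 + I2 j, given as a complex pair."""
    I1, I2 = I
    return (f_c(I1), f_c(I2))
```

Since each of the n/2 neurons carries two K-level values, an n-tuple of K-level data is stored, which is the source of the halved parameter count.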
Commutative quaternion Hopfield neural networks
We propose CQHNNs as an analogy of QHNNs. The set of neuron states is defined as $S_{CQ} = \{ z_1 + z_2 j \mid z_1, z_2 \in S \}$. A commutative quaternion neuron also consists of two complex-valued multistate neurons. The connection weights have to satisfy the stability conditions (9) and (10). For a commutative quaternion $q = q_0 + q_1 i + q_2 j + q_3 k$, we have $q = z_1 + z_2 j$ with $z_1 = q_0 + q_1 i$ and $z_2 = q_2 + q_3 i$. For a weighted-sum input I, the activation function $f_{CQ}(I)$ is defined by applying the complex-valued multistate activation to the two complex components of I.
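For contrast with the Hamilton product, the commutative quaternion product on complex pairs involves no conjugation. This sketch assumes the Segre-type algebra ($j^2 = +1$, $ij = ji = k$); the function name is ours:

```python
def cq_mul(p, q):
    """Commutative quaternion product on complex pairs (z1, z2) ~ z1 + z2*j,
    assuming j**2 = +1 and ij = ji = k (Segre-type algebra)."""
    p1, p2 = p
    q1, q2 = q
    # (p1 + p2 j)(q1 + q2 j) = (p1 q1 + p2 q2) + (p1 q2 + p2 q1) j
    return (p1 * q1 + p2 * q2, p1 * q2 + p2 * q1)
```

The product is symmetric in p and q, hence commutative; the idempotents (1 ± j)/2 multiply to zero, illustrating the zero divisors that make Hopfield dynamics over this algebra delicate.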
Rotational invariance
Rotational invariance is important for explaining noise tolerance, since it generates many pseudomemories. Suppose that a pattern z is a fixed point of a CHNN [22]. For γ ∈ S, γz is also a fixed point of the CHNN, and γz is referred to as a rotated pattern of z. There exist K rotated patterns of z. They are pseudomemories and deteriorate the noise tolerance. In QHNNs, for a fixed point z, the rotated patterns are also fixed points, and there exist 2K rotated patterns of z [17].
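The rotated-pattern argument can be checked numerically for a CHNN storing one pattern. This sketch uses our own illustrative choices (Hebbian weights, K = 4, n = 8, fixed seed), not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)
K, n = 4, 8
S = np.exp(2j * np.pi * np.arange(K) / K)

def f_c(I):
    """Complex multistate activation."""
    return S[np.argmax((np.conj(S) * I).real)]

z = rng.choice(S, n)                  # stored pattern
W = np.outer(z, np.conj(z)) / n       # Hebbian weights w_ab = z_a conj(z_b) / n
np.fill_diagonal(W, 0)                # no self-loops

def is_fixed(x):
    """True if x is unchanged by one synchronous update."""
    I = W @ x
    return all(np.isclose(f_c(I[a]), x[a]) for a in range(n))
```

Because $f_c(\gamma I) = \gamma f_c(I)$ for γ ∈ S, every rotation γz of the fixed point z is again a fixed point, and each is a pseudomemory.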
Computer simulations
Computer simulations were performed to investigate the noise tolerance, first using randomly generated training patterns with impulsive noise. To add impulsive noise, each neuron state was replaced with a new state at rate r; the new state was randomly selected from SQ or SCQ for the QHNN or CQHNN, respectively. The simulation procedure is described below.
- 1. A training pattern was selected randomly from the P training patterns, and noise was added.
- 2. If the
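The noise-injection step above can be sketched as follows (function name and rng handling are ours):

```python
import numpy as np

def add_impulsive_noise(pattern, r, states, rng):
    """Replace each neuron state, independently with probability r,
    by a state drawn uniformly from `states`."""
    out = pattern.copy()
    mask = rng.random(pattern.shape) < r
    out[mask] = rng.choice(states, mask.sum())
    return out
```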
Discussion
Figs. 1 and 4 reveal that the CQHNNs underperformed the QHNNs in the computer simulations. The CQHNNs have low noise tolerance due to rotational invariance, like the CHNNs and QHNNs. The computer simulations also suggested that local minima existed. In the case of one training pattern, the CHNNs theoretically have no pseudomemories other than the rotated patterns, whereas computer simulations showed that the QHNNs had local minima. Fig. 6 suggests that the CQHNNs also have local minima, similar to the QHNNs.
Conclusion
In this study, we defined CQHNNs as an analogy of twin-multistate QHNNs. Fig. 7 presents a flowchart of the construction of CQHNNs. The current version of CQHNNs underperformed QHNNs in our computer simulations. We discussed the reasons for the low noise tolerance of CQHNNs from the viewpoints of rotational invariance and self-loops. We analyzed the rotational invariance of CQHNNs, and concluded that 2K rotated patterns exist for each training pattern, as in QHNNs.
Masaki Kobayashi is a professor at University of Yamanashi. He received the B.S., M.S., and D.S. degrees in mathematics from Nagoya University in 1989, 1991, and 1996, respectively. He became a research associate and an associate professor at the University of Yamanashi in 1993 and 2006, respectively. He has been a professor since 2014.
References (26)
- Hyperbolic Hopfield neural networks with directional multistate activation function, Neurocomputing (2018)
- Fixed points of split quaternionic Hopfield neural networks, Signal Process. (2017)
- Symmetric quaternionic Hopfield neural networks, Neurocomputing (2017)
- Quaternionic Hopfield neural networks with twin-multistate activation function, Neurocomputing (2017)
- Gradient descent learning for quaternionic Hopfield neural network, Neurocomputing (2017)
- Complex-valued multistate neural associative memory, IEEE Trans. Neural Netw. (1996)
- An image storage system using complex-valued associative memories, Proceedings of the International Conference on Pattern Recognition (2000)
- A complex-valued neuron to transform gray level images to phase information, Proceedings of the International Conference on Neural Information Processing (2002)
- Complex-valued multistate associative memory with nonlinear multilevel functions for gray-level image reconstruction, IEEE Trans. Neural Netw. (2009)
- A new design method for the complex-valued multistate Hopfield associative memory, IEEE Trans. Neural Netw. (2003)
- Threshold complex-valued neural associative memory, IEEE Trans. Neural Netw. Learn. Syst.
- Models of Hopfield-type Clifford neural networks and their energy functions – hyperbolic and dual valued networks, Proceedings of the International Conference on Neural Information Processing
- Hyperbolic Hopfield neural networks, IEEE Trans. Neural Netw. Learn. Syst.