Neural Networks

Volume 10, Issue 2, March 1997, Pages 335-342
Multilayer Perceptrons to Approximate Quaternion Valued Functions

https://doi.org/10.1016/S0893-6080(96)00048-2

Abstract

In this paper a new type of multilayer feedforward neural network is introduced. Such a structure, called hypercomplex multilayer perceptron (HMLP), is developed in quaternion algebra and allows quaternionic input and output signals to be dealt with, requiring fewer neurons than the real MLP and thus providing a reduced computational complexity. The structure introduced represents a generalization of the multilayer perceptron in the complex space (CMLP) reported in the literature. The fundamental result reported in the paper is a new density theorem which makes HMLPs universal interpolators of quaternion valued continuous functions. Moreover, the proof of the density theorem can be restricted in order to formulate a density theorem in the complex space. Due to the identity between the quaternion set and the four-dimensional real space, such a structure is also useful to approximate multidimensional real valued functions with fewer real parameters, decreasing the probability of being trapped in local minima during the learning phase. A numerical example is also reported in order to show the efficiency of the proposed structure. © 1997 Elsevier Science Ltd. All Rights Reserved.

Section snippets

INTRODUCTION

In the last decade the use of artificial neural networks has received growing interest in a wide variety of scientific fields. A great number of neural structures have been proposed in the literature in order to deal with different kinds of applications. Among these structures, our interest in this paper focuses on multilayer perceptron based structures able to deal with quaternionic input and output signals.

Quaternion algebra was invented in 1843 by the Irish mathematician W.R. Hamilton (

MLPS IN QUATERNION ALGEBRA

Let us define an HMLP (hypercomplex multilayer perceptron) as a multilayer perceptron in which input and output values, weights and biases are quaternions, and the activation functions in the hidden layer neurons are quaternion-valued sigmoidal functions. In the following, only HMLPs with one hidden layer will be considered, since it will be proven that this structure is a universal interpolator in H. However, the learning algorithm has also been developed for a multi-hidden layer structure (
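The quaternion arithmetic underlying an HMLP neuron can be sketched in a few lines. The snippet below implements the Hamilton product (quaternion multiplication, which is non-commutative) and the affine part of a hidden neuron, i.e. a sum of quaternion weight-input products plus a quaternion bias. The function names and the 4-vector representation q = (q0, q1, q2, q3) are illustrative choices, not taken from the paper:

```python
import numpy as np

def hamilton_product(p, q):
    """Hamilton product of two quaternions represented as
    4-vectors (q0, q1, q2, q3) = q0 + i*q1 + j*q2 + k*q3."""
    p0, p1, p2, p3 = p
    q0, q1, q2, q3 = q
    return np.array([
        p0*q0 - p1*q1 - p2*q2 - p3*q3,   # real part
        p0*q1 + p1*q0 + p2*q3 - p3*q2,   # i component
        p0*q2 - p1*q3 + p2*q0 + p3*q1,   # j component
        p0*q3 + p1*q2 - p2*q1 + p3*q0,   # k component
    ])

def neuron_preactivation(weights, inputs, bias):
    """Affine part of an HMLP neuron: the sum of quaternion
    weight-input products plus a quaternion bias. All arguments
    are 4-vectors (weights and inputs are lists of 4-vectors)."""
    s = np.asarray(bias, dtype=float).copy()
    for w, x in zip(weights, inputs):
        s = s + hamilton_product(w, x)
    return s
```

Because the Hamilton product is non-commutative (i·j = k but j·i = −k), the order of the weight and the input in each product matters; a learning algorithm for such a network must keep that order fixed.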

DENSITY THEOREMS FOR HMLPS

In this section a density theorem for continuous functions f: X → H, where X is a compact subset of Hⁿ, will be proven. As for the real MLP (Cybenko, 1989) and the CMLP (Arena et al., 1993), such a result states that HMLPs with the activation function (2) are universal interpolators of continuous quaternion valued functions. In order to prove the theorem the following notation is introduced: let σ: R → R be the real valued sigmoidal function

σ(t) = 1/(1 + exp(−t))

and let q = q0 + iq1 + jq2 + kq3 ∈ H.
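The real sigmoid σ above can be lifted to a quaternion-valued activation by applying it to each of the four components separately (the "split" construction used for CMLP-style networks). Since the definition of activation function (2) is not shown in this snippet, the componentwise form below is an assumption, shown only to make the notation concrete:

```python
import numpy as np

def sigma(t):
    """Real sigmoidal function sigma(t) = 1 / (1 + exp(-t))."""
    return 1.0 / (1.0 + np.exp(-t))

def quaternion_sigmoid(q):
    """Apply sigma to each component of q = q0 + i*q1 + j*q2 + k*q3
    separately ('split' activation; assumed form of activation (2),
    which is not reproduced in this snippet)."""
    return sigma(np.asarray(q, dtype=float))
```

For example, the zero quaternion maps to 0.5 + 0.5i + 0.5j + 0.5k, mirroring the real case where σ(0) = 0.5.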

A NUMERICAL EXAMPLE

In this section a numerical example is reported to demonstrate the efficiency of the HMLP. The considered task is the short term prediction of a time series generated from a chaotic electronic circuit, namely the hyperchaotic Saito circuit (Mitsubori and Saito, 1994). The system is characterized by four state variables; the state equations, numerically simulated to generate the learning patterns, are the following:

$$\begin{bmatrix} \dot{x}_1 \\ \dot{y}_1 \end{bmatrix} = \begin{bmatrix} -1 & 1 \\ -\alpha_1 & \alpha_1\beta_1 \end{bmatrix} \begin{bmatrix} x_1 - \eta\rho_1 h(z) \\ y_1 - \dfrac{\eta\rho_1}{\beta_1} h(z) \end{bmatrix}$$

$$\begin{bmatrix} \dot{x}_2 \\ \dot{y}_2 \end{bmatrix} = \begin{bmatrix} -1 & 1 \\ -\alpha_2 & \alpha_2\beta_2 \end{bmatrix} \begin{bmatrix} x_2 - \eta\rho_2 h(z) \\ y_2 - \dfrac{\eta\rho_2}{\beta_2} h(z) \end{bmatrix}$$

where
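Since the quaternion set is identified with the four-dimensional real space, the four state variables of the circuit can be packed into a single quaternion per time step, so the HMLP sees one quaternionic input and one quaternionic target. The snippet below sketches this data-preparation step for one-step-ahead prediction; the particular component-to-quaternion assignment (x1, y1, x2, y2) → (q0, q1, q2, q3) is an illustrative assumption, not taken from the paper:

```python
import numpy as np

def make_prediction_patterns(states, horizon=1):
    """Turn a simulated trajectory of the four state variables into
    quaternion (input, target) pairs for short-term prediction.

    states  : (N, 4) array; row t holds (x1, y1, x2, y2) at time t,
              read as the quaternion q = x1 + i*y1 + j*x2 + k*y2
              (an illustrative packing, not the paper's).
    horizon : how many steps ahead the target lies.
    """
    states = np.asarray(states, dtype=float)
    inputs = states[:-horizon]    # quaternion at time t
    targets = states[horizon:]    # quaternion at time t + horizon
    return inputs, targets
```

With this packing, a single quaternionic input/output pair replaces four real inputs and four real outputs, which is where the reduction in the number of real parameters comes from.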

CONCLUSIONS

The approximation capabilities of a new type of multilayer feedforward neural network defined in the quaternion algebra (HMLP) are theoretically analyzed in this paper. For this structure a new density theorem is proven in the quaternion set. It is therefore possible to state that HMLPs are able to approximate quaternionic continuous functions with the desired degree of accuracy. Moreover, HMLP approximation capabilities are also useful for real and complex functions: in such cases HMLPs allow

References (37)

  • Arena, P., Fortuna, L., Re, R., & Xibilia, M.G. (1996b). Multilayer perceptrons to approximate complex valued...
  • Benvenuto, N., & Piazza, F. (1992). On the complex backpropagation algorithm. IEEE Transactions On signal Processing 40...
  • Benvenuto, N., Marchesi, M., Piazza, F., & Uncini, A. (1991a). Nonlinear satellite radio links equalized using blind...
  • Benvenuto, N., Marchesi, M., Piazza, F., & Uncini, A. (1991b). A comparison between real and complex valued neural...
  • Chou, J. (1992). Quaternion kinematic and dynamic differential equation. IEEE Transactions on Robotics and Automation.
  • Chou, J., et al. (1991). Finding the position and orientation of a sensor on a robot manipulator using quaternions. International Journal of Robotics Research.
  • Cui, X., & Shin, K. (1991). Intelligent coordination of multiple systems with neural networks. IEEE Transactions on...
  • Cybenko, G. (1989). Approximation by superposition of a sigmoidal function. Mathematics of Control, Signals and Systems.