Auto-Regressive Moving Average Models on Complex-Valued Matrix Lie Groups

Abstract

The present contribution aims at extending the classical scalar autoregressive moving average (ARMA) model to generate random (as well as deterministic) paths on complex-valued matrix Lie groups. The numerical properties of the developed ARMA model are studied by means of a tailored version of the Z-transform on Lie groups and of statistical indicators adapted to Lie groups, such as correlation functions on tangent bundles. The numerical behavior of the devised ARMA model is also illustrated by numerical simulations.
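To fix ideas, the sketch below illustrates one possible way to realize an ARMA-type recursion on a complex-valued matrix Lie group, here the unitary group \(\mathrm{U}(n)\): the recursion is carried out in the Lie algebra of skew-Hermitian matrices and then mapped to the group through the matrix exponential. This is a minimal illustrative sketch under assumed choices (the group \(\mathrm{U}(n)\), the update rule \(X_k = X_{k-1}\exp(V_k)\), the coefficient values, and the function names are assumptions made only for illustration); it is not the construction developed in the paper.

```python
# Illustrative sketch (not the paper's construction): an ARMA(p, q)-type
# recursion carried out in the Lie algebra u(n) of skew-Hermitian matrices,
# mapped to the unitary group U(n) through the matrix exponential.
import numpy as np
from scipy.linalg import expm

def random_skew_hermitian(n, rng, scale=0.1):
    """Draw a random element of u(n), i.e. a skew-Hermitian matrix."""
    A = scale * (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
    return 0.5 * (A - A.conj().T)

def arma_path_on_unitary_group(n, steps, a=(0.5,), b=(1.0, 0.3), seed=0):
    """Generate a path X_0, X_1, ... on U(n) driven by an ARMA recursion in u(n)."""
    rng = np.random.default_rng(seed)
    p, q = len(a), len(b) - 1
    V_hist = [np.zeros((n, n), dtype=complex) for _ in range(p)]      # past outputs V_{k-i}
    W_hist = [np.zeros((n, n), dtype=complex) for _ in range(q + 1)]  # innovations W_{k-j}
    X = np.eye(n, dtype=complex)
    path = [X]
    for _ in range(steps):
        W_hist.insert(0, random_skew_hermitian(n, rng))
        W_hist.pop()
        # ARMA recursion carried out entirely in the (linear) Lie algebra.
        V = sum(ai * Vi for ai, Vi in zip(a, V_hist)) \
          + sum(bj * Wj for bj, Wj in zip(b, W_hist))
        V_hist.insert(0, V)
        V_hist.pop()
        X = X @ expm(V)  # group update: stays on U(n) since V is skew-Hermitian
        path.append(X)
    return path

path = arma_path_on_unitary_group(n=3, steps=50)
# Unitarity check of the last sample: X X^H should equal the identity.
print(np.allclose(path[-1] @ path[-1].conj().T, np.eye(3)))
```

Because each increment \(V_k\) is skew-Hermitian, \(\exp(V_k)\) is unitary and every sample of the generated path remains on the group by construction.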

Notes

  1. The fact that the variable \(z\) is scalar is inherently related to the nature of the ‘temporal variable’ \(k\): since time is a scalar quantity and the operator \(z^{-1}\) may be interpreted as a unit delay or as a time marker, it suffices to take \(z\) as a scalar entity (the scalar unit-delay rule is recalled below).
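
For reference, the scalar-case property underlying this interpretation is the unit-delay rule of the classical one-sided Z-transform; the display below recalls standard textbook material and is not specific to the Lie-group setting of the paper (a zero initial condition \(x_{-1} = 0\) is assumed):

\[
X(z) = \mathcal{Z}\{x_k\}(z) = \sum_{k=0}^{\infty} x_k\, z^{-k},
\qquad
\mathcal{Z}\{x_{k-1}\}(z) = \sum_{k=0}^{\infty} x_{k-1}\, z^{-k} = z^{-1}\, X(z).
\]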

Acknowledgments

The author wishes to gratefully thank the anonymous reviewers and the associate editor who coordinated the review of the present paper, Prof. Robert Newcomb, for their detailed, thoughtful, and constructive comments on the manuscript, which contributed substantially to improving the presentation of its scientific content. The author also wishes to thank Dr. Yacine Chitour (Supélec, Gif-sur-Yvette, France) for fruitful discussions about the extension of the classical notion of system transfer function to the Lie-algebra setting.

Author information

Correspondence to Simone Fiori.

Cite this article

Fiori, S. Auto-Regressive Moving Average Models on Complex-Valued Matrix Lie Groups. Circuits Syst Signal Process 33, 2449–2473 (2014). https://doi.org/10.1007/s00034-014-9745-1
