High Dimensional Non-linear Modeling with Bayesian Mixture of CCA

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 6443)

Abstract

In high-dimensional regression and classification, feature selection and dimensionality reduction are often needed to cope with the heavy computational cost and the over-fitting of parameters. Canonical Correlation Analysis (CCA) and its hierarchical extensions, including Bayesian variants, have been proposed for this purpose. However, real data sets often violate the linearity assumption of CCA, so a non-linear extension is required. To solve this problem, we propose a Bayesian mixture of CCA and give an efficient inference algorithm based on Gibbs sampling. We show that the proposed method is a scalable, natural extension of CCA and RBF-type neural networks for high-dimensional non-linear problems.
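The abstract names the two ingredients of the model: probabilistic CCA components tied together by a mixture, with inference by Gibbs sampling. The NumPy sketch below is not the authors' implementation; it only illustrates the generative structure of such a mixture and a single Gibbs-style resampling of the component assignments under known parameters. All dimensions, variable names, and the shared isotropic noise variance are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not values from the paper).
K, d_z = 3, 2            # mixture components, shared latent dimension
d_x, d_y, N = 10, 5, 500 # view dimensions and sample size

# Per-component parameters of a probabilistic CCA: loadings, means,
# isotropic noise variance, and mixing proportions.
Wx = rng.normal(size=(K, d_x, d_z))
Wy = rng.normal(size=(K, d_y, d_z))
mux = rng.normal(scale=3.0, size=(K, d_x))
muy = rng.normal(scale=3.0, size=(K, d_y))
sigma2 = 0.1
pi = np.full(K, 1.0 / K)

# Generative process: pick a component, draw a shared latent z, then
# emit both views through that component's loadings plus noise.
c = rng.choice(K, size=N, p=pi)
Z = rng.normal(size=(N, d_z))
X = (np.einsum('nij,nj->ni', Wx[c], Z) + mux[c]
     + rng.normal(scale=np.sqrt(sigma2), size=(N, d_x)))
Y = (np.einsum('nij,nj->ni', Wy[c], Z) + muy[c]
     + rng.normal(scale=np.sqrt(sigma2), size=(N, d_y)))

def log_gauss(V, mean, cov):
    """Log density of the rows of V under N(mean, cov)."""
    d = V - mean
    L = np.linalg.cholesky(cov)
    sol = np.linalg.solve(L, d.T)          # whitened residuals
    quad = np.sum(sol ** 2, axis=0)
    logdet = 2.0 * np.sum(np.log(np.diag(L)))
    return -0.5 * (quad + logdet + V.shape[1] * np.log(2.0 * np.pi))

# One Gibbs-style sweep over assignments: with the shared latent z
# integrated out, each component yields a Gaussian marginal over the
# concatenated views, and assignments are drawn from the posterior.
V = np.hstack([X, Y])
logp = np.empty((N, K))
for k in range(K):
    Wk = np.vstack([Wx[k], Wy[k]])                  # stacked loadings
    cov = Wk @ Wk.T + sigma2 * np.eye(d_x + d_y)    # marginal covariance
    mean = np.concatenate([mux[k], muy[k]])
    logp[:, k] = np.log(pi[k]) + log_gauss(V, mean, cov)

resp = np.exp(logp - logp.max(axis=1, keepdims=True))
resp /= resp.sum(axis=1, keepdims=True)             # responsibilities
c_new = np.array([rng.choice(K, p=r) for r in resp])
print("agreement with generating components:", np.mean(c_new == c))
```

A full sampler would also resample the latent variables, loadings, means, noise variances, and mixing proportions; the paper's actual Gibbs algorithm and priors are not reproduced here.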



Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Hosino, T. (2010). High Dimensional Non-linear Modeling with Bayesian Mixture of CCA. In: Wong, K.W., Mendis, B.S.U., Bouzerdoum, A. (eds) Neural Information Processing. Theory and Algorithms. ICONIP 2010. Lecture Notes in Computer Science, vol 6443. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-17537-4_55

  • DOI: https://doi.org/10.1007/978-3-642-17537-4_55

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-17536-7

  • Online ISBN: 978-3-642-17537-4
