
The Hard-Cut EM Algorithm for Mixture of Sparse Gaussian Processes

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 9227)

Abstract

The mixture of Gaussian processes (MGP) is a powerful and rapidly developing machine learning framework. To make its learning more efficient, sparsity constraints have been adopted to form the mixture of sparse Gaussian processes (MSGP). However, existing MGP and MSGP models are rather complicated, and their learning algorithms involve various approximation schemes. In this paper, we refine the MSGP model and develop a hard-cut EM algorithm for MSGP from its original version for MGP. Experiments on both synthetic and real datasets demonstrate that the refined MSGP model and the hard-cut EM algorithm are feasible and can outperform some typical regression algorithms in prediction. Moreover, owing to the sparsity technique, parameter learning of the proposed MSGP model is much more efficient than that of the MGP model.



Acknowledgement

This work was supported by the Natural Science Foundation of China under Grant 61171138. The authors would like to thank Dr. E. Snelson and Dr. Z. Ghahramani for their valuable advice on the FITC model.

Author information

Correspondence to Jinwen Ma.


Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Chen, Z., Ma, J. (2015). The Hard-Cut EM Algorithm for Mixture of Sparse Gaussian Processes. In: Huang, D.-S., Han, K. (eds.) Advanced Intelligent Computing Theories and Applications. ICIC 2015. Lecture Notes in Computer Science, vol. 9227. Springer, Cham. https://doi.org/10.1007/978-3-319-22053-6_2

  • DOI: https://doi.org/10.1007/978-3-319-22053-6_2

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-22052-9

  • Online ISBN: 978-3-319-22053-6

  • eBook Packages: Computer Science; Computer Science (R0)
