
Multi-task Sparse Gaussian Processes with Improved Multi-task Sparsity Regularization

  • Conference paper
Pattern Recognition (CCPR 2014)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 483)


Abstract

Gaussian processes are a popular and effective Bayesian method for classification and regression. Making Gaussian processes sparse is an active research topic, since exact inference has cubic time complexity in the size of the training set. Inspired by the idea of multi-task learning, we believe that simultaneously selecting subsets for multiple Gaussian processes is more suitable than selecting them separately. In this paper, we propose an improved multi-task sparsity regularizer that effectively regularizes the subset selection of multiple tasks for multi-task sparse Gaussian processes. In particular, building on the multi-task sparsity regularizer proposed in [12], we make two improvements: 1) replacing the subset of points used to measure the global consistency of a point with a rough global structure; 2) normalizing each dimension of every data set before sparsification. We combine the regularizer with two methods to demonstrate its effectiveness, and experimental results on four real data sets show its superiority.
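The two improvements lend themselves to a short illustration. The following is a minimal NumPy sketch, not the authors' implementation: the RBF-based consistency score, the random anchor set standing in for the "rough global structure", and the assumption that all tasks share the same input points are illustrative choices of this sketch.

    import numpy as np

    def normalize_per_dimension(X):
        """Z-score every feature dimension of one data set before
        sparsification (improvement 2 in the abstract)."""
        mu = X.mean(axis=0)
        sigma = X.std(axis=0) + 1e-12   # guard against constant dimensions
        return (X - mu) / sigma

    def rbf_kernel(A, B, lengthscale=1.0):
        """Standard RBF kernel between the rows of A and B."""
        sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-sq_dists / (2.0 * lengthscale ** 2))

    def global_consistency(X, n_anchors=50, seed=0):
        """Score each point against a rough global structure
        (improvement 1): a small random anchor subset stands in for
        the full data set, so scoring costs O(n * n_anchors) rather
        than O(n^2). Higher scores mean more representative points."""
        rng = np.random.default_rng(seed)
        idx = rng.choice(len(X), size=min(n_anchors, len(X)), replace=False)
        return rbf_kernel(X, X[idx]).mean(axis=1)

    def multitask_active_set(task_inputs, m, n_anchors=50, seed=0):
        """Jointly select m active points shared by all tasks by summing
        per-task consistency scores (a simple stand-in for the multi-task
        sparsity regularizer). Assumes every task observes the same
        inputs, which is an assumption of this sketch."""
        scores = np.zeros(len(task_inputs[0]))
        for X in task_inputs:
            scores += global_consistency(normalize_per_dimension(X),
                                         n_anchors=n_anchors, seed=seed)
        return np.argsort(scores)[-m:]   # indices of the shared active set

For example, multitask_active_set([X1, X2], m=100) would return one shared subset for two aligned tasks. Each task's Gaussian process can then be trained on (or use as inducing points) the selected subset, which is where the reduction of the cubic cost in the full training-set size comes from.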



References

  1. Rasmussen, C., Williams, C.: Gaussian Processes for Machine Learning. MIT Press, Cambridge (2006)

  2. Sun, S.: Infinite mixtures of multivariate Gaussian processes. In: Proceedings of the International Conference on Machine Learning and Cybernetics, pp. 1011–1016 (2013)

  3. Bonilla, E., Chai, K.M., Williams, C.K.I.: Multi-task Gaussian process prediction. In: Advances in Neural Information Processing Systems, pp. 1–8 (2008)

  4. Williams, C., Seeger, M.: Using the Nyström method to speed up kernel machines. In: Advances in Neural Information Processing Systems, vol. 13, pp. 682–688 (2001)

  5. Lawrence, N., Seeger, M., Herbrich, R.: Fast sparse Gaussian process methods: The informative vector machine. In: Advances in Neural Information Processing Systems, vol. 15, pp. 609–616 (2002)

  6. Titsias, M.: Variational learning of inducing variables in sparse Gaussian processes. In: Proceedings of the 12th International Workshop on Artificial Intelligence and Statistics, pp. 567–574 (2009)

  7. Dhillon, P., Foster, D., Ungar, L.: Minimum description length penalization for group and multi-task sparse learning. Journal of Machine Learning Research 12, 525–564 (2011)

  8. Jebara, T.: Multitask sparsity via maximum entropy discrimination. Journal of Machine Learning Research 12, 75–110 (2011)

  9. Lawrence, N., Platt, J.: Learning to learn with the informative vector machine. In: Proceedings of the International Conference on Machine Learning, pp. 1–8 (2004)

  10. Wang, Y., Khardon, R.: Sparse Gaussian processes for multi-task learning. In: Flach, P.A., De Bie, T., Cristianini, N. (eds.) ECML PKDD 2012, Part I. LNCS, vol. 7523, pp. 711–727. Springer, Heidelberg (2012)

  11. Bonilla, E., Agakov, F., Williams, C.: Kernel multi-task learning using task-specific features. In: Proceedings of the International Conference on Artificial Intelligence and Statistics, pp. 43–50 (2007)

  12. Zhu, J., Sun, S.: Single-task and multitask Gaussian processes. In: Proceedings of the International Conference on Machine Learning and Cybernetics, pp. 1033–1038 (2013)

  13. Zhu, J., Sun, S.: Sparse Gaussian processes with manifold-preserving graph reduction. Neurocomputing 138, 99–105 (2014)

  14. Sun, S., Hussain, Z., Shawe-Taylor, J.: Manifold-preserving graph reduction for sparse semi-supervised learning. Neurocomputing 124, 13–21 (2014)

  15. Sun, S.: Multitask learning for EEG-based biometrics. In: Proceedings of the 19th International Conference on Pattern Recognition, pp. 1–4 (2008)




Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zhu, J., Sun, S. (2014). Multi-task Sparse Gaussian Processes with Improved Multi-task Sparsity Regularization. In: Li, S., Liu, C., Wang, Y. (eds) Pattern Recognition. CCPR 2014. Communications in Computer and Information Science, vol 483. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-45646-0_6


  • DOI: https://doi.org/10.1007/978-3-662-45646-0_6

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-662-45645-3

  • Online ISBN: 978-3-662-45646-0

  • eBook Packages: Computer Science, Computer Science (R0)
