
Continuous and Discrete Deep Classifiers for Data Integration

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 9385)

Abstract

Data representation in a lower dimension is needed in applications where information comes from multiple high-dimensional sources. A final compact model has to be interpreted by human experts, and a classifier whose weights are discrete is much more straightforward to interpret. In this contribution, we propose a novel approach, called Deep Kernel Dimensionality Reduction, which is designed to learn layers of new compact data representations simultaneously. We show by experiments on standard and on real large-scale biomedical data sets that the proposed method embeds data in a new compact, meaningful representation and leads to a lower classification error than state-of-the-art methods. We also consider some state-of-the-art deep learners and their corresponding discrete classifiers. Our experiments illustrate that although purely discrete models do not always perform better than real-valued classifiers, the trade-off between model accuracy and interpretability is quite reasonable.
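The accuracy-versus-interpretability trade-off described above can be illustrated with a minimal sketch: train an ordinary real-valued linear classifier, then round its weights onto a small discrete grid and compare the two. This is not the paper's method; it is a hypothetical toy example (synthetic data, plain logistic regression, rounding to a three-level grid {-1, 0, +1} scaled to the weight range) meant only to show why a discrete-weight model is easier to read while typically losing little accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data: 200 samples, 10 features, generated from a
# sparse ternary ground-truth weight vector plus a little label noise.
n, d = 200, 10
w_true = rng.choice([-1.0, 0.0, 1.0], size=d)
X = rng.normal(size=(n, d))
y = (X @ w_true + 0.1 * rng.normal(size=n) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Real-valued baseline: logistic regression fit by gradient descent.
w = np.zeros(d)
for _ in range(500):
    grad = X.T @ (sigmoid(X @ w) - y) / n
    w -= 0.5 * grad

# Discretize: snap each weight to the nearest point in {-1, 0, +1},
# scaled by the largest weight magnitude so the grid spans the range.
scale = np.max(np.abs(w))
w_disc = np.round(w / scale).clip(-1, 1) * scale

def acc(wv):
    return np.mean((sigmoid(X @ wv) > 0.5) == y)

print(f"real-valued accuracy: {acc(w):.3f}")
print(f"discrete accuracy:    {acc(w_disc):.3f}")
```

The discrete model can be read off directly: each feature either votes for the positive class, against it, or is ignored, which is the kind of interpretability the abstract refers to.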



Acknowledgments

The clinical work was supported by the Agence Nationale de la Recherche (ANR MICRO-Obes), KOT-Ceprodi, and the association Fondation Coeur et Arteres. All ethical agreements were obtained. This work is also part of the European Union's Seventh Framework Programme under grant agreement HEALTH-F4-2012-305312 (Metacardis project).

Author information


Corresponding author

Correspondence to Nataliya Sokolovska.


Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Sokolovska, N., Rizkalla, S., Clément, K., Zucker, J.-D. (2015). Continuous and Discrete Deep Classifiers for Data Integration. In: Fromont, E., De Bie, T., van Leeuwen, M. (eds) Advances in Intelligent Data Analysis XIV. IDA 2015. Lecture Notes in Computer Science, vol 9385. Springer, Cham. https://doi.org/10.1007/978-3-319-24465-5_23


  • DOI: https://doi.org/10.1007/978-3-319-24465-5_23

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-24464-8

  • Online ISBN: 978-3-319-24465-5

  • eBook Packages: Computer Science (R0)
