
Invertible Nonlinear Dimensionality Reduction via Joint Dictionary Learning

  • Conference paper

In: Latent Variable Analysis and Signal Separation (LVA/ICA 2015)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 9237)

Abstract

This paper proposes an invertible nonlinear dimensionality reduction method that jointly learns dictionaries in both the original high-dimensional data space and its low-dimensional representation space. We construct a cost function that preserves the inner products of data representations in the low-dimensional space, and we minimize it with a conjugate gradient algorithm on a smooth manifold. Numerical experiments in image processing show that the proposed method delivers competitive and robust performance in image compression and recovery, even on heavily corrupted data; it can therefore also be viewed as an alternative approach to compressed sensing. Moreover, our approach can outperform compressed sensing in task-driven learning problems such as data visualization.
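The paper's actual cost function and its manifold conjugate-gradient solver are not reproduced on this page, but the core idea named in the abstract, finding a low-dimensional representation whose pairwise inner products match those of the original data, can be illustrated with a toy sketch. Everything below (the single linear map `W`, the synthetic subspace data, the crude accept/reject step-size rule) is an illustrative assumption, not the authors' joint-dictionary algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n points in d dimensions lying on an m-dimensional subspace,
# so an inner-product-preserving m-dimensional representation exists.
d, m, n = 10, 3, 40
Y = rng.standard_normal((d, m)) @ rng.standard_normal((m, n))
G = Y.T @ Y                                   # Gram matrix to preserve

def cost(W):
    """Squared Frobenius mismatch between low-dim and original Gram matrices."""
    Z = W @ Y
    return np.linalg.norm(Z.T @ Z - G) ** 2

# Plain gradient descent with a crude adaptive step size (accept/reject).
W = 0.1 * rng.standard_normal((m, d))
c0 = cost(W)
c, lr = c0, 1e-6
for _ in range(2000):
    Z = W @ Y
    grad = 4 * Z @ (Z.T @ Z - G) @ Y.T        # gradient of cost w.r.t. W
    W_new = W - lr * grad
    c_new = cost(W_new)
    if c_new < c:                             # accept the step, grow step size
        W, c, lr = W_new, c_new, 1.2 * lr
    else:                                     # reject the step, shrink step size
        lr *= 0.5

Z = W @ Y                                     # low-dim codes; Z.T @ Z ≈ G
```

In this sketch the reduction is linear; the paper instead couples two learned dictionaries, which makes the reduction nonlinear yet invertible, and optimizes the resulting cost with conjugate gradients on a smooth matrix manifold rather than plain gradient descent.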


Notes

  1. http://yann.lecun.com/exdb/mnist/.


Author information

Correspondence to Xian Wei.


Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Wei, X., Kleinsteuber, M., Shen, H. (2015). Invertible Nonlinear Dimensionality Reduction via Joint Dictionary Learning. In: Vincent, E., Yeredor, A., Koldovský, Z., Tichavský, P. (eds) Latent Variable Analysis and Signal Separation. LVA/ICA 2015. Lecture Notes in Computer Science, vol 9237. Springer, Cham. https://doi.org/10.1007/978-3-319-22482-4_32


  • DOI: https://doi.org/10.1007/978-3-319-22482-4_32

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-22481-7

  • Online ISBN: 978-3-319-22482-4

  • eBook Packages: Computer Science (R0)
