
Exemplar Selection Using Collaborative Neighbor Representation

  • Conference paper
Hybrid Artificial Intelligent Systems (HAIS 2015)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 9121)


Abstract

Retrieving the most relevant exemplars from image databases has been a difficult task. Most exemplar selection methods were designed to work with a specific classifier, whereas current research targets schemes that can benefit a wide range of classifiers. Recently, the Sparse Modeling Representative Selection (SMRS) method was proposed for selecting the most relevant instances. SMRS is based on data self-representation: it estimates a coding matrix using a codebook set to the data themselves, with the coefficients estimated under a block-sparsity constraint. In this paper, we propose a coding scheme based on a two-stage Collaborative Neighbor Representation in which the matrix of coefficients is estimated without any explicit sparse coding. For the second stage, we introduce two schemes for sample pruning. Experiments are conducted on summarizing two video movies. We also provide a quantitative performance evaluation via classification on the selected prototypes, using one face dataset, one handwritten digits dataset, and one object dataset. These experiments show that the proposed method can outperform state-of-the-art methods, including SMRS.
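The self-representation idea behind such coding schemes can be illustrated with a minimal sketch. The Python snippet below is an assumption-laden simplification, not the authors' two-stage Collaborative Neighbor Representation or their pruning schemes: it uses a plain l2-regularized (ridge) coding of each sample over the remaining samples, with no explicit sparse coding, and ranks samples by the row energy of the resulting coefficient matrix. The names `lam`, `self_representation_scores`, and `select_exemplars` are hypothetical.

```python
# Minimal sketch of l2-regularized self-representation for exemplar ranking.
# NOT the paper's exact algorithm; it only illustrates coding each sample over
# the remaining samples (codebook = the data themselves) without sparse coding.
import numpy as np

def self_representation_scores(X, lam=0.1):
    """X: (d, n) data matrix, one sample per column.
    Returns one relevance score per sample (row norms of the coefficient matrix)."""
    d, n = X.shape
    C = np.zeros((n, n))
    for i in range(n):
        idx = np.delete(np.arange(n), i)   # codebook: all samples except x_i
        D = X[:, idx]                      # (d, n-1) dictionary
        # Closed-form ridge solution: c = (D^T D + lam I)^{-1} D^T x_i
        A = D.T @ D + lam * np.eye(n - 1)
        c = np.linalg.solve(A, D.T @ X[:, i])
        C[idx, i] = c
    # Samples whose coefficient rows carry large energy help reconstruct many others.
    return np.linalg.norm(C, axis=1)

def select_exemplars(X, k, lam=0.1):
    scores = self_representation_scores(X, lam)
    return np.argsort(-scores)[:k]         # indices of the k top-ranked samples

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((20, 60))      # toy data: 60 samples in 20 dimensions
    print(select_exemplars(X, k=5))
```

In this simplified view, the ridge penalty plays the role of the l2-based coding used in Collaborative Neighbor Representation; the actual method refines the codebook over two stages and prunes samples before re-coding.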

This work was supported by the project EHU13/40.


Notes

  1. https://www.youtube.com/watch?v=fHP3sdnz-r4
  2. http://www.youtube.com/watch?v=0-mStina7Xs
  3. http://vision.ucsd.edu/~leekc/ExtYaleDatabase/ExtYaleB.html
  4. http://www.cs.nyu.edu/~roweis/data.html
  5. www.cs.columbia.edu/CAVE/software/softlib/coil-20.php


Author information


Corresponding author

Correspondence to F. Dornaika.


Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Dornaika, F., Aldine, I.K., Cases, B. (2015). Exemplar Selection Using Collaborative Neighbor Representation. In: Onieva, E., Santos, I., Osaba, E., Quintián, H., Corchado, E. (eds) Hybrid Artificial Intelligent Systems. HAIS 2015. Lecture Notes in Computer Science, vol 9121. Springer, Cham. https://doi.org/10.1007/978-3-319-19644-2_37


  • DOI: https://doi.org/10.1007/978-3-319-19644-2_37

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-19643-5

  • Online ISBN: 978-3-319-19644-2

