Abstract
Given the joint distribution p(X,Y) of an original variable X and a relevant variable Y, the Information Bottleneck (IB) method extracts an informative representation of X by compressing it into a "bottleneck" variable T while maximally preserving the relevant information about Y. In practice, however, this compression ignores the local geometrical structure hidden in the data space, so the method is ill-suited to non-linearly separable data. To address this problem, we construct an information-theoretic framework that integrates local geometrical structure into the IB method and propose the Locally-Consistent Information Bottleneck (LCIB) method. LCIB uses a k-nearest-neighbor graph to model the local structure and employs mutual information to measure and enforce the local consistency of the data representation. To find the optimal solution of the LCIB objective, we adopt a sequential "draw-and-merge" procedure that guarantees convergence of the proposed objective function. Experimental results on real data sets demonstrate the effectiveness of the proposed approach.
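To make the idea concrete, the following is a minimal Python sketch, not the authors' implementation, of a sequential draw-and-merge loop augmented with a k-NN local-consistency reward. The function names (`knn_graph`, `lcib_draw_and_merge`), the trade-off weight `lam`, the random initialization, and the edge-agreement consistency term are all illustrative assumptions; in the paper the local consistency is itself measured with mutual information, and the exact LCIB objective follows the paper rather than this surrogate.

```python
import numpy as np

def knn_graph(features, k=5):
    """Symmetric k-nearest-neighbor adjacency matrix under Euclidean distance."""
    dist = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    np.fill_diagonal(dist, np.inf)
    adj = np.zeros_like(dist)
    for i in range(len(features)):
        adj[i, np.argsort(dist[i])[:k]] = 1.0
    return np.maximum(adj, adj.T)

def mutual_information(joint):
    """I(A;B) in nats for a 2-D joint distribution that sums to 1."""
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / (pa @ pb)[mask])))

def lcib_draw_and_merge(p_xy, features, n_clusters, k=5, lam=0.5, n_iters=20, seed=0):
    """Draw each item out of its cluster and merge it back into the cluster that
    maximizes I(T;Y) plus a k-NN local-consistency reward (illustrative surrogate)."""
    rng = np.random.default_rng(seed)
    n = p_xy.shape[0]
    adj = knn_graph(features, k)
    labels = rng.integers(0, n_clusters, size=n)

    def objective(lbl):
        # Relevance term: I(T;Y), with p(T,Y) obtained by summing rows of p(X,Y) per cluster.
        p_ty = np.stack([p_xy[lbl == t].sum(axis=0) for t in range(n_clusters)])
        relevance = mutual_information(p_ty)
        # Consistency term (assumed here): fraction of k-NN edges whose endpoints share a cluster.
        same = (lbl[:, None] == lbl[None, :]) * adj
        return relevance + lam * same.sum() / max(adj.sum(), 1.0)

    for _ in range(n_iters):
        changed = False
        for i in rng.permutation(n):
            orig = labels[i]
            best_t, best_val = orig, -np.inf
            for t in range(n_clusters):      # try merging item i into every cluster
                labels[i] = t
                val = objective(labels)
                if val > best_val:
                    best_val, best_t = val, t
            labels[i] = best_t
            changed |= best_t != orig
        if not changed:                      # no item moved: the partition has converged
            break
    return labels
```

A call such as `lcib_draw_and_merge(p_xy, features, n_clusters=10)` returns a hard assignment of each row of p(X,Y) to a bottleneck cluster. The draw-and-merge structure mirrors sequential IB: because each merge step only ever increases the objective and the number of hard partitions is finite, the loop terminates at a local optimum.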
Cite this paper
Lou, Z., Ye, Y., Zhu, Z. (2012). Information Bottleneck with Local Consistency. In: Anthony, P., Ishizuka, M., Lukose, D. (eds.) PRICAI 2012: Trends in Artificial Intelligence. Lecture Notes in Computer Science, vol. 7458. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-32695-0_27