Abstract
In semi-supervised classification, many methods rely on a graph representation of the data. On top of this graph, various techniques, e.g. random walk models, spectral clustering, Markov chains, and regularization theory, are used to design classification algorithms. However, all of these methods work directly with a graph constructed from the raw data, e.g. a kNN graph. In reality, the data are only noisy observations of hidden variables, so classification results obtained directly from the observations may be biased by noise. Filtering out the noise before applying any classification method can therefore yield better classification. We propose a novel method for filtering noise in high-dimensional data by smoothing the graph. We analyze the method from the perspectives of spectral theory, Markov chains, and regularization, showing that it reduces the high-frequency components of the graph and also admits a regularization interpretation. A graph-volume-based parameter learning method can be applied efficiently to classification. Experiments on artificial and real-world data sets indicate that our method achieves superior classification accuracy.
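The paper's exact smoothing operator is not given in this preview, but the abstract's spectral and Markov-chain views suggest the standard mechanism: on the normalized affinity matrix S = D^{-1/2} W D^{-1/2}, taking powers S^t damps every eigencomponent with |λ| < 1, i.e. the graph's high-frequency modes. A minimal sketch of that idea, assuming a Gaussian (RBF) affinity graph; the function names and the choice of t are illustrative, not the authors':

```python
import numpy as np

def rbf_affinity(X, sigma=1.0):
    # Pairwise Gaussian affinities W_ij = exp(-||x_i - x_j||^2 / 2*sigma^2),
    # with a zeroed diagonal (no self-loops).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return W

def smooth_graph(W, t=3):
    # Spectral smoothing sketch: form the symmetric normalized affinity
    # S = D^{-1/2} W D^{-1/2}, whose eigenvalues lie in [-1, 1], then
    # raise each eigenvalue to the power t. Since |lam|^t -> 0 for
    # |lam| < 1, this attenuates the high-frequency eigencomponents
    # (equivalently, it is a t-step lazy diffusion on the graph).
    d = W.sum(axis=1)
    Dh = np.diag(1.0 / np.sqrt(d))
    S = Dh @ W @ Dh
    lam, U = np.linalg.eigh(S)          # S is symmetric, so eigh applies
    return (U * lam ** t) @ U.T         # equals S^t, computed spectrally

# Usage: smooth a toy affinity graph before feeding it to a classifier.
X = np.random.RandomState(0).randn(20, 2)
W = rbf_affinity(X, sigma=1.0)
S_smooth = smooth_graph(W, t=3)
```

The smoothed matrix S_smooth can then replace the raw affinity graph in any of the graph-based classifiers the abstract lists; larger t filters more aggressively.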
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Zhou, X., Li, C. (2006). Combining Smooth Graphs with Semi-supervised Classification. In: Ng, W.K., Kitsuregawa, M., Li, J., Chang, K. (eds.) Advances in Knowledge Discovery and Data Mining. PAKDD 2006. Lecture Notes in Computer Science, vol. 3918. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11731139_46
DOI: https://doi.org/10.1007/11731139_46
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-33206-0
Online ISBN: 978-3-540-33207-7