Abstract
In this paper, we study the problem of reliably computing neighborhoods on affinity graphs. The k-nearest neighbors (kNN) method is one of the most fundamental and simple methods, widely used in many tasks such as classification and graph construction. Previous research has focused on how to efficiently compute kNN on vectorial data. However, most real-world data have no vectorial representation and are available only as affinity graphs, which may contain unreliable affinities. Since the kNN of an object o is the set of k objects with the highest affinities to o, it is easily disturbed by errors in the pairwise affinities between o and other objects, and it cannot well preserve the structure underlying the data. To reliably analyze neighborhoods on affinity graphs, we define the k-dense neighborhood (kDN), which considers all pairwise affinities within the neighborhood, i.e., not only the affinities between o and its neighbors but also those between the neighbors themselves. For an object o, its kDN is the set kDN(o) of k objects that maximizes the sum of all pairwise affinities among the objects in {o} ∪ kDN(o). We analyze the properties of kDN and propose an efficient algorithm to compute it. Both theoretical analysis and experimental results on shape retrieval, semi-supervised learning, point set matching, and data clustering show that kDN significantly outperforms kNN on affinity graphs, especially when many pairwise affinities are unreliable.
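As a minimal sketch of the kDN objective defined above (not the paper's efficient algorithm, which is described in the full text), the kDN of an object can be found by exhaustive search over k-subsets; the toy affinity matrix `A` and the function names below are hypothetical:

```python
from itertools import combinations

def knn(A, o, k):
    """kNN of object o: the k objects with the highest affinity to o."""
    others = [i for i in range(len(A)) if i != o]
    return sorted(others, key=lambda i: -A[o][i])[:k]

def kdn_bruteforce(A, o, k):
    """kDN of object o: the k-subset S maximizing the sum of ALL pairwise
    affinities among {o} ∪ S (exhaustive search, exponential in k)."""
    others = [i for i in range(len(A)) if i != o]
    def density(S):
        group = (o,) + S
        return sum(A[i][j] for i, j in combinations(group, 2))
    return max(combinations(others, k), key=density)

# Toy symmetric affinity matrix: object 0 has a spuriously high affinity
# to object 1, but objects 2 and 3 form a dense group together with 0.
A = [[0.0, 0.9, 0.6, 0.6],
     [0.9, 0.0, 0.05, 0.05],
     [0.6, 0.05, 0.0, 0.9],
     [0.6, 0.05, 0.9, 0.0]]

print(knn(A, 0, 2))             # [1, 2] -- pulled in by the unreliable edge
print(kdn_bruteforce(A, 0, 2))  # (2, 3) -- the dense neighborhood
```

In this toy example the unreliable affinity A[0][1] drags object 1 into the kNN of object 0, whereas kDN rejects it because object 1 is weakly connected to the rest of the neighborhood, illustrating the robustness claim of the abstract.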
Liu, H., Yang, X., Latecki, L.J. et al. Dense Neighborhoods on Affinity Graph. Int J Comput Vis 98, 65–82 (2012). https://doi.org/10.1007/s11263-011-0496-1