DOI: 10.1145/3628797.3628866 · research-article

Adaptive Nonlinear Dimensionality Reduction with a Local Metric

Published: 07 December 2023

Abstract

Understanding the structure of multidimensional patterns, especially in unsupervised settings, is of fundamental importance in data mining, pattern recognition, and machine learning. Many algorithms based on the notion of manifold learning have been proposed to analyze the structure of high-dimensional data; they extract the intrinsic characteristics of various kinds of high-dimensional data by performing nonlinear dimensionality reduction. Most of these algorithms rely on the Euclidean metric and a manually chosen neighborhood size, so they cannot recover the intrinsic geometry of the data manifold automatically and accurately on some data sets. In this paper, we propose an adaptive version of ISOMAP that integrates the advantages of local and global manifold learning algorithms. A faster convergence rate is obtained by replacing the Euclidean metric with an arc-length metric, computed as a sum of second-order approximations to the geodesic distance. Our experiments on synthetic data as well as real-world images demonstrate that the proposed algorithm outperforms ISOMAP and handles harder data sets on which existing global methods fail.
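The pipeline the abstract describes follows the classical ISOMAP skeleton: build a neighborhood graph, approximate geodesic distances by shortest paths through that graph, then embed with classical MDS. The sketch below is a minimal classical-ISOMAP implementation for illustration, not the paper's method: where it uses plain Euclidean edge weights, the paper's adaptive variant would substitute arc-length edge weights built from second-order approximations to the geodesic distance. The function name `isomap_embed` and its parameters are illustrative assumptions.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from scipy.spatial.distance import pdist, squareform

def isomap_embed(X, n_neighbors=6, n_components=2):
    """Classical ISOMAP sketch: kNN graph -> geodesic distances -> MDS.

    The adaptive variant in the paper would replace the Euclidean edge
    weights below with a second-order arc-length approximation.
    """
    n = X.shape[0]
    D = squareform(pdist(X))                 # pairwise Euclidean distances

    # Keep only each point's k nearest neighbors as graph edges
    # (np.inf marks a non-edge for scipy's dense csgraph routines).
    G = np.full((n, n), np.inf)
    for i in range(n):
        idx = np.argsort(D[i])[1:n_neighbors + 1]
        G[i, idx] = D[i, idx]
    G = np.minimum(G, G.T)                   # symmetrize the graph

    # Geodesic distances = shortest paths through the neighborhood graph.
    DG = shortest_path(G, method="D", directed=False)

    # Classical MDS on the (squared) geodesic distance matrix.
    H = np.eye(n) - np.ones((n, n)) / n      # double-centering matrix
    B = -0.5 * H @ (DG ** 2) @ H
    w, V = np.linalg.eigh(B)                 # eigenvalues ascending
    order = np.argsort(w)[::-1][:n_components]
    return V[:, order] * np.sqrt(np.maximum(w[order], 0.0))
```

For example, points sampled along a circular arc in 3-D unroll to a nearly straight line under a 1-D embedding, which is exactly the behavior the geodesic (rather than Euclidean) metric is meant to preserve.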


        Published In

        cover image ACM Other conferences
        SOICT '23: Proceedings of the 12th International Symposium on Information and Communication Technology
        December 2023
        1058 pages
        ISBN:9798400708916
        DOI:10.1145/3628797

        Publisher

        Association for Computing Machinery

        New York, NY, United States


        Author Tags

1. ISOMAP
2. manifold learning
3. dimensionality reduction
4. unsupervised learning


        Conference

        SOICT 2023

        Acceptance Rates

        Overall Acceptance Rate 147 of 318 submissions, 46%

