Graph-based Kullback-Leibler Divergence Minimization for Unsupervised Feature Selection
Abstract
Published In

Publisher
Association for Computing Machinery
New York, NY, United States
Publication History
Author Tags
Qualifiers
- Research-article
- Research
- Refereed limited
Conference
Contributors
Article Metrics
- 0 Total Citations
- 90 Total Downloads
- Downloads (last 12 months): 16
- Downloads (last 6 weeks): 1