Distributed Boosting: An Enhancing Method on Dataset Distillation
Recommendations
- Tab-Distillation: Impacts of Dataset Distillation on Tabular Data For Outlier Detection
  ICAIF '24: Proceedings of the 5th ACM International Conference on AI in Finance. Dataset distillation aims to replace large training sets with significantly smaller synthetic sets while preserving essential information. This method reduces the training costs of advanced deep learning models and is widely used in the image domain. ...
- Importance-aware adaptive dataset distillation
  Herein, we propose a novel dataset distillation method for constructing small informative datasets that preserve the information of the large original datasets. The development of deep learning models is enabled by the availability of large-scale ...
- UDD: Dataset Distillation via Mining Underutilized Regions
  Pattern Recognition and Computer Vision. Dataset distillation synthesizes a small dataset such that a model trained on this set approximates the performance of the original dataset. Recent studies on dataset distillation focused primarily on the design of the optimization process, with ...
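The recommended papers above share the same core setup: dataset distillation learns a small synthetic set whose training signal approximates that of the full dataset. As a rough illustration of that idea, the sketch below implements a simple gradient-matching variant in PyTorch. The `distill` function, its parameters, and the class-balanced label construction are illustrative assumptions, not the method of this paper or of the works listed above.

```python
import torch
import torch.nn.functional as F

def distill(real_loader, model_fn, n_syn=100, dim=784, n_classes=10,
            outer_steps=200, lr_syn=0.1):
    """Learn a tiny synthetic set by matching parameter gradients.

    Hypothetical sketch: published methods differ in the matching
    objective, inner-loop training, and data augmentation.
    """
    # The synthetic images are the learnable "distilled" dataset;
    # labels are fixed and class-balanced.
    x_syn = torch.randn(n_syn, dim, requires_grad=True)
    y_syn = torch.arange(n_classes).repeat(n_syn // n_classes)
    opt = torch.optim.SGD([x_syn], lr=lr_syn)

    for _ in range(outer_steps):
        model = model_fn()  # fresh randomly initialized network each step
        x_real, y_real = next(iter(real_loader))
        x_real = x_real.view(x_real.size(0), -1)

        # Parameter gradients of the loss on a real batch (constants here).
        g_real = torch.autograd.grad(
            F.cross_entropy(model(x_real), y_real), model.parameters())
        # Parameter gradients on the synthetic set; keep the graph so the
        # matching loss can backpropagate into x_syn.
        g_syn = torch.autograd.grad(
            F.cross_entropy(model(x_syn), y_syn),
            model.parameters(), create_graph=True)

        # Move the synthetic images so the two gradient sets agree.
        loss = sum(F.mse_loss(a, b) for a, b in zip(g_syn, g_real))
        opt.zero_grad()
        loss.backward()
        opt.step()

    return x_syn.detach(), y_syn
```

A model trained from scratch on the returned `(x_syn, y_syn)` would then stand in for the full dataset; how distilled sets might be combined or boosted across workers is the subject of the paper itself.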
Information
Publisher
Association for Computing Machinery, New York, NY, United States
Qualifiers
- Short-paper
Article Metrics
- Total Citations: 0
- Total Downloads: 75
- Downloads (Last 12 months): 75
- Downloads (Last 6 weeks): 9