ABSTRACT
To address scalability issues, which are a challenge for Deep Learning methods, we propose Wide Learning, a model that scales horizontally rather than vertically, enabling distributed learning. The approach first trains a repertoire of architecturally diverse, low-complexity neural networks in parallel. Each network in the repertoire extracts a set of features from the dataset; these are then aggregated in a second, short training phase by a centralised model that solves the classification task. The repertoire is generated using a quality-diversity evolutionary algorithm (MAP-Elites), which returns a set of neural networks that are diverse with respect to feature descriptors partially describing their architecture and optimised with respect to their ability to solve the task. The technique is shown to perform well on two benchmark classification problems that have been tackled in the literature with Deep Learning techniques. Additional experiments provide insight into the role that diversity plays in the performance of the repertoire. We propose that evolving neural networks by promoting architectural diversity could in future lead to better results in some domains where current approaches have fallen short.
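To make the two-phase pipeline concrete, the sketch below illustrates the idea; it is not the authors' implementation. It assumes hypothetical choices throughout: a toy synthetic dataset, two architectural descriptors (hidden-layer depth and width) that double as the genome MAP-Elites mutates, fitness equal to training accuracy, and PyTorch as the framework. Phase 1 fills an archive with at most one elite network per descriptor cell; phase 2 freezes the elites, concatenates their hidden-layer features, and fits a single linear head on the aggregated representation.

```python
# Minimal sketch of Wide Learning with MAP-Elites (illustrative only).
# Assumptions, not from the paper: descriptors = (depth, width), fitness =
# training accuracy on toy data, tiny evolution and training budgets.
import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)
rng = np.random.default_rng(0)

# Toy data standing in for a benchmark classification dataset.
X = torch.randn(512, 20)
y = (X[:, :2].sum(dim=1) > 0).long()

WIDTHS = [8, 16, 32]

def make_net(depth, width, n_in=20, n_out=2):
    """Low-complexity MLP: a feature-extracting body plus a task head."""
    layers, d = [], n_in
    for _ in range(depth):
        layers += [nn.Linear(d, width), nn.ReLU()]
        d = width
    return nn.Sequential(*layers), nn.Linear(d, n_out)

def fitness(body, head, epochs=30):
    """Train one candidate briefly and score it (fitness = training accuracy)."""
    opt = torch.optim.Adam(list(body.parameters()) + list(head.parameters()), lr=1e-2)
    for _ in range(epochs):
        opt.zero_grad()
        nn.functional.cross_entropy(head(body(X)), y).backward()
        opt.step()
    with torch.no_grad():
        return (head(body(X)).argmax(1) == y).float().mean().item()

def mutate(desc):
    """Perturb one architectural descriptor by a single step."""
    depth, width = desc
    if rng.random() < 0.5:
        depth = int(np.clip(depth + rng.choice([-1, 1]), 1, 3))
    else:
        i = int(np.clip(WIDTHS.index(width) + rng.choice([-1, 1]), 0, len(WIDTHS) - 1))
        width = WIDTHS[i]
    return depth, width

# Phase 1: MAP-Elites. The archive maps each descriptor cell to its best
# (fitness, body) pair; candidates come from random initialisation first,
# then from mutating descriptors of existing elites.
archive = {}
for it in range(30):
    if it < 10 or not archive:
        desc = (int(rng.integers(1, 4)), int(rng.choice(WIDTHS)))
    else:
        parent = list(archive)[int(rng.integers(len(archive)))]
        desc = mutate(parent)
    body, head = make_net(*desc)
    f = fitness(body, head)
    if desc not in archive or f > archive[desc][0]:
        archive[desc] = (f, body)

# Phase 2: freeze the repertoire, aggregate each elite's features, and fit
# a single linear head on the concatenated representation.
with torch.no_grad():
    feats = torch.cat([body(X) for _, body in archive.values()], dim=1)
head = nn.Linear(feats.shape[1], 2)
opt = torch.optim.Adam(head.parameters(), lr=1e-2)
for _ in range(50):
    opt.zero_grad()
    nn.functional.cross_entropy(head(feats), y).backward()
    opt.step()
with torch.no_grad():
    acc = (head(feats).argmax(1) == y).float().mean().item()
print(f"repertoire size: {len(archive)}, aggregated accuracy: {acc:.3f}")
```

In the setting the abstract describes, the repertoire phase is the horizontally scalable part: each candidate network trains independently, so the fitness evaluations inside the MAP-Elites loop can run in parallel across machines, with only the short aggregation phase requiring a centralised model.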