
Neural Networks

Volume 133, January 2021, Pages 123-131

Gradient-based training and pruning of radial basis function networks with an application in materials physics

https://doi.org/10.1016/j.neunet.2020.10.002
Open access under a Creative Commons license

Abstract

Many applications, especially in physics and other sciences, call for easily interpretable and robust machine learning techniques. We propose a fully gradient-based technique for training radial basis function networks, with an efficient and scalable open-source implementation. We derive novel closed-form optimization criteria for pruning the models for continuous as well as binary data, which arise in a challenging real-world materials physics problem. The pruned models are optimized to provide compact and interpretable versions of larger models based on informed assumptions about the data distribution. Visualizations of the pruned models provide insight into the atomic configurations that determine atom-level migration processes in solid matter; these results may inform future research on designing more suitable descriptors for use with machine learning algorithms.
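To illustrate what "fully gradient-based" training of a radial basis function network means, the minimal sketch below trains the centres, widths, and output weights jointly by gradient descent. It is not the authors' open-source implementation: the PyTorch framework, the Gaussian basis, the layer sizes, and the toy regression data are all illustrative assumptions, and the paper's closed-form pruning criteria are omitted.

import torch
import torch.nn as nn

class RBFNetwork(nn.Module):
    """RBF network in which every parameter is trainable by gradient descent."""

    def __init__(self, in_dim: int, n_centres: int, out_dim: int):
        super().__init__()
        # Centres c_k, log-widths log(sigma_k), and the linear output layer
        # are all free parameters, so the whole model is gradient-trainable.
        self.centres = nn.Parameter(torch.randn(n_centres, in_dim))
        self.log_widths = nn.Parameter(torch.zeros(n_centres))
        self.linear = nn.Linear(n_centres, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Gaussian basis: phi_k(x) = exp(-||x - c_k||^2 / (2 * sigma_k^2))
        dist2 = torch.cdist(x, self.centres).pow(2)
        widths = self.log_widths.exp()
        phi = torch.exp(-dist2 / (2.0 * widths.pow(2)))
        return self.linear(phi)

if __name__ == "__main__":
    torch.manual_seed(0)
    x = torch.rand(256, 2)                               # toy inputs (assumption)
    y = torch.sin(x.sum(dim=1, keepdim=True))            # toy regression target
    model = RBFNetwork(in_dim=2, n_centres=20, out_dim=1)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for step in range(500):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()
    print(f"final MSE: {loss.item():.4f}")

Parameterizing the widths through their logarithm keeps them positive during unconstrained gradient updates; this is a common modelling choice, not necessarily the one used in the paper.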

Keywords

Radial basis function networks
Pruning
Interpretability
Materials physics

1 These authors contributed equally.