Abstract:
This paper proposes a novel method for training neural networks (NNs). It uses an approach from optimal control theory, namely, the Hamilton–Jacobi–Bellman (HJB) equation, which optimizes system performance along the trajectory. This formulation leads to a closed-form solution for an optimal weight update rule, which is combined with the per-parameter adaptive scheme AdaGrad to further enhance its performance. To evaluate the proposed method, NNs are trained and tested on two EEG classification problems, namely, mental imagery classification (multiclass) and eye state recognition (binary). In addition, a novel dataset, named EEG eye state, is presented for benchmarking learning methods. A convergence proof for the proposed approach is also included, and performance is validated on several small- to large-scale synthetic and benchmark datasets (UCI, LIBSVM). The performance of NNs trained with the proposed scheme is compared with other state-of-the-art approaches. Evaluation results substantiate the improvements brought by the proposed scheme in terms of faster convergence and higher accuracy.
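The abstract states that the HJB-derived update rule is combined with AdaGrad's per-parameter adaptive scheme. The closed-form HJB update itself is specific to the paper and is not reproduced here; the following is only a minimal sketch of the standard AdaGrad rule (Duchi et al.), which scales each parameter's step by the accumulated squared gradients. The function name and toy objective are illustrative assumptions, not from the paper.

```python
import numpy as np

def adagrad_update(w, g, G, lr=0.01, eps=1e-8):
    """One standard AdaGrad step: per-parameter learning rates
    derived from the running sum of squared gradients.
    w: parameters, g: current gradient, G: accumulator (same shape as w).
    """
    G = G + g ** 2                          # accumulate squared gradients
    w = w - lr * g / (np.sqrt(G) + eps)     # per-parameter scaled step
    return w, G

# Toy usage (illustrative only): minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([1.0, -2.0])
G = np.zeros_like(w)
for _ in range(100):
    w, G = adagrad_update(w, 2.0 * w, G)
```

In the paper's scheme, `g` would be replaced by the gradient direction produced by the HJB-based closed-form rule rather than the plain loss gradient.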
Published in: IEEE Transactions on Emerging Topics in Computational Intelligence ( Volume: 4, Issue: 2, April 2020)