
HJB-Equation-Based Optimal Learning Scheme for Neural Networks With Applications in Brain–Computer Interface


Abstract:

This paper proposes a novel method for training neural networks (NNs). It uses an approach from optimal control theory, namely, the Hamilton–Jacobi–Bellman (HJB) equation, which optimizes system performance along the trajectory. This formulation leads to a closed-form solution for an optimal weight update rule, which is combined with the per-parameter adaptive scheme AdaGrad to further enhance its performance. To evaluate the proposed method, NNs are trained and tested on two EEG classification problems: mental imagery classification (multiclass) and eye state recognition (binary). In addition, a novel dataset, named EEG Eye State, is presented for benchmarking learning methods. A convergence proof for the proposed approach is also included, and performance is validated on many small- to large-scale synthetic datasets (UCI, LIBSVM datasets). The performance of NNs trained with the proposed scheme is compared with other state-of-the-art approaches. Evaluation results substantiate the improvements brought about by the proposed scheme in terms of faster convergence and better accuracy.
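The closed-form HJB weight update itself is not reproduced in this abstract, so the snippet below is only a minimal NumPy sketch of the surrounding idea: a supplied per-parameter update direction (a hypothetical `update_direction` argument standing in for the HJB-derived rule; with a plain gradient it reduces to standard AdaGrad) is scaled by an AdaGrad-style accumulator before being applied to the weights.

```python
import numpy as np

def adagrad_combined_step(weights, update_direction, accum, lr=0.01, eps=1e-8):
    """One AdaGrad-style step driven by a supplied per-parameter direction.

    `update_direction` is a stand-in for the paper's HJB-derived closed-form
    update (not given in the abstract); `accum` is the running sum of squared
    directions used for per-parameter learning-rate adaptation.
    """
    accum += update_direction ** 2                # per-parameter accumulator
    adapted_lr = lr / (np.sqrt(accum) + eps)      # per-parameter step size
    weights -= adapted_lr * update_direction      # apply the adapted update
    return weights, accum

# Usage sketch with an arbitrary direction vector:
w = np.zeros(3)
G = np.zeros(3)
direction = np.array([0.5, -1.0, 0.2])            # placeholder for the HJB direction
w, G = adagrad_combined_step(w, direction, G)
```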
Page(s): 159 - 170
Date of Publication: 15 August 2018
Electronic ISSN: 2471-285X
