
Sigmoid and Beyond: Algebraic Activation Functions for Artificial Neural Networks Based on Solutions of a Riccati Equation



Abstract:

Activation functions play a key role in neural networks, as they significantly affect the training process and the network's performance. Based on the solution of a certain ordinary differential equation of the Riccati type, this work proposes an adaptive, generalized alternative to the fixed sigmoid, called the "generalized Riccati activation" (GRA). The proposed GRA function was employed in the output layer of an artificial neural network with a single hidden layer of eight neurons. The network's performance was evaluated on a binary and a multiclass classification problem using different combinations of activation functions in the input/output layers. The results demonstrate that the swish/GRA combination yields higher accuracy than any other combination tested. This gain in accuracy can be critical in domains such as healthcare and smart grids, where AI-assisted decisions are becoming essential.
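
The paper's exact GRA formula is not reproduced on this page, but the Riccati connection behind it is easy to illustrate: the logistic sigmoid solves the Riccati-type ODE y' = y(1 - y). The Python sketch below verifies that property numerically and wires a hypothetical GRA-style algebraic activation (a rescaled softsign, used purely as a placeholder, not the paper's function) into the eight-neuron, swish-hidden-layer architecture the abstract describes.

    import numpy as np

    def sigmoid(x):
        """Logistic sigmoid; solves the Riccati-type ODE y' = y(1 - y)."""
        return 1.0 / (1.0 + np.exp(-x))

    def swish(x, beta=1.0):
        """Swish activation: x * sigmoid(beta * x)."""
        return x * sigmoid(beta * x)

    def gra_placeholder(x, a=1.0):
        """Hypothetical algebraic sigmoid-shaped curve (a rescaled softsign),
        standing in for the paper's GRA, whose exact form the article defines."""
        return 0.5 * (1.0 + x / (a + np.abs(x)))

    # Numerically check that the sigmoid satisfies y' = y(1 - y),
    # the Riccati-type ODE the abstract alludes to.
    x = np.linspace(-5.0, 5.0, 1001)
    y = sigmoid(x)
    residual = np.gradient(y, x) - y * (1.0 - y)
    print("max Riccati residual:", np.abs(residual).max())  # small finite-difference error

    # Tiny forward pass matching the abstract's architecture: one hidden
    # layer of eight neurons (swish) and a GRA-style output activation.
    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)  # hidden layer: 8 neurons, 4 inputs
    W2, b2 = rng.normal(size=(1, 8)), np.zeros(1)  # output layer: 1 neuron

    def forward(inputs):
        hidden = swish(W1 @ inputs + b1)
        return gra_placeholder(W2 @ hidden + b2)  # output in (0, 1)

    print(forward(rng.normal(size=4)))

The softsign-style placeholder is chosen because it is algebraic (no exponentials), which matches the paper's framing of "algebraic activation functions"; the published GRA should be taken from the article itself.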
Published in: IT Professional (Volume: 24, Issue: 5, Sept.-Oct. 2022)
Page(s): 30 - 36
Date of Publication: 30 November 2022
