Abstract:
The predictive performance of a neural network depends on its weights and architecture. Optimizers based on gradient descent are most commonly used to optimize the weights, while grid search is used to find the most suitable architecture from a list of predefined architectures. Neuroevolution, on the other hand, offers a solution for simultaneously growing the neural network architecture and evolving its weights. It is therefore not limited by a user-defined list of possible architectures and can find configurations optimal for a specific task. Both approaches can be effectively parallelized and take advantage of modern multi-process systems. In this research, we compare neuroevolution and backpropagation in terms of the time consumed by the algorithm, the predictive performance of the neural network, and the complexity of the neural network. The total time for each algorithm is measured along with the times for each section of the algorithm and the time spent on synchronization due to the multi-process setting. The neural networks are compared by their predictive performance in terms of the Matthews correlation coefficient score and by their complexity, measured as the number of nodes and connections. The case study is based on two synthetic and two real-world classification datasets.
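As a minimal illustration of the evaluation metric mentioned above (not the paper's implementation), the Matthews correlation coefficient for a binary classification task can be computed directly from the confusion-matrix counts:

```python
import math

def matthews_corrcoef(y_true, y_pred):
    """MCC for binary labels (0/1); returns a value in [-1, 1]."""
    # Confusion-matrix counts
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    # Conventionally defined as 0 when any marginal count is zero
    return (tp * tn - fp * fn) / denom if denom else 0.0

# Example: a perfect classifier scores 1.0, a fully inverted one -1.0
print(matthews_corrcoef([1, 1, 0, 0], [1, 1, 0, 0]))  # → 1.0
print(matthews_corrcoef([1, 1, 0, 0], [0, 0, 1, 1]))  # → -1.0
```

Unlike accuracy, MCC accounts for all four confusion-matrix cells, which makes it robust on class-imbalanced datasets such as those in this case study.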
Published in: 2023 46th MIPRO ICT and Electronics Convention (MIPRO)
Date of Conference: 22-26 May 2023
Date Added to IEEE Xplore: 29 June 2023