Abstract:
In this study, the effects of Activation Functions (AF) in Artificial Neural Networks (ANN) on regression and classification performance are compared. In the comparisons, success rates on test data and training duration are evaluated for both problem types. A total of 11 AFs, 10 commonly used in the literature plus the Square function proposed in this study, are compared using 7 different datasets, 2 for regression and 5 for classification. 3 different ANN architectures, considered the most appropriate for each dataset, are employed in the experiments. Across a total of 231 different training procedures, the effects of AFs are examined for the different datasets and architectures. Similarly, the effects of AFs on training time are shown for the different datasets. The experiments show that ReLU is the most successful AF for general purposes. In addition to ReLU, the Square function gives better results on image datasets.
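The abstract names the proposed Square function but does not give its formula. A minimal PyTorch sketch, assuming it is the elementwise square f(x) = x², shows how such a custom activation could be defined and dropped into a small network alongside a ReLU counterpart; the module names and layer sizes below are illustrative, not taken from the paper.

```python
import torch
import torch.nn as nn

class Square(nn.Module):
    """Square activation: f(x) = x ** 2 (assumed form of the paper's Square AF)."""
    def forward(self, x):
        return x * x

# Two small fully connected regressors differing only in their activation,
# mirroring the kind of AF comparison described in the abstract.
square_net = nn.Sequential(nn.Linear(8, 32), Square(), nn.Linear(32, 1))
relu_net = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))

x = torch.randn(4, 8)
print(square_net(x).shape, relu_net(x).shape)  # torch.Size([4, 1]) for both
```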
Date of Conference: 02-05 May 2018
Date Added to IEEE Xplore: 09 July 2018