Abstract:
Deep neural networks can solve much more complex and nonlinear problems than popular but shallow techniques such as ELM, SVR, or SLP. Despite their power, deep neural networks are difficult to apply because effective and successful training is hampered by the vanishing gradient problem. This paper shows that these difficulties can be reduced by using an appropriate network architecture, and it examines the influence of the architecture on training effectiveness and training time. Selected network architectures, such as BMPL, FCC, and MLPL, are described, and experimental results are presented that confirm the significant influence of architecture on the success of network training.
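As an illustration of how architecture can shorten gradient paths, the sketch below implements a forward pass for a cascade-style network, assuming FCC denotes a fully connected cascade in which every neuron receives the original inputs plus the outputs of all preceding neurons. The function name `fcc_forward` and the weight layout are illustrative choices, not the paper's implementation.

```python
import numpy as np

def fcc_forward(x, weights):
    """Forward pass of a fully connected cascade (FCC) sketch.

    Each neuron is fed the raw inputs and the outputs of all
    earlier neurons, so even a deep cascade keeps short paths
    from every neuron back to the inputs (one way to ease the
    vanishing gradient problem). `weights` holds one vector per
    neuron; neuron i expects len(x) + i weights plus a bias as
    the last entry.
    """
    signals = list(x)                        # grows by one output per neuron
    for w in weights:
        z = np.dot(w[:-1], signals) + w[-1]  # bias is the final weight
        signals.append(np.tanh(z))
    return signals[-1]                       # output of the last neuron

# Usage: a 3-neuron cascade over a 2-dimensional input.
rng = np.random.default_rng(0)
x = np.array([0.5, -0.2])
weights = [rng.standard_normal(len(x) + i + 1) for i in range(3)]
y = fcc_forward(x, weights)
```

Note that a plain MLP layer would instead feed each neuron only the previous layer's outputs; the cascade's extra cross-connections are exactly the architectural change at issue.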
Date of Conference: 25-27 June 2019
Date Added to IEEE Xplore: 27 December 2019