Abstract
The Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) is a new recurrent architecture that embeds a Radial Basis Gated Unit (RBGU) within the Long Short-Term Memory (LSTM) network. This unit gives RBGU-RNN two advantages over the existing LSTM. First, because the RBGU is a pure activation unit that performs no weighted operations, unlike a classical neuron unit, it does not propagate (duplicate) error the way the LSTM's weighted units do. Second, because the unit sits at the beginning of the network's processing pipeline, it standardizes the data before they reach the weighted units, which a plain LSTM does not do. This study provides a theoretical and experimental comparison of the LSTM and the RBGU-RNN. Using real-world call data records, specifically a survey of end-user cellular network data traffic, we built a cellular traffic prediction model. We started with an ARIMA model to choose the number of time steps for the RBGU-RNN prediction model, that is, the number of past observations needed to predict the next value in the time series. The results show that RBGU-RNN predicts cellular data traffic accurately and generalizes better than LSTM. The coefficients of determination (R-squared) show that the LSTM model explains \(58.31\%\) of user traffic consumption on the training set, whereas the RBGU-RNN model explains \(96.86\%\). Likewise, on the test set, the LSTM model explains \(61.24\%\) of user traffic consumption and the RBGU-RNN model \(95.20\%\). Analysis of the Mean Squared Error (MSE), Mean Absolute Percentage Error (MAPE), and Maximum Absolute Error (MAXAE) curves over the training iterations further shows that RBGU-RNN achieves a more efficient gradient descent than the standard LSTM.
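To make the described architecture and the reported metrics concrete, the following is a minimal illustrative sketch, not the authors' implementation. It assumes the RBGU is a parameter-free Gaussian radial-basis activation, with centre and width taken from the training data, applied to each input before it reaches the LSTM's weighted gates, in the spirit of the "standardize before the weighted units" claim above. The helper names (`rbgu`, `lstm_step`) and the centre/width choices are hypothetical.

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def rbgu(x, c, sigma, eps=1e-8):
    # Hypothetical RBGU: a parameter-free Gaussian radial-basis activation.
    # It performs no weighted operation; it only squashes each feature into
    # (0, 1], standardizing the data before the weighted LSTM gates.
    return np.exp(-((x - c) ** 2) / (2.0 * sigma ** 2 + eps))

def lstm_step(x, h, cell, W, U, b):
    # One standard LSTM step (Hochreiter & Schmidhuber, 1997): input,
    # forget, and output gates plus the candidate cell update.
    z = W @ x + U @ h + b
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    cell = f * cell + i * np.tanh(g)
    return o * np.tanh(cell), cell

# Toy forward pass over a univariate traffic window of T time steps.
rng = np.random.default_rng(0)
T, d, H = 12, 1, 8                                  # window, input dim, hidden dim
series = rng.gamma(2.0, 1.0, size=(T, d))           # stand-in traffic values
c, sigma = series.mean(axis=0), series.std(axis=0)  # illustrative RBGU centre/width

W = rng.normal(0.0, 0.1, (4 * H, d))                # input weights, all 4 gates stacked
U = rng.normal(0.0, 0.1, (4 * H, H))                # recurrent weights
b = np.zeros(4 * H)

h, cell = np.zeros(H), np.zeros(H)
for t in range(T):
    x_t = rbgu(series[t], c, sigma)                 # RBGU first, weighted units second
    h, cell = lstm_step(x_t, h, cell, W, U, b)

# The evaluation metrics named in the abstract (standard definitions):
def mse(y, p):   return np.mean((y - p) ** 2)
def mape(y, p):  return np.mean(np.abs((y - p) / y)) * 100.0
def maxae(y, p): return np.max(np.abs(y - p))
def r2(y, p):    return 1.0 - np.sum((y - p) ** 2) / np.sum((y - np.mean(y)) ** 2)
```

Under this reading, the RBGU adds no trainable parameters, so backpropagation through it only rescales gradients rather than accumulating additional weighted error terms, which is consistent with the abstract's error-propagation argument.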
Data availability
The datasets used and analyzed during the current study are available from the corresponding author on reasonable request.
Ethics declarations
Conflicts of interest
On behalf of all authors, the corresponding author states that there is no conflict of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Rollin, N.F., Giquel, S., Chantal, M.A. et al. Radial Basis Gated Unit-Recurrent Neural Network (RBGU-RNN) Algorithm. SN COMPUT. SCI. 5, 68 (2024). https://doi.org/10.1007/s42979-023-02376-x
Received:
Accepted:
Published:
DOI: https://doi.org/10.1007/s42979-023-02376-x