ABSTRACT
The traffic matrix (TM) records the traffic volume between origin-destination pairs in a network, so accurate TM prediction is of great significance for network management. Attention mechanisms have been successful in many sub-domains of machine learning, such as computer vision and natural language processing, and they perform particularly well on time-series data. In this work, we introduce attention mechanisms into the traffic matrix prediction field by proposing an attention-based deep learning model. The model consists of two parts: an encoder and a decoder. The encoder is a recurrent neural network (RNN), and the decoder comprises an attention layer followed by a linear layer. The attention mechanism gives the model better memory, allowing it to concentrate on the important data regardless of how far back it lies in the sequence. We also reduce the model's running time through GPU-based parallel acceleration. Finally, we evaluate the model on a real-world TM dataset; the results show that it outperforms the baseline models.
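The encoder-decoder structure described above can be sketched in a few lines. This is a minimal illustration in plain NumPy with untrained toy weights: a tanh RNN encodes a history of flattened traffic matrices, and a dot-product attention step weights every encoder state by its relevance to a query before a linear layer emits the next TM. The abstract does not specify the RNN cell, attention variant, or layer sizes, so all of those choices here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_encode(x, W_x, W_h, b):
    """Plain tanh RNN over a TM history; returns all hidden states.
    x: (T, d_in) array, one flattened traffic matrix per time step."""
    h = np.zeros(W_h.shape[0])
    states = []
    for t in range(x.shape[0]):
        h = np.tanh(x[t] @ W_x + h @ W_h + b)
        states.append(h)
    return np.stack(states)                      # (T, d_h)

def attend(states, query):
    """Dot-product attention: softmax over scores lets the decoder
    focus on relevant steps regardless of their distance in time."""
    scores = states @ query                      # (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                     # softmax normalization
    return weights @ states, weights             # context (d_h,), weights (T,)

# Toy dimensions (assumed): a 4x4 TM flattened to 16 flows, hidden size 8.
d_in, d_h, T = 16, 8, 10
W_x = rng.normal(scale=0.1, size=(d_in, d_h))
W_h = rng.normal(scale=0.1, size=(d_h, d_h))
b = np.zeros(d_h)
W_out = rng.normal(scale=0.1, size=(d_h, d_in))  # decoder's linear layer

x = rng.random((T, d_in))                        # toy TM history
states = rnn_encode(x, W_x, W_h, b)
context, weights = attend(states, states[-1])    # last state as the query
tm_next = context @ W_out                        # predicted next TM (flattened)
```

In a trained model the weights would be learned end-to-end, and the per-step matrix products are exactly the operations that GPU parallelism accelerates.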