Abstract
Traditional neural networks can only process vectorial data, so vectorising high-dimensional structural data discards its spatial information. Matrix neural networks (MatNet) capture structural information only along the first and second modes of matrix data. Although the state-of-the-art multilinear tensor regression (MLTR) captures linear relational information in high dimensions, possible nonlinear relationships within multidimensional data may be ignored. To analyse both linear and nonlinear relationships across every mode of multidimensional relational data, a new model, named tensorial neural networks, is proposed, in which the hidden layers are high-dimensional tensors rather than vectors or matrices. The backpropagation algorithm for tensorial neural networks is derived and provided. The performance of the new approach is assessed on longitudinal network data containing weekly international relations among 25 countries from 2004 to mid-2014, drawn from the World-Wide Integrated Crisis Early Warning System; in other words, the application of the proposed method in this paper is the study of international relations. The dependencies of interest in these data are chiefly reciprocity and transitivity.
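To make the layer construction concrete, the following is a minimal sketch of one plausible tensorial hidden layer, assuming a Tucker-style multilinear map in which the input tensor is multiplied by a separate weight matrix along each mode (the mode-n product described by Kolda and Bader) before an element-wise nonlinearity. The function names, tensor sizes, and the sigmoid activation are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def mode_n_product(tensor, matrix, mode):
    """Mode-n product: multiply `matrix` into `tensor` along axis `mode`."""
    # Move the chosen mode to the front, flatten the remaining modes
    # (the mode-n unfolding), multiply, then fold back and restore axes.
    t = np.moveaxis(tensor, mode, 0)
    shape = t.shape
    unfolded = t.reshape(shape[0], -1)              # I_n x (product of other dims)
    result = matrix @ unfolded                      # J_n x (product of other dims)
    folded = result.reshape((matrix.shape[0],) + shape[1:])
    return np.moveaxis(folded, 0, mode)

def tensorial_layer(x, factors, bias):
    """Hypothetical tensorial hidden layer: H = sigmoid(X x_1 U1 x_2 U2 x_3 U3 + B)."""
    h = x
    for mode, u in enumerate(factors):
        h = mode_n_product(h, u, mode)
    return 1.0 / (1.0 + np.exp(-(h + bias)))        # element-wise sigmoid

# Toy example: a 25 x 25 x 4 relational slice mapped to a 10 x 10 x 2 hidden tensor.
rng = np.random.default_rng(0)
x = rng.standard_normal((25, 25, 4))
factors = [rng.standard_normal((10, 25)),
           rng.standard_normal((10, 25)),
           rng.standard_normal((2, 4))]
bias = np.zeros((10, 10, 2))
hidden = tensorial_layer(x, factors, bias)
print(hidden.shape)                                 # (10, 10, 2)
```

Under this assumed parameterisation, backpropagation reduces to differentiating through each mode-n product, i.e. propagating gradients to every mode-wise weight matrix as well as to the input tensor.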
References
Minhas, S., Hoff, P.D., Ward, M.D.: A new approach to analyzing coevolving longitudinal networks in international relations. J. Peace Res. 53(3), 491–505 (2016)
Wang, B., Hu, Y., Gao, J., Sun, Y., Yin, B.: Laplacian LRR on product grassmann manifolds for human activity clustering in multi-camera video surveillance. IEEE Trans. Circ. Syst. Video Technol. 27(3), 554–566 (2017)
Kolda, T.G., Bader, B.W.: Tensor decompositions and applications. SIAM Rev. 51(3), 455–500 (2009)
Hoff, P.D.: Multilinear tensor regression for longitudinal relational data. Ann. Appl. Stat. 9(3), 1169–1193 (2015)
Gao, J., Guo, Y., Wang, Z.: Matrix neural networks. In: Cong, F., Leung, A., Wei, Q. (eds.) ISNN 2017. LNCS, vol. 10261, pp. 313–320. Springer, Cham (2017). doi:10.1007/978-3-319-59072-1_37
Chien, J.T., Bao, Y.T.: Tensor-factorized neural networks. IEEE Trans. Neural Netw. Learn. Syst. (2017), http://ieeexplore.ieee.org/document/7902201/
Bishop, C.M.: Pattern Recognition and Machine Learning. Information Science and Statistics. Springer, New York (2006)
Vedaldi, A., Lenc, K.: MatConvNet: convolutional neural networks for MATLAB. In: Proceedings of the 23rd ACM International Conference on Multimedia, MM 2015, NY, USA, pp. 689–692. ACM, New York (2015)
Kolda, T.G.: Multilinear Operators for Higher-Order Decompositions. Technical report, Sandia National Laboratories (2006)
Bader, B.W., Kolda, T.G., et al.: MATLAB Tensor Toolbox Version 2.6 (2015), http://www.sandia.gov/tgkolda/TensorToolbox/
Bader, B.W., Kolda, T.G.: Algorithm 862: MATLAB tensor classes for fast algorithm prototyping. ACM TOMS 32(4), 635–653 (2006)
Copyright information
© 2017 Springer International Publishing AG
Cite this paper
Bai, M., Zhang, B., Gao, J. (2017). Tensorial Neural Networks and Its Application in Longitudinal Network Data Analysis. In: Liu, D., Xie, S., Li, Y., Zhao, D., El-Alfy, E.S. (eds) Neural Information Processing. ICONIP 2017. Lecture Notes in Computer Science, vol. 10635. Springer, Cham. https://doi.org/10.1007/978-3-319-70096-0_58
Print ISBN: 978-3-319-70095-3
Online ISBN: 978-3-319-70096-0