Abstract
Factorization Machines (FMs) are a family of effective models for sparse data prediction that consider the interactions among users, items, and auxiliary information. However, the feature representations in most state-of-the-art FMs are fixed, which hurts prediction performance because the same feature may have unequal predictive power across different input instances. In this paper, we propose a novel Feature-adjusted Factorization Machine (FaFM) model that adaptively adjusts the feature vector representations at both the vector level and the bit level. Specifically, we adopt a fully connected layer to adaptively learn the weight of the vector-level feature adjustment, and we design a user-item specific gate that refines each vector at the bit level and filters out noise caused by over-adaptation to the input instance. Extensive experiments on two real-world datasets demonstrate the effectiveness of FaFM. Empirical results indicate that FaFM significantly outperforms the traditional FM, with a 10.89% relative improvement in Root Mean Square Error (RMSE), and consistently exceeds four state-of-the-art deep-learning-based models.
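To make the two-stage adjustment concrete, below is a minimal, self-contained PyTorch sketch of how the components named in the abstract could be wired together: a fully connected layer producing a per-feature weight (vector-level), followed by a user-item specific sigmoid gate masking individual embedding dimensions (bit-level), before the standard FM pairwise interaction. This is our reading of the abstract, not the authors' implementation; all class and parameter names, the softmax normalization of the vector-level weights, and the convention that the first two fields are the user and item are assumptions made for illustration.

```python
import torch
import torch.nn as nn


class FaFMSketch(nn.Module):
    """Illustrative sketch of the FaFM idea from the abstract.

    NOTE: a hypothetical reading of the high-level description, not the
    authors' code; names, dimensions, and the exact gating form are assumed.
    """

    def __init__(self, num_features: int, embed_dim: int = 64):
        super().__init__()
        self.embedding = nn.Embedding(num_features, embed_dim)  # FM feature vectors
        self.linear = nn.Embedding(num_features, 1)             # first-order FM term
        self.bias = nn.Parameter(torch.zeros(1))
        # Vector-level adjustment: a fully connected layer yields one
        # scalar weight per non-zero feature of the input instance.
        self.vec_weight = nn.Linear(embed_dim, 1)
        # Bit-level gate: conditioned on the (assumed) user and item
        # embeddings, outputs a sigmoid mask over embedding dimensions.
        self.gate = nn.Linear(2 * embed_dim, embed_dim)

    def forward(self, feat_ids: torch.Tensor) -> torch.Tensor:
        # feat_ids: (batch, num_fields); by assumption, field 0 is the
        # user and field 1 is the item.
        v = self.embedding(feat_ids)                      # (B, F, D)
        # Vector-level: re-weight each feature vector per instance.
        alpha = torch.softmax(self.vec_weight(v), dim=1)  # (B, F, 1)
        v = alpha * v
        # Bit-level: user-item specific sigmoid gate filters noisy bits.
        ui = torch.cat([v[:, 0], v[:, 1]], dim=-1)        # (B, 2D)
        g = torch.sigmoid(self.gate(ui)).unsqueeze(1)     # (B, 1, D)
        v = g * v
        # Standard FM pairwise interaction: 0.5 * ((sum v)^2 - sum v^2).
        sum_sq = v.sum(dim=1).pow(2)                      # (B, D)
        sq_sum = v.pow(2).sum(dim=1)                      # (B, D)
        pairwise = 0.5 * (sum_sq - sq_sum).sum(dim=1)     # (B,)
        first_order = self.linear(feat_ids).sum(dim=(1, 2))
        return self.bias + first_order + pairwise


# Example usage with hypothetical sizes:
model = FaFMSketch(num_features=10_000)
batch = torch.randint(0, 10_000, (32, 5))  # 32 instances, 5 feature fields each
scores = model(batch)                      # shape: (32,)
```

Gating the already re-weighted vectors (rather than the raw embeddings) matches the abstract's framing of the gate as a refinement that suppresses noise introduced by over-adaptation, though the ordering of the two stages is likewise an assumption here.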
Acknowledgments
This research was partially supported by NSFC (No. 61876117, 61876217, 61872258, 61728205), the Suzhou Science and Technology Development Program (SYG201803), the Open Program of the Key Lab of IIP of CAS (No. IIP2019-1), and PAPD.
Copyright information
© 2020 Springer Nature Switzerland AG
About this paper
Cite this paper
Wu, Y., Zhao, P., Liu, Y., Sheng, V.S., Fang, J., Zhuang, F. (2020). Vector-Level and Bit-Level Feature Adjusted Factorization Machine for Sparse Prediction. In: Nah, Y., Cui, B., Lee, SW., Yu, J.X., Moon, YS., Whang, S.E. (eds) Database Systems for Advanced Applications. DASFAA 2020. Lecture Notes in Computer Science, vol 12112. Springer, Cham. https://doi.org/10.1007/978-3-030-59410-7_27
DOI: https://doi.org/10.1007/978-3-030-59410-7_27
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-59409-1
Online ISBN: 978-3-030-59410-7
eBook Packages: Mathematics and Statistics (R0)