Abstract
This article focuses on the use of data mining and machine learning in AI-assisted education to improve the prediction accuracy of students' academic achievement. Many well-established gradient boosting machines, such as LightGBM and XGBoost, already perform strongly on small datasets. Building on them, we present and evaluate a new method for regression prediction. Our Stacking Network combines traditional ensemble models with the layered architecture of deep neural networks. Unlike the original Stacking method, the Stacking Network can stack an arbitrary number of layers, so its accuracy substantially exceeds that of traditional Stacking. At the same time, compared with a deep neural network, the Stacking Network inherits the advantages of gradient boosting machines. Applying this approach, we achieved higher accuracy and faster training than a conventional deep neural network, and obtained the highest rank on the Middle School Grade Dataset provided by Shanghai Telecom Corporation.
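The layered-stacking idea in the abstract can be illustrated with a minimal sketch: each layer trains one base learner per feature column and replaces the features with out-of-fold predictions, so layers can be stacked like network layers. The code below is a hypothetical illustration, not the authors' implementation; the univariate least-squares base learners stand in for the gradient boosting machines (LightGBM, XGBoost) the paper actually uses.

```python
def fit_univariate(xs, ys):
    """Least-squares slope/intercept for one feature (stand-in base learner)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) or 1e-12
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def oof_predictions(col, y, k=3):
    """K-fold out-of-fold predictions for one feature column, as in stacking:
    each sample is predicted by a model that never saw it during training."""
    n = len(y)
    preds = [0.0] * n
    for fold in range(k):
        test = list(range(fold, n, k))
        train = [i for i in range(n) if i not in test]
        slope, intercept = fit_univariate([col[i] for i in train],
                                          [y[i] for i in train])
        for i in test:
            preds[i] = intercept + slope * col[i]
    return preds

def stacking_layer(X, y, k=3):
    """One stacking layer: each feature column feeds one base learner, and the
    columns are replaced by that learner's out-of-fold predictions. Applying
    this repeatedly gives the multi-layer 'Stacking Network' structure."""
    cols = list(zip(*X))
    new_cols = [oof_predictions(list(c), y, k) for c in cols]
    return [list(row) for row in zip(*new_cols)]
```

Because each layer's outputs are out-of-fold, the next layer trains on predictions untainted by target leakage, which is what lets the depth grow without overfitting in the way naive re-stacking would.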
References
Ke, G., Meng, Q., Finley, T., et al.: LightGBM: a highly efficient gradient boosting decision tree. In: Advances in Neural Information Processing Systems, pp. 3146–3154 (2017)
Chen, T., Guestrin, C.: XGBoost: a scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785–794. ACM (2016)
Lemley, M.A., Shapiro, C.: Patent holdup and royalty stacking. Tex. L. Rev. 85, 1991 (2007)
Breiman, L.: Bagging predictors. Mach. Learn. 24(2), 123–140 (1996)
Fauconnier, G., Turner, M.: The Way We Think: Conceptual Blending and the Mind’s Hidden Complexities. Basic Books, New York (2008)
Rowley, H.A., Baluja, S., Kanade, T.: Neural network-based face detection. IEEE Trans. Pattern Anal. Mach. Intell. 20(1), 23–38 (1998)
Specht, D.F.: A general regression neural network. IEEE Trans. Neural Netw. 2(6), 568–576 (1991)
Krogh, A., Vedelsby, J.: Neural network ensembles, cross validation, and active learning. In: Advances in Neural Information Processing Systems, pp. 231–238 (1995)
Li, J., Chang, H., Yang, J.: Sparse deep stacking network for image classification. In: Twenty-Ninth AAAI Conference on Artificial Intelligence (2015)
Prokhorenkova, L., Gusev, G., Vorobev, A., et al.: CatBoost: unbiased boosting with categorical features. In: Advances in Neural Information Processing Systems, pp. 6638–6648 (2018)
Odom, M.D., Sharda, R.: A neural network model for bankruptcy prediction. In: 1990 IJCNN International Joint Conference on Neural Networks, pp. 163–168. IEEE (1990)
Rose, S.: Mortality risk score prediction in an elderly population using machine learning. Am. J. Epidemiol. 177(5), 443–452 (2013)
Grady, J., Oakley, T., Coulson, S.: Blending and metaphor. Amst. Stud. Theory Hist. Linguist. Sci. Ser. 4, 101–124 (1999)
Freund, Y., Iyer, R., Schapire, R.E., et al.: An efficient boosting algorithm for combining preferences. J. Mach. Learn. Res. 4(Nov), 933–969 (2003)
Schapire, R.E.: A brief introduction to boosting. In: IJCAI, vol. 99, pp. 1401–1406 (1999)
Solomatine, D.P., Shrestha, D.L.: AdaBoost.RT: a boosting algorithm for regression problems. In: 2004 IEEE International Joint Conference on Neural Networks (IEEE Cat. No. 04CH37541), vol. 2, pp. 1163–1168. IEEE (2004)
Kudo, T., Matsumoto, Y.: A boosting algorithm for classification of semi-structured text. In: Proceedings of the Conference on Empirical Methods in Natural Language Processing, pp. 301–308 (2004)
Yosinski, J., Clune, J., Bengio, Y., et al.: How transferable are features in deep neural networks? In: Advances in Neural Information Processing Systems, pp. 3320–3328 (2014)
Esteva, A., Kuprel, B., Novoa, R.A., et al.: Dermatologist-level classification of skin cancer with deep neural networks. Nature 542(7639), 115–118 (2017)
Glorot, X., Bengio, Y.: Understanding the difficulty of training deep feedforward neural networks. In: Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, pp. 249–256 (2010)
Hecht-Nielsen, R.: Theory of the backpropagation neural network. In: Neural Networks for Perception, pp. 65–93. Academic Press (1992)
Maas, A.L., Hannun, A.Y., Ng, A.Y.: Rectifier nonlinearities improve neural network acoustic models. In: Proceedings of ICML, vol. 30, no. 1, p. 3 (2013)
Psaltis, D., Sideris, A., Yamamura, A.A.: A multilayered neural network controller. IEEE Control Syst. Mag. 8(2), 17–21 (1988)
Kalchbrenner, N., Grefenstette, E., Blunsom, P.: A convolutional neural network for modelling sentences. arXiv preprint arXiv:1404.2188 (2014)
Saposnik, G., Cote, R., Mamdani, M., et al.: JURaSSiC: accuracy of clinician vs risk score prediction of ischemic stroke outcomes. Neurology 81(5), 448–455 (2013)
Holland, P.W., Hoskens, M.: Classical test theory as a first-order item response theory: application to true-score prediction from a possibly nonparallel test. Psychometrika 68(1), 123–149 (2003)
Liu, Y., An, A., Huang, X.: Boosting prediction accuracy on imbalanced datasets with SVM ensembles. In: Ng, W.-K., Kitsuregawa, M., Li, J., Chang, K. (eds.) PAKDD 2006. LNCS (LNAI), vol. 3918, pp. 107–118. Springer, Heidelberg (2006). https://doi.org/10.1007/11731139_15
Chawla, N.V., Lazarevic, A., Hall, L.O., Bowyer, K.W.: SMOTEBoost: improving prediction of the minority class in boosting. In: Lavrač, N., Gamberger, D., Todorovski, L., Blockeel, H. (eds.) PKDD 2003. LNCS (LNAI), vol. 2838, pp. 107–119. Springer, Heidelberg (2003). https://doi.org/10.1007/978-3-540-39804-2_12
Bühlmann, P., Hothorn, T.: Boosting algorithms: regularization, prediction and model fitting. Stat. Sci. 22(4), 477–505 (2007)
Bagnell, J.A., Chestnutt, J., Bradley, D.M., et al.: Boosting structured prediction for imitation learning. In: Advances in Neural Information Processing Systems, pp. 1153–1160 (2007)
Du, X., Sun, S., Hu, C., et al.: DeepPPI: boosting prediction of protein-protein interactions with deep neural networks. J. Chem. Inf. Model. 57(6), 1499–1510 (2017)
Lu, N., Lin, H., Lu, J., et al.: A customer churn prediction model in telecom industry using boosting. IEEE Trans. Industr. Inf. 10(2), 1659–1665 (2014)
Bühlmann, P., Hothorn, T.: Twin boosting: improved feature selection and prediction. Stat. Comput. 20(2), 119–138 (2010)
Friedman, J.H.: Stochastic gradient boosting. Comput. Stat. Data Anal. 38(4), 367–378 (2002)
Copyright information
© 2020 Springer Nature Singapore Pte Ltd.
Cite this paper
Zhang, S., Liu, M., Zhang, J. (2020). An Academic Achievement Prediction Model Enhanced by Stacking Network. In: Zhai, G., Zhou, J., Yang, H., An, P., Yang, X. (eds) Digital TV and Wireless Multimedia Communication. IFTC 2019. Communications in Computer and Information Science, vol 1181. Springer, Singapore. https://doi.org/10.1007/978-981-15-3341-9_20
Publisher Name: Springer, Singapore
Print ISBN: 978-981-15-3340-2
Online ISBN: 978-981-15-3341-9