Abstract
In determining the success of a musical artist's song, there is a positive correlation between radio play success and music sales success. Therefore, the ability to forecast the future radio plays of a song can serve as a powerful risk management and product portfolio management tool for record labels and other stakeholders in a song. This research strives to predict the remaining product life cycle of a song on the radio after it has been played for one or two months. The best results were achieved by using a k-d tree to identify the songs most similar to the test songs and a Random Forest model to forecast radio plays. Accuracies of 82.78% and 83.44% were achieved for the two time periods, respectively. This exploratory research produced over 4500 test metrics in the search for the best combination of models and pre-processing techniques. Other algorithms tested were KNN, MLP, and CNN. The features consist only of daily radio plays; no musical features are used.
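The paper does not include its implementation, but the pipeline it describes — index songs by their observed play history with a k-d tree, retrieve the most similar songs, then fit a Random Forest on those neighbours to forecast remaining plays — can be sketched as follows. This is a minimal illustration, assuming SciPy's `cKDTree` and scikit-learn's `RandomForestRegressor`; all data, the neighbour count `k=25`, and the 30-day observation window are synthetic assumptions, not values from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic catalogue: daily radio-play counts for 200 songs over 60 days,
# plus a (synthetic) target: total remaining plays after the observed window.
history = rng.poisson(20, size=(200, 60)).astype(float)
remaining = history[:, 30:].sum(axis=1)

obs_days = 30  # roughly "one month" of observed plays for a new song

# Index the catalogue by each song's first month of plays.
tree = cKDTree(history[:, :obs_days])

# A new song with one month of observed plays.
new_song = rng.poisson(20, size=obs_days).astype(float)

# Retrieve the 25 most similar songs by Euclidean distance.
_, idx = tree.query(new_song, k=25)

# Fit a Random Forest only on the neighbours, then forecast the new song.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(history[idx][:, :obs_days], remaining[idx])
pred = model.predict(new_song.reshape(1, -1))[0]
```

Restricting the forest's training set to the k-d tree neighbours keeps the model local to songs with comparable early trajectories, which is the core idea of the similarity-based forecasting the abstract reports.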
Copyright information
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Grooss, O.F., Holm, C.N., Alphinas, R.A. (2021). Predicting the Product Life Cycle of Songs on the Radio. In: Arai, K. (ed.) Intelligent Computing. Lecture Notes in Networks and Systems, vol. 284. Springer, Cham. https://doi.org/10.1007/978-3-030-80126-7_34
DOI: https://doi.org/10.1007/978-3-030-80126-7_34
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-80125-0
Online ISBN: 978-3-030-80126-7
eBook Packages: Intelligent Technologies and Robotics