Abstract
Can we dynamically extract information and strong relationships between financial features in order to select financial trades over time? Despite the advent of representation learning and end-to-end approaches, mainly through deep learning, feature selection remains a key step in many machine learning scenarios. This paper introduces a new, theoretically motivated method for feature selection. The approach, which belongs to the family of embedded methods, casts the feature selection problem as a coordinate ascent optimization in which dependencies between variables are materialized by block variables. Because it requires only a limited number of iterations, the method proves efficient for gradient boosting models, implemented here with XGBoost. For convex and smooth functions, we prove that the convergence rate is polynomial in the dimension of the full feature set. We compare the method with the state-of-the-art Recursive Feature Elimination and Binary Coordinate Ascent and show that it is competitive when selecting financial trades.
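To make the idea concrete, below is a minimal Python sketch of block coordinate ascent feature selection wrapped around XGBoost: features are grouped into blocks, and a binary inclusion mask is improved one block at a time while the other blocks are held fixed. The block partition, the cross-validation scoring, and the stopping rule are illustrative assumptions, not the paper's OCA procedure.

```python
# Minimal sketch of block coordinate ascent feature selection around XGBoost.
# Hypothetical illustration only: the block definitions, scoring function and
# stopping rule are assumptions, not the OCA algorithm from the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier


def score(mask, X, y):
    """Cross-validated accuracy of XGBoost on the currently selected features."""
    if not mask.any():
        return -np.inf
    model = XGBClassifier(n_estimators=50, max_depth=3, verbosity=0)
    return cross_val_score(model, X[:, mask], y, cv=3).mean()


def block_coordinate_ascent(X, y, blocks, n_sweeps=2):
    """Greedy ascent over one block of features at a time, the others held fixed."""
    mask = np.ones(X.shape[1], dtype=bool)           # start from the full feature set
    best = score(mask, X, y)
    for _ in range(n_sweeps):
        for block in blocks:                          # optimize one block at a time
            for j in block:                           # coordinate-wise flip inside the block
                trial = mask.copy()
                trial[j] = ~trial[j]
                s = score(trial, X, y)
                if s > best:                          # keep the flip only if the score improves
                    mask, best = trial, s
    return mask, best


if __name__ == "__main__":
    X, y = make_classification(n_samples=400, n_features=12, n_informative=4, random_state=0)
    blocks = [range(0, 4), range(4, 8), range(8, 12)]  # assumed grouping of related features
    mask, best = block_coordinate_ascent(X, y, blocks)
    print("selected features:", np.where(mask)[0], "cv accuracy:", round(best, 3))
```

In this sketch the block structure simply encodes which features are allowed to be revised together; the appeal of the approach is that each sweep only re-fits the booster on candidate masks within one block, rather than searching the full power set of features.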
Cite this paper
Saltiel, D., Benhamou, E., Laraki, R., Atif, J. (2021). Trade Selection with Supervised Learning and Optimal Coordinate Ascent (OCA). In: Bitetta, V., Bordino, I., Ferretti, A., Gullo, F., Ponti, G., Severini, L. (eds.) Mining Data for Financial Applications. MIDAS 2020. Lecture Notes in Computer Science, vol. 12591. Springer, Cham. https://doi.org/10.1007/978-3-030-66981-2_1