Abstract
The paper presents a competitive-prediction-style upper bound on the cumulative square loss of the Aggregating Algorithm for Regression with Changing Dependencies in the linear case. The algorithm competes with any sequence of linear predictors provided the sum of squared Euclidean norms of the differences of consecutive regression coefficient vectors grows at a sublinear rate.
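For orientation, the underlying (stationary) Aggregating Algorithm for Regression — the Vovk–Azoury–Warmuth forecaster that the changing-dependencies variant builds on — admits a compact closed-form prediction rule. The sketch below is illustrative only: it implements the plain linear AAR, not the changing-dependencies algorithm analysed in the paper, and the function name `aar_predict` and parameter `a` (the ridge-style regularisation constant) are my own labels.

```python
import numpy as np

def aar_predict(xs, ys, x_new, a=1.0):
    """Plain Aggregating Algorithm for Regression (Vovk-Azoury-Warmuth)
    prediction: a ridge-like estimate whose regularised Gram matrix
    already includes the new input x_new.

    xs    : (t, n) array of past inputs
    ys    : (t,)   array of past outcomes
    x_new : (n,)   new input to predict on
    a     : regularisation constant (a > 0)
    """
    n = x_new.shape[0]
    # A = a*I + x_new x_new' + sum of past outer products x_s x_s'
    A = a * np.eye(n) + np.outer(x_new, x_new)
    b = np.zeros(n)
    if len(ys) > 0:
        A += xs.T @ xs
        b = xs.T @ ys          # b = sum of y_s x_s over past steps
    # Prediction is x_new' A^{-1} b
    return float(x_new @ np.linalg.solve(A, b))
```

With no past data the prediction is 0; as observations accumulate the forecaster approaches the best fixed linear predictor. The paper's variant additionally tracks a *changing* sequence of coefficient vectors, which this stationary sketch does not capture.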
Notes
1. The fact that the powers of n and T in the bound sum to 1 makes a straightforward kernelisation of the bound via the representer theorem useless. This observation may lead to a lower bound.
Acknowledgement
The author has been supported by the Leverhulme Trust through the grant RPG-2013-047 ‘Online self-tuning learning algorithms for handling historical information’. The author would like to thank Vladimir Vovk, Dmitry Adamskiy, and Vladimir V’yugin for useful discussions. Special thanks to Alexey Chernov, who helped to simplify the statement of the main result.
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this paper
Kalnishkan, Y. (2016). An Upper Bound for Aggregating Algorithm for Regression with Changing Dependencies. In: Ortner, R., Simon, H., Zilles, S. (eds) Algorithmic Learning Theory. ALT 2016. Lecture Notes in Computer Science, vol. 9925. Springer, Cham. https://doi.org/10.1007/978-3-319-46379-7_16
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-46378-0
Online ISBN: 978-3-319-46379-7