Abstract
A new boosting algorithm, ADABOOST-RΔ, for regression problems is presented, and an upper bound on its error is derived. Experimental results comparing ADABOOST-RΔ with other learning algorithms are given.
Supported by grants CT 9305230.ST 74, CT 90.021.56.74
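The paper's ADABOOST-RΔ update rule is not reproduced on this page, so as a hedged illustration only, here is a minimal sketch of a generic AdaBoost-style booster for regression, loosely following Drucker's AdaBoost.R2 rather than the authors' algorithm. The function names (`boost_regression`, `predict`), the choice of shallow decision trees as weak learners, and all parameter values are illustrative assumptions, not details from the paper.

```python
# Sketch of an AdaBoost-style regression booster (AdaBoost.R2 flavor).
# NOT the paper's ADABOOST-R-Delta; its exact reweighting rule may differ.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def boost_regression(X, y, n_rounds=50):
    """Fit an ensemble of weak regressors on adaptively reweighted data."""
    n = len(X)
    w = np.full(n, 1.0 / n)              # uniform initial example weights
    learners, betas = [], []
    for _ in range(n_rounds):
        h = DecisionTreeRegressor(max_depth=3)   # assumed weak learner
        h.fit(X, y, sample_weight=w)
        loss = np.abs(h.predict(X) - y)
        loss /= loss.max() + 1e-12       # scale per-example losses into [0, 1]
        eps = np.dot(w, loss)            # weighted average loss this round
        if eps >= 0.5:                   # weak-learning condition violated
            break
        beta = eps / (1.0 - eps)
        w *= beta ** (1.0 - loss)        # shrink weights of well-fit examples
        w /= w.sum()
        learners.append(h)
        betas.append(beta)
    alphas = np.log(1.0 / np.array(betas))   # per-round voting weights
    return learners, alphas

def predict(learners, alphas, X):
    """Combine the weak predictions by a weighted median."""
    preds = np.array([h.predict(X) for h in learners])   # shape (T, n)
    order = np.argsort(preds, axis=0)
    cum = np.cumsum(alphas[order], axis=0)
    median_pos = np.argmax(cum >= 0.5 * alphas.sum(), axis=0)
    cols = np.arange(preds.shape[1])
    return preds[order[median_pos, cols], cols]
```

The weighted-median combination and the multiplicative weight update are the standard boosting ingredients the abstract's error bound would apply to; any resemblance to the paper's actual analysis is conjectural.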
Copyright information
© 1997 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Bertoni, A., Campadelli, P., Parodi, M. (1997). A boosting algorithm for regression. In: Gerstner, W., Germond, A., Hasler, M., Nicoud, JD. (eds) Artificial Neural Networks — ICANN'97. ICANN 1997. Lecture Notes in Computer Science, vol 1327. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0020178
DOI: https://doi.org/10.1007/BFb0020178
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-63631-1
Online ISBN: 978-3-540-69620-9
eBook Packages: Springer Book Archive