Abstract
In the previous chapter we discussed some problems with learning a prediction function by choosing its parameters to minimise a cost function. That approach, however, assumes we already know how many parameters should be used. This question is tied to overfitting: any finite set of data points, for example, can be fitted exactly by a polynomial of sufficiently high order. Figure 7.1 shows a plot of a signal together with noisy measurements of it. Minimising the MSE could cause a model to fit these noisy points perfectly, capturing the noise rather than the underlying signal.
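The polynomial example can be sketched numerically. The code below is an illustrative sketch (not taken from the chapter): it draws noisy samples of an assumed sine signal, then compares the training MSE of a low-order polynomial fit against a degree-(n−1) fit, which can pass through all n points exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

# n noisy measurements of an assumed underlying signal (sine chosen for illustration)
n = 8
x = np.linspace(0.0, 1.0, n)
signal = np.sin(2 * np.pi * x)
y = signal + rng.normal(0.0, 0.2, n)   # noisy measurements

# Low-order fit: approximates the signal but leaves residual error on the samples.
low = np.polyval(np.polyfit(x, y, 3), x)

# Degree-(n-1) fit: interpolates every noisy point, driving training MSE to ~0.
high = np.polyval(np.polyfit(x, y, n - 1), x)

mse_low = np.mean((y - low) ** 2)
mse_high = np.mean((y - high) ** 2)
print(f"train MSE, degree 3:     {mse_low:.4f}")
print(f"train MSE, degree n - 1: {mse_high:.2e}")
```

The high-order fit wins on the training points, but it has fitted the noise: on fresh samples from the same signal its error would generally be far worse than the low-order model's.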
Copyright information
© 2002 Springer-Verlag London
Cite this chapter
Shadbolt, J. (2002). Overfitting, Generalisation and Regularisation. In: Shadbolt, J., Taylor, J.G. (eds) Neural Networks and the Financial Markets. Perspectives in Neural Computing. Springer, London. https://doi.org/10.1007/978-1-4471-0151-2_7
DOI: https://doi.org/10.1007/978-1-4471-0151-2_7
Publisher Name: Springer, London
Print ISBN: 978-1-85233-531-1
Online ISBN: 978-1-4471-0151-2