
Synonyms

Error

Definition

Error rate is a measure of the degree of prediction error of a model with respect to the true model.

The term error rate is often applied in the context of classification models. In this context, error rate = P(λ(X) ≠ Y), where (X, Y) is drawn from a joint distribution over instances and labels and the classification model λ is a function from X to Y. Sometimes this quantity is expressed as a percentage rather than a value between 0.0 and 1.0.
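As an illustration (not part of the original entry), the following Python sketch estimates this probability empirically from a labelled sample; the classifier model, the instances X, and the labels y below are hypothetical stand-ins.

```python
import numpy as np

def classification_error_rate(model, X, y):
    """Fraction of instances whose predicted label differs from the true label."""
    predictions = np.array([model(x) for x in X])
    return float(np.mean(predictions != np.asarray(y)))

# Toy threshold classifier on one-dimensional inputs (hypothetical data).
model = lambda x: int(x > 0.5)
X = np.array([0.1, 0.4, 0.6, 0.9])
y = np.array([0, 1, 1, 1])
print(classification_error_rate(model, X, y))  # 0.25, i.e. 25% as a percentage
```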

Two common measures of error rate for regression models are mean squared error and mean absolute error.
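For concreteness, here is a minimal Python sketch of these two measures; the arrays y_true and y_pred are hypothetical examples, not data from the entry.

```python
import numpy as np

def mean_squared_error(y_true, y_pred):
    # Average of the squared differences between true and predicted values.
    return float(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

def mean_absolute_error(y_true, y_pred):
    # Average of the absolute differences between true and predicted values.
    return float(np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred))))

y_true = [3.0, 1.0, 4.0]
y_pred = [2.5, 1.5, 4.0]
print(mean_squared_error(y_true, y_pred))   # ~0.167
print(mean_absolute_error(y_true, y_pred))  # ~0.333
```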

The error rate of a model is often assessed or estimated by applying it to test data for which the class labels (Y values) are known. The error rate of a classifier on test data may be calculated as the number of incorrectly classified objects divided by the total number of objects. Alternatively, a smoothing function may be applied, such as a Laplace estimate or an m-estimate.
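The following Python sketch illustrates these test-set estimates. The specific Laplace and m-estimate forms below (adding pseudo-counts, and blending the observed rate with a prior weighted as m extra observations) are common formulations assumed here, not prescribed by the entry.

```python
def raw_error_rate(n_errors, n_total):
    # Plain test-set estimate: incorrectly classified objects / total objects.
    return n_errors / n_total

def laplace_error_rate(n_errors, n_total):
    # Laplace correction (assumed form): add one pseudo-error and one pseudo-success.
    return (n_errors + 1) / (n_total + 2)

def m_estimate_error_rate(n_errors, n_total, m=10.0, prior=0.1):
    # m-estimate (assumed form): blend the observed rate with a prior error rate,
    # weighted as if the prior were based on m additional observations.
    return (n_errors + m * prior) / (n_total + m)

print(raw_error_rate(3, 100))        # 0.03
print(laplace_error_rate(3, 100))    # ~0.039
print(m_estimate_error_rate(3, 100)) # ~0.036 with m=10, prior=0.1
```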

Error rate is directly related to accuracy: error rate = 1.0 − accuracy (or, when both are expressed as percentages, error rate = 100% − accuracy).
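A brief illustration of this relationship, using hypothetical numbers:

```python
accuracy = 0.75                      # as a fraction (hypothetical value)
error_rate = 1.0 - accuracy          # 0.25
error_rate_percent = 100.0 - 75.0    # 25.0, when both are expressed as percentages
print(error_rate, error_rate_percent)
```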


Copyright information

© 2011 Springer Science+Business Media, LLC

About this entry

Cite this entry

Ting, K.M. (2011). Error Rate. In: Sammut, C., Webb, G.I. (eds) Encyclopedia of Machine Learning. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-30164-8_262
