
Generalization error


The generalization error of a machine learning algorithm is a function that measures, after each iteration of the learning process, how far the student machine is from the teacher machine on average over the entire set of possible data that the teacher can generate. The name reflects the fact that this function indicates the capacity of a machine learning with the specified algorithm to infer, from only a few examples, the rule used by the teacher machine to generate the data (that is, to generalize).
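As a minimal sketch of this definition, the average disagreement between student and teacher can be estimated by Monte Carlo sampling over data generated by the teacher. The setup below (two linear classifiers with a zero-one loss, Gaussian inputs, and all function and variable names) is purely illustrative and is not taken from the article itself.

    import numpy as np

    def generalization_error(teacher_w, student_w, n_samples=100_000, rng=None):
        # Monte Carlo estimate of the generalization error: the probability
        # that the student labels a fresh example differently from the teacher.
        rng = np.random.default_rng() if rng is None else rng
        X = rng.standard_normal((n_samples, teacher_w.shape[0]))  # fresh examples from the input distribution
        teacher_labels = np.sign(X @ teacher_w)   # labels assigned by the teacher
        student_labels = np.sign(X @ student_w)   # labels predicted by the student
        return np.mean(teacher_labels != student_labels)

In practice this estimate is recomputed after each learning iteration, so that the decrease of the generalization error can be tracked as the student sees more examples.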

The generalization error of a perceptron is the probability that the student perceptron classifies an example differently from the teacher. It is determined by the overlap of the student and teacher synaptic vectors and is a function of their scalar product.
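For inputs drawn isotropically (for example, from a spherical Gaussian), this probability of disagreement depends only on the angle between the two synaptic vectors: writing R for their normalized scalar product, the generalization error equals (1/π) arccos R. The short sketch below compares this closed form with a direct Monte Carlo estimate; the particular dimension, student construction, and variable names are assumptions chosen for illustration.

    import numpy as np

    rng = np.random.default_rng(0)
    dim = 50
    teacher_w = rng.standard_normal(dim)
    student_w = teacher_w + 0.5 * rng.standard_normal(dim)  # a student partially aligned with the teacher

    # Normalized overlap R of the two synaptic vectors (cosine of the angle between them).
    R = teacher_w @ student_w / (np.linalg.norm(teacher_w) * np.linalg.norm(student_w))
    analytic = np.arccos(R) / np.pi   # closed-form generalization error: (1/pi) * arccos(R)

    # Monte Carlo check: fraction of fresh examples the two perceptrons classify differently.
    X = rng.standard_normal((200_000, dim))
    empirical = np.mean(np.sign(X @ teacher_w) != np.sign(X @ student_w))
    print(f"analytic: {analytic:.4f}   empirical: {empirical:.4f}")

The two numbers agree up to sampling noise, which illustrates why the generalization error of a perceptron can be expressed purely as a function of the overlap between the student and teacher synaptic vectors.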