Instance-based learning
In machine learning, instance-based learning or memory-based learning is a family of learning algorithms that, instead of performing explicit generalization, compare new problem instances with instances seen in training, which have been stored in memory. Instance-based learning is a kind of lazy learning.
It is called instance-based because it constructs hypotheses directly from the training instances themselves.[1] This means that the hypothesis complexity can grow with the data.[1]
A simple example of an instance-based learning algorithm is the k-nearest neighbor algorithm, which classifies a new instance by a majority vote among the stored training instances closest to it. Daelemans and Van den Bosch describe variations of this algorithm for use in natural language processing (NLP), arguing that memory-based learning is more psychologically realistic than other machine-learning schemes and effective in practice.[2] A minimal sketch of the k-nearest neighbor approach is given below.
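The following is a minimal, illustrative sketch (not a reference implementation) of a k-nearest neighbor classifier; the class and variable names are hypothetical. It shows the defining trait of instance-based (lazy) learning: training merely stores the instances, and all computation is deferred to prediction time.

```python
import numpy as np

class KNearestNeighbors:
    """Minimal k-nearest neighbor classifier (illustrative sketch)."""

    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        # Lazy learning: "training" just memorizes the instances.
        self.X_train = np.asarray(X, dtype=float)
        self.y_train = np.asarray(y)
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        predictions = []
        for x in X:
            # Euclidean distance from the query to every stored instance.
            distances = np.linalg.norm(self.X_train - x, axis=1)
            # Indices of the k closest stored instances.
            nearest = np.argsort(distances)[:self.k]
            # Majority vote among the neighbors' labels.
            labels, counts = np.unique(self.y_train[nearest], return_counts=True)
            predictions.append(labels[np.argmax(counts)])
        return np.array(predictions)


if __name__ == "__main__":
    # Toy 2-D data: two clusters labeled 0 and 1.
    X = [[1.0, 1.0], [1.2, 0.8], [0.9, 1.1],
         [5.0, 5.0], [5.2, 4.8], [4.9, 5.1]]
    y = [0, 0, 0, 1, 1, 1]
    knn = KNearestNeighbors(k=3).fit(X, y)
    print(knn.predict([[1.1, 1.0], [5.1, 5.0]]))  # expected: [0 1]
```

Because every prediction compares the query against all stored instances, the hypothesis effectively grows with the data, which is the trade-off noted above: cheap training but prediction cost that scales with the size of the training set.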
References
- ^ a b Stuart Russell and Peter Norvig (2003). Artificial Intelligence: A Modern Approach, second edition, p. 733. Prentice Hall. ISBN 0-13-080302-2
- ^ Walter Daelemans and Antal van den Bosch (2005). Memory-Based Language Processing. Cambridge University Press.
External link
- TiMBL, the Tilburg Memory Based Learner, is an instance-based learning package geared toward NLP.