Probabilistic classification
In statistics and machine learning, a probabilistic classifier is a classifier that can predict, for a given input x, a conditional probability distribution P(Y | X = x) over a set of classes Y, rather than only the single most likely class. Probabilistic classifiers therefore provide classification with a degree of certainty, which can be useful when combining them into larger systems.
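A minimal sketch of this idea, assuming scikit-learn is available (the dataset and model choice here are illustrative, not prescribed by the article):

```python
# A probabilistic classifier returns a distribution P(Y | X) per input,
# not just a hard label.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Each row is a probability distribution over the three iris classes.
print(clf.predict_proba(X[:3]))
```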
Some classification models, such as naive Bayes, logistic regression, and multilayer perceptrons (when trained under an appropriate loss function), are naturally probabilistic. Other models, such as support vector machines, are not, but probability models can be fit on their outputs; examples of this technique include Platt scaling and isotonic regression.[1]
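As an illustration, the sketch below fits a probability model on the scores of a (non-probabilistic) linear SVM using scikit-learn's CalibratedClassifierCV; the synthetic dataset and parameter choices are assumptions for the example, and "sigmoid" corresponds to Platt scaling while "isotonic" corresponds to isotonic regression:

```python
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# LinearSVC only produces decision scores, not probabilities.
svm = LinearSVC(max_iter=10000)

# Fit a probability model on the SVM's outputs (Platt scaling here;
# use method="isotonic" for isotonic regression).
calibrated = CalibratedClassifierCV(svm, method="sigmoid", cv=5)
calibrated.fit(X_train, y_train)

print(calibrated.predict_proba(X_test[:3]))
```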
Evaluating probabilistic classification
Commonly used loss functions for probabilistic classification include log loss and the mean squared error between the predicted and the true probability distributions (the Brier score). The former is commonly used to train logistic regression models.
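A short sketch of these two metrics for a binary problem, assuming scikit-learn (the labels and predicted probabilities are made-up example values):

```python
from sklearn.metrics import brier_score_loss, log_loss

y_true = [0, 1, 1, 0]
y_prob = [0.1, 0.8, 0.7, 0.4]  # predicted P(y = 1 | x)

# Log loss (cross-entropy): penalizes confident wrong predictions heavily.
print(log_loss(y_true, y_prob))

# Brier score: mean squared error between predicted probabilities and outcomes.
print(brier_score_loss(y_true, y_prob))
```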
References
1. Niculescu-Mizil, Alexandru; Caruana, Rich (2005). "Predicting good probabilities with supervised learning" (PDF). ICML.