User:Flow~enwiki
Linear Discriminants
What they are: classifiers that separate two classes with a straight line (a hyperplane in more than two dimensions); see the sketch after this list
Advantages
- fastest, simplest possible classifier
Disadvantages
- simplistic, with a high probability of error; however, this can be mitigated by:
1) transforming the data with non-linear functions into a higher-dimensional space and fitting the straight line in that space (the approach of Support Vector Machines)
OR
2) combining several discriminants into a single classifier by i) voting, ii) aggregation, or iii) neural networks (which combine many linear units through non-linear activations rather than simple voting)
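To make the straight-line decision rule concrete, here is a minimal sketch in Python. The weight vector w and bias b are hand-picked assumptions purely for illustration; in practice they would be learned, e.g. by the perceptron or MSE procedures below.
<syntaxhighlight lang="python">
import numpy as np

# Hand-picked parameters of the line w.x + b = 0 (assumptions for illustration, not learned).
w = np.array([1.0, -2.0])   # normal vector of the separating line
b = 0.5                     # offset of the line from the origin

def classify(x):
    """Return +1 or -1 depending on which side of the line the point x falls."""
    return 1 if np.dot(w, x) + b >= 0 else -1

print(classify(np.array([3.0, 0.0])))   # -> 1
print(classify(np.array([0.0, 3.0])))   # -> -1
</syntaxhighlight>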
Perceptron algorithm
What it does: iteratively finds a linear discriminant between two clusters (sketched in code after this list)
Advantages
- will always find a discriminant, if one exists
Disadvantages
- the discriminant it finds is usually neither optimal nor close to optimal
- halting the algorithm before it converges can leave a boundary that misclassifies many points
- may require arbitrarily many iterations: the required number grows as the gap between the clusters shrinks (the classical bound is inversely proportional to the square of the margin)
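A sketch of the classical perceptron learning rule on a small, made-up, linearly separable data set; the clusters, labels and learning loop below are illustrative assumptions, not data from this page.
<syntaxhighlight lang="python">
import numpy as np

# Two toy clusters with labels +1 / -1 (assumed for illustration).
X = np.array([[2.0, 1.0], [3.0, 2.0], [2.5, 0.5],        # cluster +1
              [-1.0, -1.0], [-2.0, 0.0], [-1.5, -2.0]])  # cluster -1
y = np.array([1, 1, 1, -1, -1, -1])

# Append a constant 1 to each point so the bias is learned as part of w.
Xa = np.hstack([X, np.ones((len(X), 1))])
w = np.zeros(Xa.shape[1])

# Cycle through the data, correcting w on every misclassified point.
# Terminates in finitely many updates when the clusters are linearly
# separable; on non-separable data it would loop forever.
converged = False
while not converged:
    converged = True
    for xi, yi in zip(Xa, y):
        if yi * np.dot(w, xi) <= 0:   # misclassified (or exactly on the boundary)
            w += yi * xi              # perceptron update
            converged = False

print("learned discriminant:", w)
print("all points classified correctly:", bool(np.all(np.sign(Xa @ w) == y)))
</syntaxhighlight>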
Minimum Square Error (MSE)
What it does: finds a linear discriminant between clusters as the least-squares solution, minimising the squared error between the discriminant's output and the target labels summed over all points in both clusters (not just the closest ones); see the sketch after this list
Advantages
- will always give a 'reasonable' result, even if the clusters are not separable
Disadvantages
- sensitive to outliers: points far from the boundary dominate the squared error, so the solution is not guaranteed to separate the clusters even when they are linearly separable
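A sketch of the minimum-square-error discriminant on the same kind of toy data, with the targets coded as +1 / -1; the least-squares solution minimises the squared error summed over every point in both clusters. The data are illustrative assumptions.
<syntaxhighlight lang="python">
import numpy as np

# Same toy clusters as in the perceptron sketch (assumed for illustration).
X = np.array([[2.0, 1.0], [3.0, 2.0], [2.5, 0.5],
              [-1.0, -1.0], [-2.0, 0.0], [-1.5, -2.0]])
y = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])   # +/-1 target coding

# Augmented data matrix: the bias is folded into the weight vector.
Xa = np.hstack([X, np.ones((len(X), 1))])

# Least-squares solution minimising sum_i (w.x_i - y_i)^2 over all points,
# not just the ones closest to the boundary.
w, *_ = np.linalg.lstsq(Xa, y, rcond=None)

print("MSE discriminant:", w)
print("predicted labels:", np.sign(Xa @ w))
</syntaxhighlight>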
Some examples of mathematical expressions:
<math>\sin x + \ln y \alpha a^2</math>
<math>T_P = \frac{\pi}{\omega_n \sqrt{1-\zeta^2}}</math>