
Optimal discriminant analysis and classification tree analysis


Optimal discriminant analysis (ODA) and the related classification tree analysis (CTA) are statistical methods that maximize predictive accuracy. For any specific sample and exploratory or confirmatory hypothesis, ODA identifies the statistical model that yields maximum predictive accuracy, assesses the exact Type I error rate, and evaluates potential cross-generalizability. ODA may be applied in one or more dimensions, with the one-dimensional case referred to as UniODA and the multidimensional case referred to as MultiODA. Classification tree analysis is a generalization of optimal discriminant analysis to non-orthogonal trees. ODA and CTA may be used to find the combination of variables and cut points that best separates classes of objects or events. These variables and cut points may then be used to reduce dimensionality and to build a statistical model that optimally describes the data, as illustrated in the sketch below.
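As a rough illustration of the one-dimensional case (this is a simplified sketch, not the published ODA software; the function name uni_oda_cut and the simulated data are assumptions made only for this example), one can search over candidate cut points on a single predictor and keep the cut that maximizes classification accuracy for a two-class problem:

```python
# Minimal UniODA-style sketch (illustrative assumption, not the ODA package):
# for one predictor x and a binary class label y, find the cut point that
# maximizes the fraction of observations classified correctly.
import numpy as np

def uni_oda_cut(x, y):
    """Return (best_cut, best_accuracy) for a one-dimensional two-class problem."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y)
    # Candidate cut points lie midway between adjacent distinct values of x.
    xs = np.unique(x)
    cuts = (xs[:-1] + xs[1:]) / 2.0
    classes = np.unique(y)          # assumes exactly two classes
    best_cut, best_acc = None, 0.0
    for c in cuts:
        # Try both orientations: class A below the cut, or class B below it.
        for lo, hi in ((classes[0], classes[1]), (classes[1], classes[0])):
            pred = np.where(x <= c, lo, hi)   # assign class by side of the cut
            acc = np.mean(pred == y)          # fraction correctly classified
            if acc > best_acc:
                best_cut, best_acc = c, acc
    return best_cut, best_acc

# Example: two overlapping groups measured on one variable.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 50), rng.normal(2.0, 1.0, 50)])
y = np.array([0] * 50 + [1] * 50)
print(uni_oda_cut(x, y))
```

In the full method, the exact Type I error rate of the selected cut point is then assessed (for example by permutation), and cross-generalizability is evaluated on held-out data; those steps are omitted from this sketch.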

Optimal discriminant analysis may be thought of as a generalization of Fisher's linear discriminant analysis. In this sense, optimal discriminant analysis is an alternative to ANOVA (analysis of variance) and regression analysis, which attempt to express one dependent variable as a linear combination of other features or measurements. However, ANOVA and regression analysis yield a dependent variable that is a numerical variable, while optimal discriminant analysis yields a dependent variable that is a class variable.
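The distinction can be seen in a toy comparison (the variable names and data below are illustrative assumptions, not part of the source): regression returns a continuous number, whereas a discriminant rule returns a class label.

```python
# Contrast between a numerical and a categorical dependent variable.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y_class = np.array([0, 0, 0, 1, 1, 1])       # class membership (categorical outcome)

# Regression: predicts a number (here the class codes are treated as numbers).
slope, intercept = np.polyfit(x, y_class, 1)
print(slope * 3.5 + intercept)                # a continuous value, e.g. about 0.5

# Discriminant rule: predicts a class by comparing x to a cut point.
cut = 3.5
print(int(3.5 > cut))                         # a class label, 0 or 1
```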

See also

References

  • Fisher, R. A. (1936). "The Use of Multiple Measurements in Taxonomic Problems" (PDF). Annals of Eugenics. 7: 179–188. Retrieved 2009-05-09.
  • Martinez, A. M.; Kak, A. C. (2001). "PCA versus LDA" (PDF). IEEE Transactions on Pattern Analysis and Machine Intelligence. 23 (2): 228–233. doi:10.1109/34.908974.
  • Mika, S.; et al. (1999). "Fisher Discriminant Analysis with Kernels". IEEE Conference on Neural Networks for Signal Processing IX: 41–48. doi:10.1109/NNSP.1999.788121.