
Talk:Confusion matrix

WikiProject Robotics (Stub-class, Mid-importance)
This article is within the scope of WikiProject Robotics, a collaborative effort to improve the coverage of Robotics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has been rated as Stub-class on Wikipedia's content assessment scale.
This article has been rated as Mid-importance on the project's importance scale.

Accuracy

We badly need clarification of the definitions of producer's and user's accuracy, which are closely associated with the confusion matrix. — Preceding unsigned comment added by Ctzcheng (talk | contribs) 17:26, 10 March 2008 (UTC)
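For what it is worth, here is a minimal sketch (my own illustration in Python, not taken from the article, with made-up numbers) of how producer's and user's accuracy are commonly computed from a confusion matrix, assuming rows hold the actual (reference) classes and columns the predicted classes:

confusion = [
    [50,  3,  2],   # actual class A
    [ 4, 45,  6],   # actual class B
    [ 1,  2, 40],   # actual class C
]

n = len(confusion)
for i in range(n):
    correct = confusion[i][i]
    row_total = sum(confusion[i])                        # samples actually in class i
    col_total = sum(confusion[r][i] for r in range(n))   # samples predicted as class i
    producers_acc = correct / row_total   # producer's accuracy, i.e. recall for class i
    users_acc = correct / col_total       # user's accuracy, i.e. precision for class i
    print(f"class {i}: producer's = {producers_acc:.2f}, user's = {users_acc:.2f}")

Under this layout, producer's accuracy answers "how often was a reference sample of this class classified correctly?" and user's accuracy answers "how often is a sample labelled with this class actually that class?"; if the matrix is laid out the other way, the row and column totals simply swap roles.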

Merger

I suggest that Table of confusion be merged into this article, at this address. The topic is the same, and there should be only one article in order to avoid confusion //end of lame joke//. --Ben T/C 15:46, 21 May 2007 (UTC)


I do not support this change of name. "Confusion matrix" has been used forever in Speech Recognition, and in some other Pattern Recognition tasks, although I cannot trace the ancestry of the use. For instance, some fairly standard sequence recognition toolkits like HTK have tools specifically designed to obtain this "confusion matrix".

I do grant you that most of the time what we see is a table (especially if reading it from paper), and I guess that the "table of confusion" stuff comes from statistics and people who developed their field before computers even existed.

In communications we call a related diagram a ROC (Receiver_operating_characteristic), each of whose working points corresponds to a table of confusion. I suggest that "table of confusion" be merged in there and that "confusion matrix" be improved. --FJValverde 09:24, 14 June 2007 (UTC)
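To make the "each working point is a table of confusion" remark concrete, here is a small sketch (a toy example of my own in Python, with made-up scores and labels): sweeping a decision threshold over classifier scores yields one 2x2 table of confusion per threshold, and the resulting (false positive rate, true positive rate) pairs are the points of the ROC curve.

# Hypothetical classifier scores and true labels (1 = positive, 0 = negative).
scores = [0.95, 0.80, 0.70, 0.55, 0.40, 0.30, 0.20, 0.10]
labels = [1,    1,    0,    1,    0,    1,    0,    0]

for threshold in [0.9, 0.6, 0.35, 0.15]:
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    tn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 0)
    tpr = tp / (tp + fn)   # true positive rate, the ROC y-axis
    fpr = fp / (fp + tn)   # false positive rate, the ROC x-axis
    # [[tp, fn], [fp, tn]] is the table of confusion at this working point.
    print(f"threshold={threshold}: TP={tp} FN={fn} FP={fp} TN={tn} -> ROC point (FPR={fpr:.2f}, TPR={tpr:.2f})")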


The idea is to provide as much information access for as wide an audience as possible here. Since the two are the same thing with different terms, what makes sense is merging them while redirecting searches for either term to this one page. -- user AOberai, 14 Aug 2007

Geography

Just to further confuse things, confusion matrices aren't solely used in AI (as this article would suggest). A confusion matrix is also used in Earth Observation when validating thematic classifications.


Yes, I believe AI is too narrow in this discussion. I suggest "Pattern Recognition" is the actual context where confusion matrices make sense. FJValverde 09:01, 14 June 2007 (UTC)

I think they are used more generally in statistics, be it for pattern recognition or earth observation. --Ben T/C 07:41, 20 June 2007 (UTC)

Er... In my very limited historical view of both statistics and PR, the latter actually sprang from the former, but has since gained some independence: not all techniques in PR are statistical (or even probabilistic). However, I think that the confusion matrix is properly a PR concept, in the sense that an n-to-m classifier is a very basic PR task. In this sense earth observation and "thematic classification" (meaning classifying the type of soil and such based on the images taken by satellites, right?) is strictly a type of PR task. --FJValverde 08:47, 22 June 2007 (UTC)

Missing Labeling of Matrix Columns/Rows

Please add labels to the matrix indicating which are the actual values and which are the predicted values. It becomes clear from reading the text, but please note that the article on Receiver Operating Characteristic links here, and over there the confusion matrix is transposed (but labeled). Stevemiller 04:30, 9 October 2007 (UTC)
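As an illustration of why the labels matter, here is a small sketch (my own hypothetical example in Python) that prints a confusion matrix with the rows labeled as actual classes and the columns as predicted classes; swapping the two roles gives the transposed layout used in the Receiver operating characteristic article, and only the labels disambiguate the two.

# Hypothetical counts: rows = actual class, columns = predicted class.
classes = ["cat", "dog"]
counts = {("cat", "cat"): 42, ("cat", "dog"): 8,
          ("dog", "cat"): 5,  ("dog", "dog"): 45}

print("actual \\ predicted".ljust(20) + "".join(c.rjust(8) for c in classes))
for actual in classes:
    row = actual.ljust(20) + "".join(str(counts[(actual, pred)]).rjust(8) for pred in classes)
    print(row)
# To get the transposed convention (rows = predicted, columns = actual), iterate
# over predicted classes in the outer loop instead; the numbers are the same,
# so explicit row/column labels are what keep the reader oriented.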