Talk:Winnow (algorithm)

From Wikipedia, the free encyclopedia

Removed the "insufficient context" tag

The tag was placed on 2007-10-29T16:33:23 by user Kallerdis, but the article is much improved since then. I think it does provide enough context. -Pgan002 (talk) 18:31, 22 October 2010 (UTC)

Is that update procedure correct?

The text specifies the update procedure as:

  • If an example is correctly classified, do nothing.
  • If an example is predicted to be 1 but the correct result was 0, all of the weights involved in the mistake are set to zero (demotion step).
  • If an example is predicted to be 0 but the correct result was 1, all of the weights involved in the mistake are multiplied by α (promotion step).

Step 2 moves weights to zero, and neither of the other steps changes a weight that is already zero. Because of that, I fear that the weight vector will too easily migrate towards all zeros.

So, either explain why that will not happen, or adjust the description. —Preceding unsigned comment added by 145.36.235.3 (talk) 11:30, 19 April 2011 (UTC)

That is indeed the original (Winnow1) algorithm. Roughly, any feature that is implicated in a false positive should thereafter be ignored. I think the weights cannot all go to zero if the data is linearly separable: there will always be some instance that is on the other side of the current hyperplane. And if the data is not linearly separable, all bets are off, just as with most linear classifiers. —johndburger 18:09, 6 October 2011 (UTC)
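
For concreteness, here is a minimal Python sketch of the Winnow1 update procedure as quoted above. It assumes Boolean feature vectors, a threshold θ equal to the number of features n, and a promotion factor α = 2; the function name and these defaults are illustrative choices, not taken from the article.

    def winnow1_update(weights, x, y_true, alpha=2.0, theta=None):
        """One Winnow1 step: predict, then promote or demote on a mistake."""
        if theta is None:
            theta = len(weights)          # a common choice: theta = n
        # Predict 1 iff the weighted sum over active features reaches the threshold.
        y_pred = 1 if sum(w * xi for w, xi in zip(weights, x)) >= theta else 0
        if y_pred == y_true:
            return weights                # correctly classified: do nothing
        if y_pred == 1 and y_true == 0:
            # Demotion: zero the weights of the features active in the false positive.
            return [0.0 if xi else w for w, xi in zip(weights, x)]
        # Promotion: multiply the weights of the active features by alpha.
        return [w * alpha if xi else w for w, xi in zip(weights, x)]

    # Example: four features, all weights initialised to 1.
    w = [1.0, 1.0, 1.0, 1.0]
    w = winnow1_update(w, [1, 0, 1, 0], y_true=1)   # sum = 2 < 4: false negative, promote
    print(w)                                        # [2.0, 1.0, 2.0, 1.0]

Note that, as the exchange above says, the demotion step only zeroes the weights of features that were active in the false positive; weights of inactive features are left untouched.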