Constraint (information theory)

From Wikipedia, the free encyclopedia

Constraint in information theory is the degree of statistical dependence between or among variables.

Garner[1] provides a thorough discussion of various forms of constraint (internal constraint, external constraint, total constraint) with application to pattern recognition and psychology.
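Garner's total constraint over a set of discrete variables is often identified with what is now commonly called the total correlation: the sum of the variables' individual entropies minus their joint entropy, a quantity that is zero exactly when the variables are statistically independent. The following sketch is a minimal illustration under that total-correlation reading; the function names and the example joint distribution are purely illustrative and are not drawn from Garner's text.

# A minimal sketch, assuming total constraint is quantified as the
# total correlation: sum of marginal entropies minus the joint entropy.
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero terms ignored)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def total_constraint(joint):
    """Sum of marginal entropies minus joint entropy (total correlation)."""
    marginals = [joint.sum(axis=tuple(a for a in range(joint.ndim) if a != i))
                 for i in range(joint.ndim)]
    return sum(entropy(m) for m in marginals) - entropy(joint.ravel())

# Hypothetical example: two perfectly correlated binary variables.
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(total_constraint(joint))  # -> 1.0

For two variables the quantity reduces to their mutual information: in the example the variables are perfectly correlated, so each marginal entropy is 1 bit, the joint entropy is 1 bit, and the constraint is 1 bit; independent variables would give a constraint of 0.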

References

  1. ^ Garner, W. R. (1962). Uncertainty and Structure as Psychological Concepts. New York: John Wiley & Sons.