Statistical signal processing

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Cihan (talk | contribs) at 01:47, 3 September 2007.

Statistical signal processing is an area of signal processing that treats signals as stochastic processes, dealing with their statistical properties (e.g., mean and covariance). Traditionally, it is taught at the graduate level in electrical engineering departments around the world, although important applications exist in almost all scientific fields.

In many areas, signals are modeled as functions consisting of both deterministic and stochastic components. A simple example, and a common model of many statistical systems, is a signal x(t) that consists of a deterministic part s(t) added to noise n(t), which in many situations can be modeled as white Gaussian noise:

x(t) = s(t) + n(t)

where n(t) is a white Gaussian noise process.
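This additive signal-plus-noise model can be sketched numerically. The following is a hypothetical illustration using NumPy; the sinusoidal deterministic part and the noise level sigma are assumptions chosen for demonstration, not taken from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000
t = np.arange(N)

s = np.sin(2 * np.pi * t / 100)       # deterministic component s(t)
sigma = 0.5                           # noise standard deviation (assumed)
n = rng.normal(0.0, sigma, size=N)    # white Gaussian noise n(t)

x = s + n                             # observed signal x(t) = s(t) + n(t)
```

Each sample of x is the deterministic value plus an independent Gaussian draw, which is exactly the additive model described above.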

White noise simply means that the noise process is completely uncorrelated. As a result, its autocorrelation function is an impulse:

R_n(τ) = σ² δ(τ)

where δ(τ) is the Dirac delta function and σ² is the noise variance.
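The impulse-shaped autocorrelation can be checked empirically: the sample autocorrelation of white Gaussian noise is close to σ² at lag zero and close to zero at every other lag. A minimal sketch, assuming unit-variance noise and a simple biased lag estimator (names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 1.0
n = rng.normal(0.0, sigma, size=100_000)   # white Gaussian noise samples

def autocorr(x, k):
    """Sample estimate of R(k) = E[x[m] * x[m + k]]."""
    if k == 0:
        return float(np.mean(x * x))
    return float(np.mean(x[:-k] * x[k:]))

r0 = autocorr(n, 0)   # close to sigma**2
r1 = autocorr(n, 1)   # close to 0
r5 = autocorr(n, 5)   # close to 0
```

With enough samples, the nonzero lags shrink toward zero while lag zero stays near the noise variance, approximating the impulse R_n(τ) = σ² δ(τ).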

Given information about a statistical system and the random variable from which it is derived, we can increase our knowledge of the output signal; conversely, given the statistical properties of the output signal, we can infer the properties of the underlying random variable.
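As a concrete instance of the second direction, the statistics of the output signal can be used to infer the underlying parameters. A hypothetical sketch, assuming the deterministic part is an unknown constant A and the noise variance is also unknown (the values of A and sigma below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: the deterministic part is an unknown constant A and
# the noise variance sigma**2 is also unknown; both are inferred from x.
A, sigma = 3.0, 0.7
x = A + rng.normal(0.0, sigma, size=50_000)

A_hat = x.mean()             # sample mean estimates the deterministic part
var_hat = x.var(ddof=1)      # sample variance estimates the noise power
```

Here the sample mean recovers the deterministic component and the sample variance recovers the noise power, illustrating inference of the underlying statistics from the observed output alone.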

These statistical techniques are developed in estimation theory, detection theory, and numerous related fields that rely on statistical information to maximize their efficiency.
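As one illustration of detection theory, consider deciding between "noise only" and "known constant signal plus noise" from N samples. A minimal sketch using a sample-mean threshold; all parameters below are illustrative assumptions, not from the article:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical detection problem: decide whether a known constant signal A
# is present in white Gaussian noise, given N samples per observation.
A, sigma, N = 1.0, 1.0, 25
trials = 10_000

noise_only = rng.normal(0.0, sigma, size=(trials, N))
signal_plus_noise = A + rng.normal(0.0, sigma, size=(trials, N))

# Threshold the sample mean at A/2 (the minimum-error rule when both
# hypotheses are equally likely and the noise is Gaussian).
p_false_alarm = np.mean(noise_only.mean(axis=1) > A / 2)
p_detection = np.mean(signal_plus_noise.mean(axis=1) > A / 2)
```

Averaging N samples shrinks the noise variance by a factor of N, so even a signal at the same amplitude as the noise becomes reliably detectable; this trade-off between false alarms and detections is the core concern of detection theory.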
