Time deviation

From Wikipedia, the free encyclopedia

Time deviation (TDEV),[1] also known as σ_x(τ), measures the time stability of a clock source's phase over an observation interval, expressed as a standard deviation of the time variations; it thus indicates the time instability of the signal source. It is the time-domain counterpart of the frequency stability measured by the Allan deviation. TDEV is commonly defined as a scaled variant of the modified Allan deviation, but other estimators may be used.

Time variance (TVAR), symbolised as σ_x²(τ), is the square of TDEV: the time variance of the phase versus the observation interval τ. It is a scaled variant of the modified Allan variance.

TDEV is a metric often used to assess the quality of timing signals in telecommunication applications; it is a statistical analysis of the phase stability of a signal over a given period. Measurements of a reference timing signal typically report its TDEV and maximum time interval error (MTIE) values and compare them against specified masks or goals.

Definition


The most common estimator uses the modified Allan variance:

\sigma_x^2(\tau) = \frac{\tau^2}{3}\,\operatorname{Mod}\,\sigma_y^2(\tau) = \frac{1}{6n^2(N-3n+1)} \sum_{j=1}^{N-3n+1} \left[ \sum_{i=j}^{n+j-1} \left( x_{i+2n} - 2x_{i+n} + x_i \right) \right]^2

where τ = nτ₀, with τ₀ the base measurement interval, n the averaging factor, and x_i the N time-error (phase) samples. The 3 in the denominator normalizes TVAR to be equal to the classical variance if the deviations in x are random and uncorrelated (white noise).

Alternatively, TDEV, which is the square root of TVAR, may be derived from the modified Allan deviation (MDEV):

\sigma_x(\tau) = \frac{\tau}{\sqrt{3}}\,\operatorname{Mod}\,\sigma_y(\tau)
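In code, this scaling from MDEV to TDEV is a one-liner (a sketch; mdev is assumed to be a modified Allan deviation value already computed at observation interval tau, in seconds):

```python
import math

def tdev_from_mdev(mdev, tau):
    # TDEV(tau) = (tau / sqrt(3)) * MDEV(tau)
    return tau * mdev / math.sqrt(3)
```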

References
