
Maximum time interval error

From Wikipedia, the free encyclopedia

Maximum time interval error (MTIE) is a measure of the worst-case phase variation of a signal with respect to a perfect signal over a given period of time. It is used to specify clock stability requirements in telecommunications standards.[1] MTIE measurements can be used to detect clock instability that can cause data loss on a communications channel.[2]
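The definition above can be sketched as a simple computation. Assuming the phase variation is available as uniformly sampled time-interval-error (TIE) values, MTIE for a given observation window is the largest peak-to-peak TIE variation found in any window of that length; the function name `mtie` and the sample data here are illustrative, not from a standard library or the cited references.

```python
def mtie(tie_samples, window):
    """Largest peak-to-peak TIE variation over all windows of `window` samples.

    `tie_samples` are illustrative uniformly sampled TIE values (e.g. in ns).
    This is a naive O(N * window) sweep, kept simple for clarity.
    """
    if window < 2 or window > len(tie_samples):
        raise ValueError("window must be between 2 and len(tie_samples)")
    worst = 0.0
    for start in range(len(tie_samples) - window + 1):
        chunk = tie_samples[start:start + window]
        worst = max(worst, max(chunk) - min(chunk))
    return worst

# Hypothetical example: a clock whose phase drifts and then recovers.
tie = [0.0, 1.0, 3.0, 2.0, 0.5, -1.0, 0.0]
print(mtie(tie, 3))  # -> 3.0 (worst 3-sample window spans 3.0 units)
```

In practice MTIE is evaluated across a range of window lengths and plotted against observation time, so that the measured curve can be compared with the mask a standard specifies.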

References

  1. ^ Stefano Bregni (October 1996). "Measurement of Maximum Time Interval Error for Telecommunications Clock Stability Characterization" (PDF). IEEE Transactions on Instrumentation and Measurement. Retrieved 2012-05-24.
  2. ^ "Time and Frequency from A to Z". NIST. Retrieved 2012-05-24.
