Queueing delay

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Achillezzz (talk | contribs) at 17:29, 5 September 2008 (See also). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

In computer engineering, a queueing delay is the time a job waits in a queue until it can be executed.

This term is most often used in reference to routers. When packets arrive at a router, they have to be processed and transmitted. A router can only process one packet at a time. If packets arrive faster than the router can process them (such as in a burst transmission), the router puts them into the queue (also called the buffer) until it can transmit them.

The maximum queueing delay is proportional to buffer size: the longer the line of packets waiting to be transmitted, the longer the average waiting time. However, a long buffer is generally preferable to a short one, because a short buffer fills sooner and forces packets to be discarded ("dropped"), and retransmitting dropped packets lengthens overall transmission times far more than the extra waiting does.
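The effect described above can be sketched with a small calculation (not from the original article): on a first-come-first-served link, a packet's queueing delay is the time between its arrival and the moment its transmission begins, so each packet in a burst waits for all the packets ahead of it.

```python
def queueing_delays(arrival_times, transmission_time):
    """Per-packet queueing delay on a FIFO link.

    arrival_times: sorted packet arrival times (seconds).
    transmission_time: time the link needs to send one packet.
    Returns the time each packet spends waiting in the buffer
    before its own transmission starts.
    """
    delays = []
    link_free_at = 0.0  # time at which the link finishes its current packet
    for t in arrival_times:
        start = max(t, link_free_at)   # wait if the link is still busy
        delays.append(start - t)       # queueing delay for this packet
        link_free_at = start + transmission_time
    return delays

# A burst of 5 packets arriving together at t=0, each taking 2 ms to send:
# the n-th packet waits (n-1) transmission times.
print(queueing_delays([0.0] * 5, 2.0))  # [0.0, 2.0, 4.0, 6.0, 8.0]
```

The last packet of the burst waits for all four packets ahead of it, which is why the average waiting time grows with the length of the line.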

The M/M/1/K queueing model, where K is the size of the buffer, may be used to analyze the queueing delay in a specific system; see [1].
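As an illustration of the analysis, the standard closed-form M/M/1/K results can be evaluated directly. The sketch below assumes Poisson arrivals at rate `lam`, exponential service at rate `mu`, and (as is conventional for this model) takes K as the total system capacity, including the packet in service; the parameter values are arbitrary examples, not from the article.

```python
def mm1k_metrics(lam, mu, K):
    """Steady-state blocking probability and mean queueing delay
    for an M/M/1/K queue, using the textbook formulas.
    Assumes lam != mu (i.e. utilization rho != 1)."""
    rho = lam / mu
    # State probabilities: P_n = (1 - rho) * rho**n / (1 - rho**(K+1))
    p_block = (1 - rho) * rho**K / (1 - rho**(K + 1))  # arrival finds buffer full
    # Mean number of packets in the system
    L = rho / (1 - rho) - (K + 1) * rho**(K + 1) / (1 - rho**(K + 1))
    lam_eff = lam * (1 - p_block)  # rate of arrivals actually accepted
    W = L / lam_eff                # mean time in system, by Little's law
    Wq = W - 1 / mu                # queueing delay = time in system minus service
    return p_block, Wq

# Example: 8 packets/s offered to a 10 packets/s link with room for 5 packets.
pb, wq = mm1k_metrics(lam=8.0, mu=10.0, K=5)
print(f"blocking probability {pb:.3f}, mean queueing delay {wq * 1000:.1f} ms")
```

As K grows, the blocking probability vanishes and the queueing delay approaches the infinite-buffer M/M/1 value lam / (mu * (mu - lam)), illustrating the buffer-size trade-off described above.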

References
