
Worst-case complexity


In computer science, the worst-case complexity of an algorithm is a measure of its performance. It is defined as the maximum of the chosen complexity measure of the algorithm, taken over all possible input instances of the same size.
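Written symbolically (a sketch, using t(x) to denote the chosen complexity measure on an input x), the worst-case complexity at input size n is

W(n) = max { t(x) : x is an input instance of size n },

so W(n) is an upper bound on the cost of the algorithm over every input of that size; such bounds are commonly stated asymptotically in big O notation.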

The complexity measure most commonly used is the running time of the algorithm, although other measures may also be used (the amount of memory used, the number of packets sent over a network, etc.). The size measure most commonly used is the number of bits comprising the input instance, although, again, other measures may be used (the number of vertices in a graph, etc.).
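For example (a minimal illustrative sketch in Python; the function name and the convention of counting comparisons are assumptions, not part of the article), linear search through a list of n elements performs its maximum number of comparisons when the target is absent or appears in the last position:

def linear_search(items, target):
    """Return (index of target or -1, number of comparisons performed)."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1
        if value == target:
            return i, comparisons   # early exit: fewer comparisons on favourable inputs
    return -1, comparisons          # target absent: all len(items) comparisons are used

Taking the number of comparisons as the complexity measure, the worst-case complexity of this search is n, even though many inputs of size n finish after far fewer comparisons.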

See also

References

Further reading

Paul E. Black, "big-O notation", in Dictionary of Algorithms and Data Structures [online], Paul E. Black, ed., U.S. National Institute of Standards and Technology, 20 November 2008. Retrieved 20 February 2009. Available from: http://www.itl.nist.gov/div897/sqg/dads/HTML/bigOnotation.html

Michael Sipser (1997). Introduction to the Theory of Computation. PWS Publishing.