Worst-case complexity

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Beekeepingschool (talk | contribs) at 19:09, 17 February 2009 (typo). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

In computer science, the worst-case complexity of an algorithm is a measure of its performance. It is defined as the maximum of the complexity measure of the algorithm over all possible input instances of the same size.

The complexity measure most commonly used is the running time of the algorithm, although other measures may also be used (the amount of memory used, the number of packets sent over a network, etc.). The size measure most commonly used is the number of bits comprising the input instance, although, again, other measures may be used (the number of vertices in a graph, etc.).
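The definition can be mirrored directly in a short sketch (not part of the article): take linear search as the algorithm, the number of comparisons as the complexity measure, and the list length as the input size, then compute the maximum over all input instances of that size. The function names here are illustrative, not standard.

```python
from itertools import permutations

def linear_search_steps(items, target):
    """Complexity measure: count the comparisons linear search makes."""
    steps = 0
    for x in items:
        steps += 1
        if x == target:
            break
    return steps

def worst_case_steps(n):
    """Worst-case complexity for input size n: maximize the measure
    over all input instances of that size (here, all orderings of
    0..n-1, searching for the value 0)."""
    return max(
        linear_search_steps(list(p), 0) for p in permutations(range(n))
    )

print(worst_case_steps(4))  # 4: the worst input places the target last
```

For linear search the worst case always makes n comparisons (the target sits at the end), so the exhaustive maximum recovers the familiar O(n) bound; real analyses derive this bound symbolically rather than by enumeration.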
