
Worst-case complexity


In computer science, the worst-case complexity of an algorithm is a measure of its performance. It is defined as the maximum of the algorithm's complexity measure over all possible input instances of the same size. The most commonly used complexity measure is running time, and the size of an instance usually refers to the number of bits needed to represent it. However, the complexity measure can also be the amount of memory used, the number of packets sent over a network, or any other performance measure. Likewise, the size of an input instance can be measured in different ways; for example, if the input of an algorithm is a graph, its size can be taken to be the number of vertices.
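The definition can be illustrated with a small brute-force experiment (a minimal Python sketch, not part of the article; the function names are illustrative). Taking linear search as the algorithm, the number of comparisons as the complexity measure, and the length of the sequence as the input size, the worst-case complexity at size n is the maximum comparison count over all 2^n binary inputs of length n:

    from itertools import product

    def linear_search(seq, target):
        # Count comparisons until target is found (or the scan ends).
        steps = 0
        for item in seq:
            steps += 1
            if item == target:
                break
        return steps

    def worst_case_comparisons(n):
        # Maximum of the complexity measure (comparisons) over all
        # 2**n binary input instances of size n.
        return max(linear_search(bits, 1) for bits in product((0, 1), repeat=n))

    for n in range(1, 6):
        print(n, worst_case_comparisons(n))

Running this prints n for every size: the all-zeros input, which forces a full scan, attains the maximum, so the worst-case complexity of linear search grows linearly with n.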
