Implicit data structure
In computer science, an implicit data structure stores very little information other than the main or required data. Such storage schemes retain no pointers; for example, a file of n records with k keys each is represented simply as an n-by-k array. The only structural information kept is whatever is needed to let the structure grow and shrink, such as a count of the elements. The structure is called "implicit" because the position and order of the elements carry the structural meaning. The term space efficient is used interchangeably. Definitions of "very little" are vague and can mean anywhere from O(1) to O(log n) extra space. Everything is accessed in place, by reading bits at various positions in the data; to achieve memory-optimal coding, data items may be packed into bits rather than bytes. Implicit data structures are frequently also succinct data structures.
Efficiency concerns
Implicit data structures are designed to improve main memory utilization and thereby reduce accesses to slower storage: because little space is spent on structural overhead, a greater fraction of the data fits in main memory. Implicit data structures can also improve cache efficiency, and thus running speed, especially if the layout used improves locality of reference.
Weighted element
For representing elements with different weights (for example, access probabilities), several implicit data structures have been proposed. One structure uses only a single memory location beyond those required for the element values, and supports a worst-case search time bounded in terms of the rank of the sought element's weight within the set of weights. If the elements are drawn from a uniform distribution, a variation of this structure achieves good search time on average. Similar results hold for data structures in which access probabilities are associated with the intervals between consecutive values.
Examples
A trivial example of an implicit data structure is an array, which is an implicit data structure for a list and requires only the constant overhead of its length, unlike a linked list, which stores a pointer with each data element. Similarly, a null-terminated string is an implicit data structure for a string (list of characters). These are considered very simple because they only admit the simple operation of iterating over the elements.
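A minimal Python sketch of these two cases, purely for illustration (the sentinel byte 0 below stands in for the terminating NUL of a C-style string, and the variable names are arbitrary):

```python
# A list carries only its length as structural overhead; iteration needs
# no pointers, only the positions of the elements themselves.
values = [3, 1, 4, 1, 5]
for v in values:
    print(v)

# A null-terminated string is implicit in the same sense: the end of the
# character sequence is marked by a sentinel byte instead of a stored length.
buffer = b"hello\x00trailing garbage"
i = 0
while buffer[i] != 0:  # scan until the terminating NUL
    print(chr(buffer[i]))
    i += 1
```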
A less trivial example is a sorted array, which allows search in logarithmic time by binary search. Contrast with a search tree, specifically a binary search tree, which also allows logarithmic-time search, but requires pointers.
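The following Python sketch shows the idea; the function name `binary_search` and the sample values are illustrative, not part of any standard interface. The only structure is the ordering of the elements themselves.

```python
def binary_search(sorted_values, target):
    """Return the index of target in sorted_values, or -1 if absent.

    No pointers are stored, yet search takes O(log n) comparisons
    because the sorted order itself encodes the search structure.
    """
    lo, hi = 0, len(sorted_values) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_values[mid] == target:
            return mid
        if sorted_values[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([2, 3, 5, 7, 11, 13], 7))  # -> 3
```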
An important example of an implicit data structure is the representation of a perfect binary tree as a list in increasing order of depth: the root, then the root's left child, the root's right child, the left child of the left child, and so on. Such a layout occurs notably in an ancestry chart to a given depth, and the implicit representation is known as an Ahnentafel (ancestor table).
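A brief sketch of the indexing arithmetic in Python (with 0-based list indices, so the formulas shift by one relative to the 1-based Ahnentafel numbering; the tree labels are illustrative):

```python
# A perfect binary tree stored as a list in breadth-first (Ahnentafel) order.
# With 1-based positions, node i has children at 2*i and 2*i + 1 and its
# parent at i // 2; no pointers are needed.
tree = ["root", "L", "R", "LL", "LR", "RL", "RR"]  # illustrative labels

def left_child(i):
    return 2 * i + 1

def right_child(i):
    return 2 * i + 2

def parent(i):
    return (i - 1) // 2

assert tree[left_child(0)] == "L"    # children of the root
assert tree[right_child(0)] == "R"
assert tree[parent(5)] == "R"        # "RL" is a child of "R"
```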
This can be generalized to a complete binary tree (where the last level may be incomplete), which yields the best-known example of an implicit data structure, namely the binary heap, an implicit data structure for a priority queue. It is more sophisticated than the earlier examples because it supports several operations, including ones that modify the data: not only top, but also insert and pop.
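A minimal Python sketch of a min-heap stored implicitly in a plain list follows; the helper names `heap_push` and `heap_pop` are illustrative (Python's standard heapq module provides the same operations).

```python
def heap_push(heap, item):
    """Insert item, restoring heap order by sifting it up toward the root."""
    heap.append(item)
    i = len(heap) - 1
    while i > 0 and heap[(i - 1) // 2] > heap[i]:
        heap[i], heap[(i - 1) // 2] = heap[(i - 1) // 2], heap[i]
        i = (i - 1) // 2

def heap_pop(heap):
    """Remove and return the smallest item, sifting the last element down."""
    top = heap[0]
    last = heap.pop()
    if heap:
        heap[0] = last
        i, n = 0, len(heap)
        while True:
            child = 2 * i + 1
            if child >= n:
                break
            if child + 1 < n and heap[child + 1] < heap[child]:
                child += 1          # pick the smaller of the two children
            if heap[i] <= heap[child]:
                break
            heap[i], heap[child] = heap[child], heap[i]
            i = child
    return top

h = []
for x in [5, 1, 4, 2]:
    heap_push(h, x)
print(heap_pop(h), heap_pop(h))  # -> 1 2
```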
More sophisticated implicit data structures include the beap (bi-parental heap).
History
The trivial examples of lists or tables of values date to prehistory, while historically non-trivial implicit data structures date at least to the Ahnentafel, which was introduced by Michaël Eytzinger in 1590 for use in genealogy. In formal computer science, the first implicit data structure is generally considered to be the sorted list, used for binary search, which was introduced by John Mauchly in 1946, in the Moore School Lectures, the first ever set of lectures regarding any computer-related topic.[1][2] The binary heap was introduced in (Williams 1964) to implement the heapsort.[2] The notion of an implicit data structure was formalized in (Munro & Suwanda 1980), as part of introducing and analyzing the beap.[2]
References
- ^ Knuth 1998, §6.2.1 ("Searching an ordered table"), subsection "History and bibliography".
- ^ Franceschini, Gianni; Munro, J. Ian (2006). "Implicit dictionaries with O(1) modifications per update and fast search". Seventeenth Annual ACM-SIAM Symposium on Discrete Algorithms. Miami, FL, United States. pp. 404–413. doi:10.1145/1109557.1109603.
- Munro, J. Ian; Suwanda, Hendra (October 1980). "Implicit data structures for fast search and update". Journal of Computer and System Sciences. 21 (2): 236–250. doi:10.1016/0022-0000(80)90037-9.
Further reading
- See the publications of Hervé Brönnimann, J. Ian Munro, and Greg Frederickson.