Distributed source coding

In the computer science field of data compression, distributed source coding (DSC) is the compression of multiple correlated information sources that do not communicate with each other.[1] Unlike other source coding technologies, DSC relies on channel codes. It has applications in sensor networks and video/multimedia compression (see distributed video coding[2]). One of the main properties of distributed source coding is that the computational burden in encoders is shifted to the joint decoder.

Theoretical bounds

The information-theoretic lossless compression bound on DSC (the Slepian–Wolf bound) was first proposed by David Slepian and Jack Keil Wolf in terms of the entropies of correlated information sources in 1973.[3] They also showed that two isolated sources can compress data as efficiently as if they were communicating with each other. This bound was extended to the case of more than two correlated sources by Thomas M. Cover in 1975.[4]
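
For two correlated sources X and Y encoded separately at rates R_X and R_Y, the Slepian–Wolf theorem states that lossless joint decoding is possible exactly when the rate pair lies in the following region (a brief statement of the standard result, written in LaTeX with H denoting entropy):

    \begin{aligned}
    R_X       &\ge H(X \mid Y), \\
    R_Y       &\ge H(Y \mid X), \\
    R_X + R_Y &\ge H(X, Y).
    \end{aligned}

In particular, the minimum sum rate H(X, Y) is the same as if the two encoders could observe each other's input.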

Similar results were obtained in 1976 by Aaron D. Wyner and Jacob Ziv with regard to lossy coding of jointly Gaussian sources.[5]
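
For jointly Gaussian sources under a mean-squared-error distortion D, the Wyner–Ziv rate-distortion function takes the standard form below (a sketch, with \sigma^2_{X \mid Y} denoting the conditional variance of X given the side information Y); notably, it equals the rate needed when the side information is also available at the encoder, so there is no rate loss in this case:

    R^{\mathrm{WZ}}_{X \mid Y}(D) = \tfrac{1}{2}\log_2^{+}\!\left(\frac{\sigma^2_{X \mid Y}}{D}\right),
    \qquad \log_2^{+}(t) = \max\{\log_2 t,\, 0\}.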

History

Syndrome decoding technology was first used in distributed source coding by the DISCUS system (Distributed Source Coding Using Syndromes) of S. S. Pradhan and K. Ramchandran.[6] In this scheme, binary block data from one source is compressed into syndromes, while data from the other source is transmitted uncompressed as side information. Because the two sources are compressed at different rates, this is an asymmetric DSC scheme. The asymmetric scheme can easily be extended to more than two correlated information sources. There are also DSC schemes that use parity bits rather than syndrome bits.
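
As a minimal illustration of the syndrome idea (a sketch only, not the actual DISCUS implementation): take the (7,4) Hamming code with parity-check matrix H and assume the two binary sources differ in at most one position per 7-bit block. The encoder for X sends only the 3-bit syndrome Hx, the whole block Y is available at the decoder, and the decoder picks the unique sequence in the coset labelled by the syndrome that is closest to Y, so 7 bits are conveyed with 3.

    import numpy as np

    # Parity-check matrix of the (7,4) Hamming code: column j is the binary
    # representation of j (j = 1..7), most significant bit in the first row.
    H = np.array([[0, 0, 0, 1, 1, 1, 1],
                  [0, 1, 1, 0, 0, 1, 1],
                  [1, 0, 1, 0, 1, 0, 1]])

    def encode(x):
        # Compress a 7-bit block x into its 3-bit syndrome H x (mod 2).
        return H @ x % 2

    def decode(s, y):
        # Recover x from its syndrome s and the side-information block y,
        # assuming x and y differ in at most one bit position.
        d = (s + H @ y) % 2   # syndrome of the difference pattern x XOR y
        x_hat = y.copy()
        if d.any():
            # A nonzero difference syndrome, read as a binary number,
            # is the 1-based index of the single differing position.
            pos = 4 * d[0] + 2 * d[1] + d[2]
            x_hat[pos - 1] ^= 1
        return x_hat

    # Example: x and y agree except in the fourth position.
    x = np.array([1, 0, 1, 1, 0, 0, 1])
    y = np.array([1, 0, 1, 0, 0, 0, 1])  # side information at the decoder
    s = encode(x)                        # only these 3 bits are sent for x
    assert np.array_equal(decode(s, y), x)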

The correlation between two sources in DSC has been modelled as a virtual channel, usually referred to as a binary symmetric channel.[7][8]
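
Concretely, if the sources are binary with X uniform and the virtual channel is Y = X ⊕ E, where E is a Bernoulli(p) error pattern independent of X, then the minimum asymmetric rate for compressing X with Y known at the decoder is the conditional entropy (a short sketch under these assumptions):

    H(X \mid Y) = H(E) = H_b(p) = -p \log_2 p - (1 - p) \log_2 (1 - p).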

Based on deterministic and probabilistic models of the correlation between two information sources, DSC schemes with more general compression rates have been developed.[9][10][11] In these non-asymmetric schemes, both correlated sources are compressed.
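
In terms of the Slepian–Wolf region, asymmetric schemes operate at the corner points (H(X), H(Y|X)) or (H(X|Y), H(Y)), whereas non-asymmetric schemes aim at general points on the boundary of minimum sum rate, written here in LaTeX:

    R_X + R_Y = H(X, Y), \qquad H(X \mid Y) \le R_X \le H(X), \qquad H(Y \mid X) \le R_Y \le H(Y).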

Under a certain deterministic assumption about the correlation between information sources, a DSC framework in which any number of information sources can be compressed in a distributed way has been demonstrated by X. Cao and M. Kuijper.[12] This method performs non-asymmetric compression with flexible rates for each source, achieving the same overall compression rate as repeatedly applying asymmetric DSC to more than two sources.

Slepian–Wolf bound

Virtual channel

Deterministic model

Probabilistic model

Asymmetric DSC

Non-asymmetric DSC

Non-asymmetric DSC for more than two sources

References