
Histogram matching

From Wikipedia, the free encyclopedia
An example of histogram matching

Histogram matching is a method in image processing for adjusting the colors of one image to match those of another, using the two images' histograms.

It is possible to use histogram matching to balance detector responses as a relative detector calibration technique. It can be used to normalize two images when they were acquired over the same location and under the same local illumination (such as shadows), but with different sensors, atmospheric conditions, or global illumination.

The algorithm

Given two images, the reference image and the target image, we compute their histograms. Next, we calculate the cumulative distribution functions of the two images' histograms: F1() for the reference image and F2() for the target image. Then, for each gray level G1 in [0, 255], we find the gray level G2 for which F1(G1) = F2(G2); this is the result of the histogram matching function, M(G1) = G2. Finally, we apply the function M() to each pixel of the reference image.
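As a concrete illustration, below is a minimal sketch of this procedure in Python with NumPy. The function name and the assumption of 8-bit grayscale inputs are illustrative choices, not part of the article.

```python
import numpy as np

def match_histograms(reference, target):
    """Remap the gray levels of `reference` so its histogram matches `target`.

    Both inputs are assumed to be 8-bit grayscale images (uint8 NumPy arrays);
    the function name and interface are illustrative.
    """
    # Histograms and cumulative distribution functions, normalized to [0, 1].
    hist_ref, _ = np.histogram(reference, bins=256, range=(0, 256))
    hist_tgt, _ = np.histogram(target, bins=256, range=(0, 256))
    cdf_ref = np.cumsum(hist_ref) / reference.size   # F1
    cdf_tgt = np.cumsum(hist_tgt) / target.size      # F2

    # For each gray level G1, find the smallest G2 with F2(G2) >= F1(G1),
    # i.e. the discrete approximation of F1(G1) = F2(G2), giving M(G1) = G2.
    mapping = np.searchsorted(cdf_tgt, cdf_ref)
    mapping = np.clip(mapping, 0, 255).astype(np.uint8)  # guard the upper edge

    # Apply M() to every pixel of the reference image via a lookup table.
    return mapping[reference]
```

Because an exact gray level with F1(G1) = F2(G2) rarely exists for discrete histograms, the sketch picks the smallest G2 whose cumulative value reaches F1(G1), which is the usual discrete approximation.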

Multiple histogram matching

The histogram matching algorithm can be extended to find a monotonic mapping between two sets of histograms. Given two sets of histograms P = {p_k} and Q = {q_k}, the optimal monotonic color mapping M is calculated to minimize the distance between the two sets simultaneously, namely min_M Σ_k d(M(p_k), q_k), where d(·,·) is a distance metric between two histograms. The optimal solution is calculated using dynamic programming.[1] A rough sketch of such a dynamic program is given below.
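The sketch below finds a monotonic mapping of source gray levels onto target gray levels for a set of histogram pairs. It assumes a simple bin-wise squared-difference distance so that the cost decomposes over target bins; this distance, the function name, and the particular dynamic programming formulation are illustrative assumptions and are not taken from the cited paper.

```python
import numpy as np

def monotonic_mapping(source_hists, target_hists, n_levels=256):
    """Find one monotonic gray-level mapping for a set of histogram pairs.

    source_hists, target_hists: arrays of shape (n_pairs, n_levels).
    The mapping minimizes, summed over all pairs, the bin-wise squared
    difference between the remapped source histogram and the target
    histogram (an illustrative choice of distance).
    """
    src = np.asarray(source_hists, dtype=float)
    tgt = np.asarray(target_hists, dtype=float)

    # Prefix sums: mass of any contiguous run of source bins in O(n_pairs).
    csum = np.concatenate(
        [np.zeros((src.shape[0], 1)), np.cumsum(src, axis=1)], axis=1)

    # dp[j, i]: minimal cost of assigning the first i source bins
    # to the first j target bins; choice[j, i] records the split point.
    dp = np.full((n_levels + 1, n_levels + 1), np.inf)
    dp[0, 0] = 0.0
    choice = np.zeros((n_levels + 1, n_levels + 1), dtype=int)

    for j in range(1, n_levels + 1):
        for i in range(0, n_levels + 1):
            # Target bin j-1 receives the contiguous source bins s..i-1
            # (possibly none); try every split point s at once.
            prev = dp[j - 1, : i + 1]
            mass = csum[:, i : i + 1] - csum[:, : i + 1]
            cost = prev + np.sum((mass - tgt[:, j - 1 : j]) ** 2, axis=0)
            s = int(np.argmin(cost))
            dp[j, i], choice[j, i] = cost[s], s

    # Backtrack to recover which source bins map to each target bin.
    mapping = np.zeros(n_levels, dtype=int)
    i = n_levels
    for j in range(n_levels, 0, -1):
        s = choice[j, i]
        mapping[s:i] = j - 1
        i = s
    return mapping
```

Stacking the histograms of several image pairs and calling this function yields a single 256-entry lookup table that can then be applied to all source images at once, which is the "simultaneous" aspect of the formulation above.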

References

  1. Shapira D., Avidan S., Hel-Or Y. (2013). "Multiple Histogram Matching". Proceedings of the IEEE International Conference on Image Processing.
