
Normalization (image processing)


In image processing, normalization is a process that changes the range of pixel intensity values. Applications include correcting photographs that have poor contrast due to glare. Normalization is sometimes called contrast stretching or histogram stretching. In more general fields of data processing, such as digital signal processing, it is referred to as dynamic range expansion.[1]

The purpose of dynamic range expansion in the various applications is usually to bring the image, or other type of signal, into a range that is more familiar or normal to the senses, hence the term normalization. Often, the motivation is to achieve consistency in dynamic range for a set of data, signals, or images to avoid mental distraction or fatigue. For example, a newspaper will strive to make all of the images in an issue share a similar range of grayscale.

Normalization transforms an n-dimensional grayscale image I with intensity values in the range (Min, Max) into a new image I_N with intensity values in the range (newMin, newMax).

The linear normalization of a grayscale digital image is performed according to the formula

    I_N = (I - Min) * (newMax - newMin) / (Max - Min) + newMin

For example, if the intensity range of the image is 50 to 180 and the desired range is 0 to 255, the process entails subtracting 50 from each pixel's intensity, making the range 0 to 130. Then each pixel intensity is multiplied by 255/130, making the range 0 to 255.
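A minimal sketch of this linear stretch, assuming the image is held in a NumPy array; the function name linear_normalize and the sample values are illustrative, not part of any library:

    import numpy as np

    def linear_normalize(image, new_min=0.0, new_max=255.0):
        """Linearly rescale pixel intensities to [new_min, new_max]."""
        image = image.astype(np.float64)
        old_min, old_max = image.min(), image.max()
        # Shift the range so it starts at zero, then scale it to the new width.
        return (image - old_min) * (new_max - new_min) / (old_max - old_min) + new_min

    # The example from the text: intensities 50..180 stretched to 0..255.
    img = np.array([[50, 115, 180]], dtype=np.uint8)
    print(linear_normalize(img))  # 50 -> 0, 115 -> 127.5, 180 -> 255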

Normalization might also be non-linear; this happens when there is not a linear relationship between I and I_N. An example of non-linear normalization is normalization that follows a sigmoid function. In that case, the normalized image is computed according to the formula

    I_N = (newMax - newMin) * (1 / (1 + e^(-(I - β) / α))) + newMin

where α defines the width of the input intensity range and β defines the intensity around which the range is centered.[2]
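A corresponding sketch of the sigmoid mapping, under the same NumPy assumption; alpha and beta stand for the width and centre parameters above, and the function name is illustrative:

    import numpy as np

    def sigmoid_normalize(image, alpha, beta, new_min=0.0, new_max=255.0):
        """Map intensities through a sigmoid of width alpha centred at beta."""
        image = image.astype(np.float64)
        # 1 / (1 + exp(-(I - beta) / alpha)) squeezes every intensity into (0, 1).
        squashed = 1.0 / (1.0 + np.exp(-(image - beta) / alpha))
        return (new_max - new_min) * squashed + new_min

    img = np.array([[50, 115, 180]], dtype=np.uint8)
    # Centre the curve at intensity 115 with a width of 30 grey levels.
    print(sigmoid_normalize(img, alpha=30, beta=115))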

Auto-normalization in image processing software typically normalizes to the full dynamic range of the number system specified in the image file format.

Examples

From a visual perspective, normalization is similar to moving and scaling an image. During normalization, an image is squeezed or stretched along each axis until it fits within the target range. For a 2-D image, scaling can happen along the X axis, the Y axis, or both, depending on newMin and newMax. Normalization can thus bring a pattern into a desired area at a normalized size.

In a line drawing, normalization might change the length and direction of a line with respect to the desired normalization range. The example shows that the original line has a range of x=[0,5], y=[1,6]. After normalization, the line falls into the normalized range of x=[0,1], y=[0,1].

line example

Another example is a triangle drawing whose original pattern has a range of x=[-10,10], y=[0,10]. After normalization, the triangle falls into the normalized range of x=[0,1], y=[0,1].

triangle pattern example
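The same per-axis rescaling can be written down for the coordinates of a drawing. A minimal sketch, again assuming NumPy, using the triangle's coordinate ranges from the example above:

    import numpy as np

    def normalize_points(points, new_min=0.0, new_max=1.0):
        """Rescale each coordinate axis independently to [new_min, new_max]."""
        points = points.astype(np.float64)
        mins = points.min(axis=0)  # per-axis minima (x_min, y_min)
        maxs = points.max(axis=0)  # per-axis maxima (x_max, y_max)
        return (points - mins) * (new_max - new_min) / (maxs - mins) + new_min

    # Triangle vertices spanning x = [-10, 10], y = [0, 10].
    triangle = np.array([[-10.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
    print(normalize_points(triangle))
    # x coordinates map to 0, 1, 0.5; y coordinates map to 0, 0, 1.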

Another common use of normalization is adjusting the grayscale of an actual photograph by fitting the pixel intensity values into a normalized range. The example shows an original photo with an intensity range of [0,255] normalized to an intensity range of [0,127]. After normalization, the photo has much lower brightness but keeps its other characteristics.

original photo
photo after normalization
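A sketch of how such a photo could be rescaled in practice, assuming the Pillow library is available; the file names are placeholders:

    import numpy as np
    from PIL import Image

    # Load the photo as an 8-bit grayscale array; "photo.jpg" is a placeholder name.
    original = np.asarray(Image.open("photo.jpg").convert("L"), dtype=np.float64)

    # Compress the intensity range [0, 255] of the original down to [0, 127].
    lo, hi = original.min(), original.max()
    normalized = (original - lo) * (127.0 - 0.0) / (hi - lo) + 0.0

    Image.fromarray(normalized.astype(np.uint8)).save("photo_normalized.jpg")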


References

  1. ^ Rafael C. González, Richard Eugene Woods (2007). Digital Image Processing. Prentice Hall. p. 85. ISBN 0-13-168728-X.
  2. ^ ITK Software Guide