Data normalization

Data normalization is the process of reducing data to its canonical form. For instance, database normalization is the process of organizing the fields and tables of a relational database to minimize redundancy and dependency. In the field of software security, a common vulnerability is unchecked malicious input; the mitigation for this problem is proper input validation. Before input validation can be performed, the input must be normalized, i.e., decoded (by resolving HTML entity encoding, for instance) and reduced to a single common character set.
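
A minimal sketch of such pre-validation normalization in Python, assuming UTF-8 as the common character set and HTML entity encoding as the layer to remove; the function name and the choice of the Unicode NFKC form are illustrative, not prescribed by any standard:

    import html
    import unicodedata

    def normalize_input(raw: bytes) -> str:
        """Reduce raw input to a canonical form before validation.

        Assumptions (illustrative): the input arrives as UTF-8 bytes
        and may carry HTML entity encoding; NFKC is the chosen
        canonical Unicode form.
        """
        # Reduce to a single common character set (here, UTF-8 text).
        text = raw.decode("utf-8", errors="strict")
        # Eliminate HTML encoding, e.g. "&lt;script&gt;" -> "<script>".
        text = html.unescape(text)
        # Fold Unicode variants into one canonical representation.
        return unicodedata.normalize("NFKC", text)

    # Validation then runs on the normalized form, so an encoded
    # variant of a payload cannot slip past the check:
    # normalize_input(b"&lt;script&gt;") == "<script>"
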

Other forms of data, typically associated with signal processing (including audio and imaging), can be normalized so that their values fall within a limited range defined by a norm, for example scaling audio samples to the interval [−1, 1].
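
As a sketch of one common case, peak normalization divides each sample by the signal's maximum absolute value (its infinity norm), so the result lies within [−1, 1]; the function name and target range here are illustrative:

    def peak_normalize(samples: list[float]) -> list[float]:
        """Scale samples so the largest absolute value becomes 1.0."""
        peak = max((abs(s) for s in samples), default=0.0)
        if peak == 0.0:
            # A silent (all-zero) signal is already within range.
            return list(samples)
        return [s / peak for s in samples]

    # Example: peak_normalize([0.2, -0.5, 0.25]) == [0.4, -1.0, 0.5]
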