
Data normalization


Data normalization is the process of reducing data to its canonical form. For instance, database normalization is the process of organizing the fields and tables of a relational database to minimize redundancy and dependency. In the field of software security, a common vulnerability is unchecked malicious input. The mitigation for this problem is proper input validation. Before input validation can be performed, the input must be normalized, i.e., any encodings (for instance HTML encoding) must be removed and the data reduced to a single, common character set.
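The passage above does not prescribe a particular implementation. The following Python sketch illustrates the idea under the assumption that input may arrive with percent-encoding or HTML entity encoding and in mixed Unicode forms; the helper names normalize_input and is_valid_username, the choice of NFKC normalization, and the pass limit are illustrative assumptions, not part of the original text.

    import html
    import unicodedata
    from urllib.parse import unquote

    def normalize_input(raw: str, max_passes: int = 5) -> str:
        """Reduce externally supplied text to a canonical form before it is
        validated. Decoding repeats until the value stops changing, so
        doubly encoded payloads are also reduced. (Hypothetical helper.)"""
        text = raw
        for _ in range(max_passes):
            # Strip percent-encoding and HTML entity encoding in one pass.
            decoded = html.unescape(unquote(text))
            if decoded == text:
                break
            text = decoded
        # Map canonically/compatibly equivalent Unicode sequences to one form,
        # so validation sees a single common character set.
        return unicodedata.normalize("NFKC", text)

    def is_valid_username(raw: str) -> bool:
        """Validate only after normalization, so encoded variants of
        disallowed characters cannot slip past the check. (Hypothetical.)"""
        candidate = normalize_input(raw)
        return candidate.isalnum() and 0 < len(candidate) <= 32

    if __name__ == "__main__":
        # '%3Cscript%3E' decodes to '<script>' and is rejected after normalization.
        print(is_valid_username("alice"))              # True
        print(is_valid_username("%3Cscript%3Ealice"))  # False

The ordering is the point of the sketch: the validity check runs on the normalized value, not on the raw input, so an attacker cannot hide a disallowed character behind one or more layers of encoding.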