
Standard normal deviate


A standard normal deviate is a normally distributed random variable with expected value 0 and variance 1.

Standard normal deviates arise in practical statistics in two ways.

  • Given a model for a set of observed data, manipulation of the data can yield a derived quantity which, assuming that the model is a true representation of reality, is a standard normal deviate (perhaps in an approximate sense). This enables a significance test of the validity of the model; a sketch of such a test appears after this list.
  • In the computer generation of a pseudorandom number sequence, the aim may be to generate random numbers having a normal distribution: these can be obtained from standard normal deviates (themselves the output of a pseudorandom number generator) by multiplying by the required scale parameter and then adding the location parameter. More generally, the generation of pseudorandom sequences having other marginal distributions may involve manipulating sequences of standard normal deviates: an example is the chi-square distribution, random values of which can be obtained by summing the squares of standard normal deviates (although this would seldom be the fastest method of generating such values); see the second sketch below.
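
For the first case, a minimal sketch in Python of one common such test (a z-test), assuming independent observations from a normal model with hypothesised mean mu0 and known standard deviation sigma; the function name z_statistic and the sample values are hypothetical, used only for illustration:

    import math
    import statistics

    def z_statistic(sample, mu0, sigma):
        """Standardise the sample mean under the model: mean = mu0, known sigma.

        If the model holds, Z = (xbar - mu0) / (sigma / sqrt(n)) is
        (approximately) a standard normal deviate, so an extreme value
        of Z casts doubt on the model.
        """
        n = len(sample)
        xbar = statistics.fmean(sample)
        return (xbar - mu0) / (sigma / math.sqrt(n))

    # Hypothetical data: are these readings consistent with a normal
    # model having mean 100 and known standard deviation 15?
    sample = [102.1, 98.4, 105.3, 99.7, 101.2, 97.9, 103.8, 100.5]
    z = z_statistic(sample, mu0=100.0, sigma=15.0)
    p_two_sided = math.erfc(abs(z) / math.sqrt(2))  # P(|Z| >= |z|) for Z ~ N(0, 1)
    print(f"z = {z:.3f}, two-sided p = {p_two_sided:.3f}")
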
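For the second case, a minimal sketch in Python, assuming the standard library's random.gauss as the source of standard normal deviates; the helper names normal_deviate and chi_square_deviate are hypothetical:

    import random

    rng = random.Random(42)  # seeded pseudorandom source, for reproducibility

    def normal_deviate(mu, sigma):
        """General normal variate from a standard normal deviate Z:
        multiply by the scale parameter, then add the location parameter."""
        z = rng.gauss(0.0, 1.0)  # standard normal deviate
        return mu + sigma * z

    def chi_square_deviate(k):
        """Chi-square variate with k degrees of freedom, formed as the sum
        of the squares of k independent standard normal deviates
        (illustrative; seldom the fastest method in practice)."""
        return sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(k))

    print(normal_deviate(10.0, 2.0))   # N(10, 2^2) variate
    print(chi_square_deviate(5))       # chi-square variate, 5 degrees of freedom
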