normal distribution

Informally, a variable is said to be normally distributed when there is a central value about which randomly selected values tend to cluster symmetrically, with the frequency dropping off to either side in the familiar “bell curve” pattern. This distribution occurs in many phenomena and may be expected any time the deviations of a variable from its mean are due solely to natural variability.

Formally, the normal distribution, also called the Gaussian distribution in probability theory, is the curve that is completely determined by its mean \(\mu\) and standard deviation \(\sigma\) and is given by:

\[f(x)=\displaystyle\frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{(x-\mu)^2}{2\sigma^2}}\]

The normal distribution with mean \(\mu\) and standard deviation \(\sigma\) is typically denoted by \(N(\mu,\sigma)\).
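
As a concrete illustration, the density above is easy to compute directly. The sketch below is a minimal plain-Python version (no external libraries assumed); the function name and the example parameters are illustrative choices, not anything fixed by the definition.

import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # Density f(x) of N(mu, sigma) evaluated at x.
    coeff = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    exponent = -((x - mu) ** 2) / (2.0 * sigma ** 2)
    return coeff * math.exp(exponent)

# The curve peaks at the mean:
print(normal_pdf(0.0))                           # ~0.3989 for the standard normal
print(normal_pdf(100.0, mu=100.0, sigma=15.0))   # ~0.0266 for N(100, 15)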

The normal curve is an example of a density curve, and because it is symmetric and single-peaked, its mean, median, and mode all coincide.

Figure 1: The Normal Distribution

Notably, the points one standard deviation to either side of the mean are exactly where the curvature changes from concave down to concave up (the curve's points of inflection).
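
One way to verify this is to differentiate the density twice; since the exponential factor is always positive, the second derivative vanishes and changes sign exactly where the remaining factor does:

\[f''(x)=\displaystyle\frac{1}{\sigma^{3}\sqrt{2\pi}}\,e^{-\frac{(x-\mu)^2}{2\sigma^2}}\left(\frac{(x-\mu)^2}{\sigma^2}-1\right)=0\quad\Longleftrightarrow\quad x=\mu\pm\sigma\]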

Like all density curves, the normal curve encloses an area of 1 against the horizontal axis, and the area under the curve between any two values on the axis represents the proportion of randomly selected values of the variable that can be expected to fall between those two values. The shape of the bell curve (taller and thinner or shorter and broader) and where on the axis its center falls are determined by the standard deviation \(\sigma\) and the mean \(\mu\), respectively.

Figure 2: Normal Curves
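
To make the “area equals proportion” idea concrete, the sketch below approximates those areas with a simple midpoint Riemann sum, assuming only plain Python; the chosen distribution N(100, 15) and the interval from 85 to 130 are arbitrary illustrations.

import math

def normal_pdf(x, mu, sigma):
    # Density of N(mu, sigma) at x.
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def area_between(a, b, mu, sigma, steps=100_000):
    # Midpoint-rule approximation of the area under N(mu, sigma) between a and b.
    width = (b - a) / steps
    return sum(normal_pdf(a + (i + 0.5) * width, mu, sigma) for i in range(steps)) * width

mu, sigma = 100, 15
# Total area, integrating far enough out to capture essentially all of it:
print(area_between(mu - 10 * sigma, mu + 10 * sigma, mu, sigma))   # ~1.0
# Expected proportion of values between 85 and 130:
print(area_between(85, 130, mu, sigma))                            # ~0.82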

The 68–95–99.7 Rule

For any normal distribution, the proportion of area under the curve within one standard deviation of the mean is approximately 68%; within two standard deviations, approximately 95%; and within three standard deviations, approximately 99.7%. This rule is a useful heuristic for estimating the frequency of scores falling within a given distance of the mean for any normally distributed variable.

Figure 3: The “68–95–99.7 Rule.”
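
A quick simulation makes the rule tangible. The sketch below, using only Python's standard-library random.gauss, draws values from an arbitrarily chosen N(50, 8) and counts how many land within one, two, and three standard deviations of the mean; the parameters and sample size are illustrative.

import random

random.seed(0)
mu, sigma, n = 50, 8, 200_000

draws = [random.gauss(mu, sigma) for _ in range(n)]

for k in (1, 2, 3):
    within = sum(1 for x in draws if abs(x - mu) <= k * sigma)
    print(f"within {k} SD: {within / n:.2%}")

# Output should be close to 68%, 95%, and 99.7%.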

The Standard Normal

Because the area under the curve within a given number of standard deviations of the mean is the same regardless of the actual values of the mean and standard deviation, the proportion of values that a normal distribution may be expected to produce within that distance likewise does not depend on them. Consequently, it is useful to work with the standard normal distribution, the one with mean 0 and standard deviation 1, in which the value of the variable is exactly its distance from the mean as measured in standard deviations.

Figure 4: The Standard Normal.

Typically the variable for the standard normal distribution is denoted by \(z\), and any normal distribution may be standardized by replacing each value of the variable with its distance from the mean in standard deviations (what is called its “\(z\)-score”). That is, given a normal distribution \(N(\mu,\sigma)\) with mean \(\mu\) and standard deviation \(\sigma\), we convert each value \(x\) to its corresponding \(z\)-score by the formula:

\[z=\displaystyle\frac{x-\mu}{\sigma}\]
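
In code, standardization is a one-line transformation. The short sketch below is a plain-Python illustration; the exam scores and the parameters of N(72, 9) are hypothetical numbers chosen so the z-scores come out evenly.

def z_score(x, mu, sigma):
    # Distance of x from the mean, measured in standard deviations.
    return (x - mu) / sigma

# Hypothetical example: exam scores assumed to follow roughly N(72, 9).
scores = [63, 72, 81, 90]
print([z_score(x, 72, 9) for x in scores])   # [-1.0, 0.0, 1.0, 2.0]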

Finding a data point's \(z\)-score allows the statistician to infer the approximate percentile of the data point, i.e., the proportion of data values in the population that are smaller.

The Cumulative Distribution Function

The normal cumulative distribution function gives the area under the curve to the left of any specified value of the normally distributed variable. Prior to the availability of scientific calculators, this function was encoded in a table of values corresponding to standard scores (\(z\)-scores), and a statistician would find the value of the cumulative distribution function for a given value of the normally distributed variable by first converting that value to its standard score and then looking up the corresponding cumulative proportion in the table.
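
The table lookup described above is easy to reproduce in code. The sketch below, in plain Python, relies on the standard identity \(\Phi(z)=\frac{1}{2}\left(1+\operatorname{erf}\left(z/\sqrt{2}\right)\right)\) available through math.erf; the value 130 from N(100, 15) is an illustrative example, not anything from the text.

import math

def standard_normal_cdf(z):
    # Area under the standard normal curve to the left of z.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def normal_cdf(x, mu, sigma):
    # Area to the left of x under N(mu, sigma): standardize, then evaluate.
    return standard_normal_cdf((x - mu) / sigma)

# Familiar reference points:
print(standard_normal_cdf(0.0))     # 0.5
print(standard_normal_cdf(1.96))    # ~0.975
print(normal_cdf(130, 100, 15))     # ~0.977, i.e. two standard deviations above the mean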
