Quant #28: Standard Deviation and Variance

Standard deviation is a statistical measure of how far the numbers in a data set spread out from their mean. If the data points lie further from the mean, the deviation is higher; if they cluster closer to the mean, the deviation is lower. Standard deviation is computed as the square root of the variance. But what is variance?
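In conventional notation, with \(\sigma\) denoting the standard deviation and \(\operatorname{Var}(X)\) the variance of a data set \(X\), this relationship is simply:

\[
\sigma = \sqrt{\operatorname{Var}(X)}
\]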

Variance is the average of the squared differences from the mean. Like standard deviation, variance is a measure of variability: it reflects the degree of spread in the data set, and a larger variance implies more spread. One reason variance is important is its role in parametric statistical tests, many of which require equal or similar variances across groups (referred to as homogeneity of variance, or homoscedasticity). Unequal variances between samples can bias the test results and, consequently, call for non-parametric statistical tests instead.
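As a concrete sketch of these definitions, the following Python snippet (standard library only, with made-up example data) computes the population variance as the mean of the squared deviations and the standard deviation as its square root, then cross-checks the results against the statistics module:

```python
import math
import statistics

# Made-up example data for illustration
data = [2, 4, 4, 4, 5, 5, 7, 9]

mean = sum(data) / len(data)

# Variance: average of the squared differences from the mean (population form)
variance = sum((x - mean) ** 2 for x in data) / len(data)

# Standard deviation: square root of the variance
std_dev = math.sqrt(variance)

print(mean, variance, std_dev)      # 5.0 4.0 2.0

# Cross-check against the standard library's population versions
print(statistics.pvariance(data))   # 4.0
print(statistics.pstdev(data))      # 2.0
```

In practice, the homogeneity-of-variance assumption mentioned above is often checked with a formal test such as Levene's test (available, for example, as scipy.stats.levene) before deciding between a parametric and a non-parametric procedure.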