Standardization Is Not Normalization


Standardization and normalization are often confused or thought to be the same. They are not. Both are methods for transforming data into a given format, but their approach and purpose differ.

Normalization (also called min-max scaling) rescales values into the range 0 to 1. That's all. Normalization is useful when the values of the variables are not directly comparable because they sit on different scales. Normalization ensures that all values are within the same range.
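A minimal sketch of min-max normalization with NumPy, using made-up example values:

```python
import numpy as np

# Hypothetical feature values on an arbitrary scale.
x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])

# Min-max normalization: shift by the minimum, divide by the range,
# so every value lands in [0, 1].
x_norm = (x - x.min()) / (x.max() - x.min())
print(x_norm)  # [0.   0.25 0.5  0.75 1.  ]
```

The smallest value always maps to 0 and the largest to 1, regardless of the original units.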

Standardization, on the other hand, rescales your data so that it has a mean of 0 and a standard deviation of 1, the defining properties of the standard normal distribution. As this method uses Z-scores, it is also known as Z-score normalization. Standardization is useful when the range of values is large and the variables have different units of measurement. Standardization allows for easier comparison between variables.
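The same idea as a sketch: Z-score standardization subtracts the mean and divides by the standard deviation (example values are made up):

```python
import numpy as np

# Hypothetical feature values.
x = np.array([10.0, 20.0, 30.0, 40.0, 50.0])

# Z-score standardization: center on the mean, scale by the
# standard deviation, so the result has mean 0 and std 1.
x_std = (x - x.mean()) / x.std()
print(x_std)         # [-1.414... -0.707...  0.  0.707...  1.414...]
print(x_std.mean())  # ~0.0
print(x_std.std())   # 1.0
```

Unlike min-max normalization, the standardized values are not bounded to a fixed range; outliers simply end up with large Z-scores.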

If you want to dig deeper, below are a couple of resources:

Resource 1: Normalization vs Standardization - Quantitative Analysis

Resource 2: About Feature Scaling and Normalization