Feature Scaling: Normalize and Standardize

shekhar pandey
1 min read · May 3, 2020

If our dataset has features measured on different scales, their magnitudes can vary widely in range. In that case we need to apply a feature scaling technique so that the magnitudes of all features are on the same scale.

Normalize:
Normalization, also called Min-Max Scaling, computes the new value of a feature as follows:
x' = (x - min(x)) / (max(x) - min(x))
[where x is the original value of the feature and x' is the new value after normalization]

When x = min(x), the numerator becomes (min(x) - min(x)) = 0, hence x' = 0.
When x = max(x), the numerator becomes (max(x) - min(x)), i.e. the numerator and denominator are equal, hence x' = 1.

So, in the normalization process we are, in a way, boxing the values of a feature into the range [0, 1], as the sketch below shows.
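Here is a minimal NumPy sketch of min-max scaling; the function name min_max_normalize and the sample ages array are just illustrative, not from the original article:

```python
import numpy as np

def min_max_normalize(x):
    """Rescale a 1-D feature array into the [0, 1] range using (x - min) / (max - min)."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

ages = np.array([18, 25, 40, 60, 90])
print(min_max_normalize(ages))  # smallest value maps to 0.0, largest to 1.0
```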

Standardize:
In standardization we compute the new value of a feature as follows:
x' = (x - mean(x)) / std(x)
where:
mean(x) is the mean of x
std(x) is the standard deviation of x
x is the original value and x' is the standardized value

The properties of standardization are:
i. the mean of the standardized feature is 0
ii. the standard deviation of the standardized feature is 1
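A small NumPy sketch of standardization, again with an illustrative standardize helper and sample data; the final print shows the two properties above (up to floating-point error):

```python
import numpy as np

def standardize(x):
    """Shift a 1-D feature to mean 0 and scale it to standard deviation 1."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

ages = np.array([18, 25, 40, 60, 90])
z = standardize(ages)
print(z.mean(), z.std())  # approximately 0.0 and 1.0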

Implementation:
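In practice both techniques are usually applied with scikit-learn's MinMaxScaler and StandardScaler. The snippet below is a sketch of that, assuming a small toy matrix X with two features on very different scales:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler, StandardScaler

X = np.array([[18.0, 30000.0],
              [25.0, 48000.0],
              [40.0, 54000.0],
              [60.0, 90000.0]])  # two features on very different scales

X_norm = MinMaxScaler().fit_transform(X)     # each column mapped to [0, 1]
X_std = StandardScaler().fit_transform(X)    # each column to mean 0, std 1

print(X_norm)
print(X_std.mean(axis=0), X_std.std(axis=0))  # approximately [0, 0] and [1, 1]
```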
