I found this online: "StandardScaler or Z-Score Normalization is one of the feature scaling techniques, here the transformation of features is done by subtracting from the mean and dividing by standard deviation. This is often called Z-score normalization. The resulting data will have the mean as 0 and the standard deviation as 1."
To sum it up: why is it called Z-Score Normalization if it uses a Standardization technique?
What I'm thinking is: if it is called Z-Score Normalization, shouldn't it use a Normalization technique rather than a Standardization one?
According to sklearn StandardScaler documentation:
StandardScaler standardizes features by removing the mean and scaling to unit variance. The standard score of a sample x is calculated as:

z = (x - u) / s

This is exactly the formula for a z-score. So StandardScaler (standardization) and Z-Score Normalization use the same formula and are equivalent; the "normalization" in the name is just looser terminology for what is technically standardization.
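You can verify the equivalence numerically. This sketch (using a small made-up array) compares sklearn's StandardScaler against a hand-rolled z-score; note that sklearn uses the population standard deviation (ddof=0), so the manual version must too:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])

# sklearn's StandardScaler
scaled = StandardScaler().fit_transform(X)

# manual z-score: (x - mean) / std, with population std (ddof=0) to match sklearn
z = (X - X.mean(axis=0)) / X.std(axis=0)

print(np.allclose(scaled, z))  # True: same formula, same result
print(scaled.mean(), scaled.std())  # ~0.0 and 1.0, as the quoted text says
```

Both arrays come out identical, and the result has mean 0 and standard deviation 1, confirming the two names describe one technique.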