What is the difference between normalization and standardization? Why do we use min-max scaling and standard scaling?
Please explain what actually happens during scaling.
1 Answer
Normalization and standardization are both ways of preprocessing data so that features end up on comparable scales, which improves the performance of many machine learning algorithms.

Normalization (min-max scaling) rescales a feature to a fixed range, usually [0, 1], using x' = (x - min) / (max - min). Use it when bounded values or the feature's original range matter, for example as input to neural networks or distance-based methods. It is sensitive to outliers, because a single extreme value stretches the whole range.

Standardization (standard scaling) rescales a feature to zero mean and unit variance using z = (x - mean) / std. The result is not bounded to a fixed range, it is less affected by outliers, and it suits algorithms that assume centered, roughly Gaussian inputs, such as linear models, SVMs, and PCA.
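Here is a minimal sketch of both techniques using scikit-learn's MinMaxScaler and StandardScaler; the sample data is made up purely for illustration:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler, StandardScaler

# Illustrative data: one feature with a wide range (scikit-learn expects 2-D input)
X = np.array([[1.0], [5.0], [10.0], [50.0], [100.0]])

# Min-max scaling: maps each value into [0, 1] via (x - min) / (max - min)
minmax = MinMaxScaler()  # feature_range=(0, 1) is the default
X_minmax = minmax.fit_transform(X)

# Standard scaling: zero mean, unit variance via (x - mean) / std
standard = StandardScaler()
X_standard = standard.fit_transform(X)

print("min-max: ", X_minmax.ravel())   # all values lie in [0, 1]
print("standard:", X_standard.ravel())  # mean ~0, std ~1
```

Running this shows the key difference: the min-max output is confined to [0, 1], while the standardized output is centered on 0 and can take values outside any fixed range.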