standardization vs normalization
Tansu Dasli edited this page Sep 18, 2023 · 12 revisions
Both are feature-scaling techniques applied as preprocessing steps.
- standardization does not change the shape of the distribution!
  - column-based (applied per feature)
  - standard scaler: z = (x − μ) / σ, giving μ = 0, σ = 1
  - min-max scaler: (x − min) / (max − min), giving min = 0, max = 1
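A minimal plain-Python sketch of the two column-based scalers above (sklearn's `StandardScaler` and `MinMaxScaler` do the equivalent, column by column):

```python
from statistics import mean, pstdev

def standardize(col):
    # z = (x - mu) / sigma  ->  resulting column has mean 0, std 1
    mu, sigma = mean(col), pstdev(col)
    return [(x - mu) / sigma for x in col]

def min_max(col):
    # (x - min) / (max - min)  ->  resulting column lies in [0, 1];
    # the shape of the distribution is unchanged, only the scale
    lo, hi = min(col), max(col)
    return [(x - lo) / (hi - lo) for x in col]

ages = [20, 30, 40, 50]
print(standardize(ages))  # symmetric around 0
print(min_max(ages))      # [0.0, 0.333..., 0.666..., 1.0]
```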
- normalization changes the distribution, and comes in two types: L1 and L2
  - row-based (applied per sample)
  - L1: least absolute deviations; each row's sum of absolute values = 1
  - L2: least squares; each row's Euclidean norm (root of the sum of squares) = 1
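The row-based L1/L2 normalization above can be sketched in plain Python (sklearn's `Normalizer(norm="l1" | "l2")` behaves analogously):

```python
def l1_normalize(row):
    # divide by the sum of absolute values -> abs values sum to 1
    s = sum(abs(x) for x in row)
    return [x / s for x in row]

def l2_normalize(row):
    # divide by the Euclidean norm -> squares sum to 1 (unit vector)
    s = sum(x * x for x in row) ** 0.5
    return [x / s for x in row]

row = [3.0, 4.0]
print(l1_normalize(row))  # [0.428..., 0.571...], sums to 1
print(l2_normalize(row))  # [0.6, 0.8], sum of squares is 1
```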
scaling: when is it needed, and which scaler?

| computation | scaling | models |
| --- | --- | --- |
| gradient-based | mandatory | linear r., logistic r., neural n., deep l. |
| distance-based | mandatory | k-means, k-NN, SVM |
| split-based | not needed | tree, ensemble models |

| models | preferred scaler | reason |
| --- | --- | --- |
| neural networks, deep l. | normalization (min-max to [0, 1]) | model expects inputs in a bounded range! |
| linear r., logistic r., SVM | standardization | |
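A small illustration of why the distance-based row in the table is "mandatory": without scaling, the feature with the larger range dominates the Euclidean distance. The feature ranges below are made-up numbers for the sketch.

```python
# distance used by k-means / k-NN
def euclid(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# two samples: [age in years, income in dollars] (hypothetical data)
a, b = [25.0, 90000.0], [55.0, 91000.0]
print(euclid(a, b))  # ~1000.45 -> almost entirely the income gap

# after min-max scaling with assumed ranges age [20, 60], income [80000, 100000]
a_s = [(25 - 20) / 40, (90000 - 80000) / 20000]
b_s = [(55 - 20) / 40, (91000 - 80000) / 20000]
print(euclid(a_s, b_s))  # ~0.75 -> the 30-year age gap now matters
```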