- Regularization discourages overly complex models by adding a penalty term to the loss function
- Lasso and Ridge are two commonly used regularization techniques
- In Ridge regression, the cost function is changed by adding a penalty term proportional to the sum of the squared coefficients
- Ridge regression is often also referred to as L2 Norm Regularization
- Lasso regression is very similar to Ridge regression, except that the penalty term uses the absolute values of the coefficients rather than their squares
- Lasso regression is often also referred to as L1 Norm Regularization
- AIC and BIC are two measures of overall model performance that take into account the number of features in the model
- The lower the AIC and/or BIC, the better the model
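The contrast between the Ridge (L2) and Lasso (L1) penalties can be sketched with scikit-learn. This is a minimal illustration on synthetic data; the `alpha` values are arbitrary choices for demonstration, not recommendations.

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

# Synthetic data: only the first two of five features actually matter
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 3 * X[:, 0] + 2 * X[:, 1] + rng.normal(scale=0.5, size=100)

# Ridge (L2 penalty) shrinks all coefficients toward zero
ridge = Ridge(alpha=10.0).fit(X, y)

# Lasso (L1 penalty) can drive irrelevant coefficients exactly to zero
lasso = Lasso(alpha=0.5).fit(X, y)

print("Ridge coefficients:", ridge.coef_)
print("Lasso coefficients:", lasso.coef_)
```

Note the qualitative difference: Ridge leaves every coefficient small but nonzero, while Lasso performs implicit feature selection by zeroing out the coefficients of the three noise features.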
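The AIC/BIC comparison can be made concrete with a small sketch. Assuming Gaussian errors, a common formulation is AIC = n·ln(RSS/n) + 2k and BIC = n·ln(RSS/n) + k·ln(n), where k counts the fitted parameters (including the intercept); the helper functions and data below are illustrative, not part of any particular library.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def aic(y_true, y_pred, k):
    # AIC under Gaussian errors: n * ln(RSS / n) + 2k
    n = len(y_true)
    rss = np.sum((y_true - y_pred) ** 2)
    return n * np.log(rss / n) + 2 * k

def bic(y_true, y_pred, k):
    # BIC penalizes extra parameters more heavily: n * ln(RSS / n) + k * ln(n)
    n = len(y_true)
    rss = np.sum((y_true - y_pred) ** 2)
    return n * np.log(rss / n) + k * np.log(n)

# Synthetic data where only the first of six features matters
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))
y = 4 * X[:, 0] + rng.normal(size=200)

small = LinearRegression().fit(X[:, :1], y)   # 1 feature + intercept -> k = 2
big = LinearRegression().fit(X, y)            # 6 features + intercept -> k = 7

aic_small = aic(y, small.predict(X[:, :1]), k=2)
bic_small = bic(y, small.predict(X[:, :1]), k=2)
aic_big = aic(y, big.predict(X), k=7)
bic_big = bic(y, big.predict(X), k=7)
```

The larger model always achieves a slightly lower RSS, but the penalty terms charge it for the five useless features, so the simpler model tends to score lower (better) on these criteria.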