Overview#
Regularization in Machine Learning#
Regularization in machine learning penalizes weights when they grow too large. For regularized Logistic Regression, you minimize the cost function plus an added term scaled by lambda (λ), called the regularization parameter, which is usually tuned on the development set.
The L2-regularized cost function J adds the regularization term to the original cost:

J(w, b) = (1/m) Σᵢ L(ŷ⁽ⁱ⁾, y⁽ⁱ⁾) + (λ/2m) ‖w‖₂²

where ‖w‖₂² = Σⱼ wⱼ² is the squared Euclidean norm of the parameter vector w. This is called L2 regularization and is the most common form.
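A minimal sketch of the regularized cost above, assuming a plain-Python logistic regression with lists for the weights and inputs (the function and variable names here are illustrative, not from the source). Note the parameter is named `lambd` because `lambda` is a reserved word in Python:

```python
import math

def l2_regularized_cost(w, b, X, Y, lambd):
    """Cross-entropy cost for logistic regression plus an L2 penalty.

    'lambd' stands in for λ, since 'lambda' is a Python keyword.
    """
    m = len(X)
    cross_entropy = 0.0
    for x, y in zip(X, Y):
        z = sum(wj * xj for wj, xj in zip(w, x)) + b
        y_hat = 1.0 / (1.0 + math.exp(-z))  # sigmoid activation
        cross_entropy += -(y * math.log(y_hat) + (1 - y) * math.log(1 - y_hat))
    # (λ / 2m) · ‖w‖²  — note the bias b is not included in the penalty
    l2_penalty = (lambd / (2 * m)) * sum(wj ** 2 for wj in w)
    return cross_entropy / m + l2_penalty
```

With λ = 0 this reduces to the unregularized cost; increasing λ raises the cost for any nonzero weights, which is what pushes training toward smaller weights.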
Popular examples of regularization methods are:
- LASSO Regression (L1 regularization, for linear models)
- Ridge Regression (L2 regularization, for linear models)
- Dropout Regularization (for neural networks)
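The difference between the LASSO and Ridge penalties can be sketched as two small helper functions (names are illustrative, not from the source):

```python
def l1_penalty(w, lambd):
    # LASSO: λ · Σ|w_j| — tends to drive some weights exactly to zero (sparsity)
    return lambd * sum(abs(wj) for wj in w)

def l2_penalty(w, lambd):
    # Ridge: λ · Σ w_j² — shrinks all weights smoothly toward zero
    return lambd * sum(wj ** 2 for wj in w)
```

For example, with w = [3.0, -4.0] and λ = 0.1, the L1 penalty is 0.1 · (3 + 4) = 0.7 while the L2 penalty is 0.1 · (9 + 16) = 2.5; L2 punishes large individual weights much more heavily.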
Misc Notes#
The bias term b is generally not regularized. In Python, lambda is a reserved word, so the regularization parameter is often named lambd instead.
More Information#
There might be more information for this subject on one of the following:

- Dropout Regularization
- Hyperparameters
- Least Absolute Shrinkage and Selection Operator
- Machine Learning Algorithms
- Overfitting
- Ridge Regression
- [#1] - Frobenius Norm
- based on information obtained 2018-01-03