Overview#
Regularization in Machine Learning penalizes the weights when they are too large. For Regularization with Logistic Regression, you try to minimize the Cost function with an added penalty term scaled by lambda (λ), called the regularization parameter, which is usually determined from the development set.
The L2 Regularization term added to the Cost function J is just the squared Euclidean norm of the weight vector w, which is why it is called L² Regularization; it is the most common form (see the formula below).
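As a reference, a standard form of the L2-regularized Logistic Regression cost, written in the common notation with m training examples and n features (an assumption about the notation this page follows):

```latex
J(w, b) = \frac{1}{m} \sum_{i=1}^{m} \mathcal{L}\!\left(\hat{y}^{(i)}, y^{(i)}\right)
          + \underbrace{\frac{\lambda}{2m} \lVert w \rVert_2^2}_{\text{L2 regularization term}},
\qquad
\lVert w \rVert_2^2 = \sum_{j=1}^{n} w_j^2 = w^\top w
```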
Two popular examples of Regularization methods for Linear Regression are Ridge Regression (L2 penalty) and Lasso Regression (L1 penalty). These methods are effective when there is collinearity in your input values and Ordinary Least Squares would overfit the Training dataset (see the sketch below).
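A minimal sketch (not from the original page) of fitting both methods with scikit-learn; note that scikit-learn names the regularization strength `alpha` rather than lambda:

```python
# Hypothetical example: L2 (Ridge) and L1 (Lasso) regularized linear regression
# on deliberately collinear inputs, where plain Ordinary Least Squares overfits.
import numpy as np
from sklearn.linear_model import Ridge, Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=100)   # two nearly collinear columns
y = 3 * X[:, 0] - 2 * X[:, 2] + rng.normal(size=100)

ridge = Ridge(alpha=1.0).fit(X, y)   # shrinks all coefficients toward zero
lasso = Lasso(alpha=0.1).fit(X, y)   # can drive some coefficients exactly to zero

print("Ridge coefficients:", ridge.coef_)
print("Lasso coefficients:", lasso.coef_)
```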
Misc Notes#
The bias is generally not regularized.
In Python, lambda is a reserved word, so lambd is often used for the regularization parameter instead.
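A minimal NumPy sketch of the L2-regularized Logistic Regression cost, assuming a column-wise data layout (X of shape (n_features, m), Y of shape (1, m)); the function and variable names are illustrative, not from the original page:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def regularized_cost(w, b, X, Y, lambd):
    """Cross-entropy cost plus an L2 penalty on the weights.

    `lambd` is used in place of `lambda`, which is a Python reserved word.
    The bias b is deliberately left out of the penalty (it is not regularized).
    """
    m = X.shape[1]
    A = sigmoid(np.dot(w.T, X) + b)                              # predictions, shape (1, m)
    cross_entropy = -np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A)) / m
    l2_penalty = (lambd / (2 * m)) * np.sum(np.square(w))        # only w is penalized
    return cross_entropy + l2_penalty
```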
There might be more information on this subject on one of the following: