Ridge Penalty
Ridge penalty, or L2 regularization, adds a penalty proportional to the sum of the squared coefficients to the loss function. Unlike Lasso (L1), it does not shrink coefficients exactly to zero; instead it keeps all coefficients small, spreading the impact across the variables.
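As a minimal sketch, ridge regression minimizes the squared error plus alpha times the squared coefficient norm, and has a closed-form solution. The function name `ridge_fit` and the regularization strength `alpha` below are illustrative choices, not from the original text:

```python
import numpy as np

def ridge_fit(X, y, alpha=1.0):
    """Closed-form ridge solution: minimizes ||y - Xw||^2 + alpha * ||w||^2.

    Solves (X^T X + alpha * I) w = X^T y; the added alpha * I term keeps
    the linear system well-conditioned even when columns of X are correlated.
    """
    n_features = X.shape[1]
    A = X.T @ X + alpha * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

# Larger alpha shrinks coefficients toward zero, but never exactly to zero.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=100)

w_small = ridge_fit(X, y, alpha=0.01)   # close to the true coefficients
w_large = ridge_fit(X, y, alpha=100.0)  # visibly shrunk, still all nonzero
```

Note how increasing `alpha` trades a little bias for a large reduction in variance: the coefficient vector shrinks in norm while every entry stays nonzero.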
This is particularly useful when many features are highly correlated with one another, a common situation in market data. In options pricing, where the Greeks often move together, the Ridge penalty helps maintain model stability by preventing any single coefficient from becoming excessively large.
It effectively reduces the variance of the model, making it less sensitive to small perturbations in the training data. The result is a more robust model that is less prone to overfitting, even though it does not perform feature selection the way Lasso does.
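The stability claim can be sketched with two nearly collinear features, loosely standing in for two Greeks that move almost in lockstep (the data here is synthetic and the `alpha` value is an illustrative assumption). OLS may assign large offsetting weights to the duplicated signal, while ridge splits a small, stable weight across both columns:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
base = rng.normal(size=n)
# Two nearly collinear features: the second is the first plus tiny noise.
X = np.column_stack([base, base + rng.normal(scale=0.01, size=n)])
y = 3.0 * base + rng.normal(scale=0.5, size=n)

# Ordinary least squares fit via lstsq.
w_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Ridge fit via the normal equations: (X^T X + alpha * I) w = X^T y.
alpha = 1.0
w_ridge = np.linalg.solve(X.T @ X + alpha * np.eye(2), X.T @ y)

# Ridge divides the shared signal roughly evenly between the two
# correlated columns, and the coefficients sum to about the true slope.
print("OLS:  ", w_ols)
print("Ridge:", w_ridge)
```

Because the penalty is on the squared norm, the cheapest way for ridge to explain a shared signal is to split the weight evenly across the correlated features, which is exactly the "spreading the impact" behavior described above.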