
Make RidgeCV, LogisticRegressionCV, ... warn when the found optimal regularization parameter lies at the boundary of the range

See original GitHub issue

I think RidgeCV().fit(X_train, y_train) should warn the user if the found value for alpha_ is either alphas.min() or alphas.max(), e.g.:

StatisticalWarning: the optimal value for the regularization parameter 'alpha' was 0.01, which lies at a boundary of the explored range (between 0.01 and 1.0). Consider setting the 'alphas' parameter to explore a wider range.

We could add a new boundary_warning=True constructor parameter to make it possible to silence the warning.

BTW, the default ranges could probably be extended to between 1e-6 and 1e6 with 13 levels on the logspace whenever it is cheap to do so (e.g. for RidgeCV, whose current default range of [0.1, 1., 10.] is much too narrow).
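
In the meantime, the proposed check is easy to reproduce by hand after fitting. Below is a minimal sketch; the helper name warn_if_alpha_on_boundary is illustrative and not part of scikit-learn, and the wider grid follows the np.logspace(-6, 6, 13) suggestion above:

    import warnings

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import RidgeCV

    def warn_if_alpha_on_boundary(estimator, alphas):
        # Warn when the selected alpha_ sits on the edge of the explored grid,
        # mirroring the behaviour proposed in this issue.
        alphas = np.asarray(alphas, dtype=float)
        if np.isclose(estimator.alpha_, alphas.min()) or np.isclose(estimator.alpha_, alphas.max()):
            warnings.warn(
                f"The optimal value for 'alpha' was {estimator.alpha_}, which lies at a "
                f"boundary of the explored range ({alphas.min()} to {alphas.max()}). "
                "Consider setting the 'alphas' parameter to explore a wider range.",
                UserWarning,
            )

    X, y = make_regression(n_samples=200, n_features=20, noise=10.0, random_state=0)

    # The wider grid suggested above: 13 points log-spaced between 1e-6 and 1e6.
    alphas = np.logspace(-6, 6, 13)
    ridge = RidgeCV(alphas=alphas).fit(X, y)
    warn_if_alpha_on_boundary(ridge, alphas)
    print(ridge.alpha_)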

Issue Analytics

  • State: open
  • Created: 4 years ago
  • Reactions: 2
  • Comments: 7 (7 by maintainers)

Top GitHub Comments

1 reaction
rth commented, Feb 6, 2020

Please do, @divyaprabha123! I think it would be better to make two separate PRs: one to warn for such cases (which makes sense no matter the boundaries) and one to increase the default boundaries in a few models.

0 reactions
bmreiniger commented, Oct 11, 2021

I think I can finish off the work by @divyaprabha123 toward RidgeCV in #16408. Depending on how that goes I may have a look at the others listed by @Reksbril at #16783.

Read more comments on GitHub

Top Results From Across the Web

sklearn.linear_model.LogisticRegressionCV
The liblinear solver supports both L1 and L2 regularization, with a dual formulation only for ... the best hyperparameter is selected by the... (see the sketch after this list)

An Introduction to glmnet
alpha is for the elastic net mixing parameter α, with range α ∈ [0, 1]. α = 1 is lasso regression (default) and...

Does sklearn LogisticRegressionCV use all data for final model
This is looking at just one regularization parameter and 5 folds in the CV. So clf.scores_ will be a dictionary with one key...

Lecture notes on ridge regression - arXiv
The ridge regression estimator is always to be found on the boundary of the ridge parameter constraint and is never an interior point...

Harvard CS109A | Lab 5: Regularization and Cross-Validation
In this lab we will explore one regularized regression model: ridge regression. ... It is often the case that you just can't make...
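
The first result above touches on how LogisticRegressionCV explores its regularization grid; the same kind of boundary check can be done on the user side there too. A sketch under the same caveat (the check is hand-rolled, not a scikit-learn feature): with an integer Cs the estimator explores values of C log-spaced between 1e-4 and 1e4, exposing the grid as Cs_ and the per-class choice as C_.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegressionCV

    X, y = make_classification(n_samples=300, n_features=20, random_state=0)

    # Cs=10 explores 10 values of C log-spaced between 1e-4 and 1e4.
    clf = LogisticRegressionCV(Cs=10, cv=5, max_iter=1000).fit(X, y)

    # clf.Cs_ holds the explored grid; clf.C_ holds the value chosen per class.
    on_boundary = np.isclose(clf.C_, clf.Cs_.min()) | np.isclose(clf.C_, clf.Cs_.max())
    if on_boundary.any():
        print(
            f"Chosen C {clf.C_} lies on a boundary of the explored grid "
            f"[{clf.Cs_.min():g}, {clf.Cs_.max():g}]; consider passing a wider 'Cs'."
        )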
