
Why use SVD in Ridge Regression?


LIME uses Ridge regression with the argument solver='auto' in lime_base.py, and in most cases SVD ends up being used as the solver. SVD is unstable for very sparse data, so I think we should use solver='sag' or solver='saga'. The modified code is below.

    def feature_selection(self, data, labels, weights, num_features, method):
        """Selects features for the model. see explain_instance_with_data to
           understand the parameters."""
        if method == 'none':
            return np.array(range(data.shape[1]))
        elif method == 'forward_selection':
            return self.forward_selection(data, labels, weights, num_features)
        elif method == 'highest_weights':
            clf = Ridge(alpha=0, fit_intercept=True,
                        random_state=self.random_state, solver='saga')

Issue Analytics

  • State: closed
  • Created: 5 years ago
  • Comments: 5 (3 by maintainers)

Top GitHub Comments

1 reaction
marcotcr commented, Mar 29, 2018

It seems to me that the predict_fn does not matter at all at this point, as it just provides the labels that Ridge will use. SVD is being used to optimize ridge, not whatever predict_fn is. Am I missing something?
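For reference, the closed-form ridge solution can be computed from the SVD of the design matrix, which is what scikit-learn's 'svd' solver does. A minimal sketch on synthetic data (no intercept, for simplicity; all variable names here are illustrative):

```python
import numpy as np

rng = np.random.RandomState(0)
X = rng.randn(100, 10)   # synthetic design matrix
y = rng.randn(100)       # synthetic targets
alpha = 1.0

# Ridge via SVD: with X = U diag(d) V^T, the solution
#   w = (X^T X + alpha*I)^-1 X^T y
# reduces to
#   w = V diag(d / (d^2 + alpha)) U^T y
U, d, Vt = np.linalg.svd(X, full_matrices=False)
w_svd = Vt.T @ ((d / (d**2 + alpha)) * (U.T @ y))

# Same solution from the regularized normal equations, for comparison
w_ne = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)
print(np.allclose(w_svd, w_ne))  # True
```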

0 reactions
0shimax commented, Mar 24, 2018

I mean that I use a DNN as the predict_fn in the explain_instance function, just in case that matters. For Ridge, you can see the description below in the scikit-learn documentation:

All last five solvers support both dense and sparse data. However, only ‘sag’ and ‘saga’ supports sparse input when fit_intercept is True.

http://scikit-learn.org/stable/modules/generated/sklearn.linear_model.Ridge.html


Top Results From Across the Web

SVD in Machine Learning: Ridge Regression and ...
Ridge regression builds on least squares by adding a regularization term in the cost function so that it becomes ∥y − Xw∥² +...
SVD Part 2 - Insights into Ridge Regression using SVD
In this post, I will attempt to use this fact to gain more insight into Ridge Regression. The idea is borrowed from Section...
The proof of shrinking coefficients using ridge regression ...
The question appears to ask for a demonstration that Ridge Regression shrinks coefficient estimates towards zero, using a spectral decomposition.
1 Ridge regression using SVD 2 Ridge regression with ...
HW3. 1 Ridge regression using SVD. Let X = UDVᵀ be the SVD of the design matrix, and let w = (XᵀX ...
The Singular-Value Decomposition - NYU
The SVD provides a complete characterization of the variance of a p-dimensional dataset in every direction of Rp, and is therefore very useful...
