
[FEATURE] scale hyperparameter for the RepeatingBasisFunction


Hi!

How about a hyperparameter that controls the width of the RBFs? At the moment it is set to a fixed (sensible) value: https://github.com/koaning/scikit-lego/blob/51ef0c7f4fb0ac8717b3660931b78633831e170e/sklego/preprocessing/repeatingbasis.py#L86

But how about adding a scale hyperparameter and changing that line to

    self.width_ = self.scale / self.n_periods

with a default of scale=1? I think it's good to have a choice there.
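As a rough sketch of what the proposal would mean in practice (a hypothetical, simplified stand-in for the transformer's internals, not the actual scikit-lego code — the function name, signature, and distance formula here are illustrative):

```python
import numpy as np

def repeating_rbf(X, n_periods=12, input_range=(0.0, 1.0), width=None):
    """Toy repeating radial-basis transform with a tunable bump width.

    Hypothetical simplification of RepeatingBasisFunction: n_periods
    Gaussian bumps are spread evenly over the (periodic) input range,
    and `width` plays the role of the proposed scale / n_periods.
    """
    lo, hi = input_range
    x = (np.asarray(X, dtype=float) - lo) / (hi - lo)  # map onto [0, 1)
    if width is None:
        width = 1.0 / n_periods  # the current fixed choice
    centers = np.arange(n_periods) / n_periods
    # periodic distance, so values near 0 and 1 count as close together
    dist = np.abs(x[:, None] - centers[None, :])
    dist = np.minimum(dist, 1.0 - dist)
    return np.exp(-((dist / width) ** 2))

# a point sitting exactly on a bump center gets feature value 1 there;
# passing a larger width makes every bump flatter and wider
features = repeating_rbf(np.array([0.0, 0.5]), n_periods=4)
```

With this shape, the issue's suggestion amounts to exposing `width = scale / n_periods` instead of hard-coding `scale = 1`.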

Best Robert

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 6 (5 by maintainers)

Top GitHub Comments

1 reaction
Garve commented, Mar 7, 2021

Sure, I would also keep the default like this. It usually works quite well and 1 is a great constant, nobody asks questions 😉

1 reaction
Garve commented, Mar 7, 2021

I think @Garve means that the scale_ feature should influence the width of the generated rbf features. Increasing the width would make farther away values in the original space more alike in the transformed space and therefore influence the model fitting.

Adding it makes sense to me, but I understand the confusion with calling it scale.

Oh yeah, I mean scale in the sense of being able to scale the width of the bumps. Sure, the name can be changed to something else; I just didn't want to call it width again, since width_ already exists. And this shouldn't change much: it's just a scaling factor applied to it.
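A quick numeric check of the "more alike" point above, using a single Gaussian bump (the numbers and the helper are illustrative, not from the library):

```python
import numpy as np

# One Gaussian bump centered at 0; compare the transformed values of two
# input points (0.0 and 0.3) under a narrow and a wide bump.
def bump(x, width):
    return np.exp(-((x / width) ** 2))

points = np.array([0.0, 0.3])
narrow = bump(points, width=0.1)
wide = bump(points, width=0.4)

# The gap between the two transformed values shrinks as width grows,
# i.e. the two points look more alike to a downstream model.
gap_narrow = abs(narrow[0] - narrow[1])  # close to 1
gap_wide = abs(wide[0] - wide[1])        # roughly 0.43
```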

