
Tuning implicit hyperparameter of transformers inside TransformedTargetForecaster

See original GitHub issue

Is your feature request related to a problem? Please describe. I have locally created a class TuningTransformer, inspired by TabularToSeriesAdaptor, to tune the implicit hyperparameter of whether or not to use a transformer inside a TransformedTargetForecaster, so that everything can be tuned together in a ForecastingGridSearchCV. The class TuningTransformer therefore has an argument use: bool to turn the use of a transformer on and off; a rough sketch follows below.
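For illustration, a minimal sketch of what such a wrapper could look like, written against plain sklearn conventions rather than sktime's actual transformer base class (a real sktime implementation would subclass the sktime transformer API; names here are assumptions):

from sklearn.base import BaseEstimator, TransformerMixin, clone

class TuningTransformer(BaseEstimator, TransformerMixin):
    """Wrap a transformer and expose `use` as a tunable hyperparameter."""

    def __init__(self, transformer, use=True):
        self.transformer = transformer
        self.use = use

    def fit(self, X, y=None):
        # Only fit the wrapped transformer when it is switched on
        if self.use:
            self.transformer_ = clone(self.transformer).fit(X, y)
        return self

    def transform(self, X):
        # use=False behaves like an identity transform
        return self.transformer_.transform(X) if self.use else X

    def inverse_transform(self, X):
        return self.transformer_.inverse_transform(X) if self.use else X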

Describe the solution you’d like Wrapping a single transformer looks as follows:

TuningTransformer(BoxCoxTransformer(method="pearsonr"))

Or more complicated with TabularToSeriesAdaptor:

TuningTransformer(TabularToSeriesAdaptor(MinMaxScaler()))

So in the end I will put each transformer inside a TuningTransformer, and the full pipeline with grid search looks like this:

# Assumed imports (module paths as of sktime at the time of this issue);
# TuningTransformer is the proposed wrapper class described above.
from sklearn.preprocessing import MinMaxScaler, PowerTransformer
from sktime.forecasting.compose import TransformedTargetForecaster
from sktime.forecasting.model_selection import (
    ForecastingGridSearchCV,
    SlidingWindowSplitter,
)
from sktime.forecasting.tbats import TBATS
from sktime.transformations.series.adapt import TabularToSeriesAdaptor
from sktime.transformations.series.detrend import Deseasonalizer

pipe = TransformedTargetForecaster(steps=[
    ("deseasonalise", TuningTransformer(transformer=Deseasonalizer())),
    ("minmax", TuningTransformer(transformer=TabularToSeriesAdaptor(MinMaxScaler()))),
    ("power", TuningTransformer(transformer=TabularToSeriesAdaptor(PowerTransformer()))),
    ("model", TBATS()),
    ]
)

# y is the target series (pd.Series); y_train is its training portion
cv = SlidingWindowSplitter(initial_window=int(len(y_train) * 0.5), window_length=24, start_with_window=True, step_length=24)
param_grid = {
    "deseasonalise__transformer__model": ["multiplicative", "additive"],
    "deseasonalise__transformer__sp": [12],
    "deseasonalise__use": [True, False],

    "power__transformer__transformer__method": ["yeo-johnson", "box-cox"],
    "power__transformer__transformer__standardize": [True, False],
    "power__use": [True, False],

    # MinMaxScaler's tuple-valued parameter is feature_range (copy is a bool)
    "minmax__transformer__transformer__feature_range": [(0, 1), (0, 2)],
    "minmax__use": [True, False],

    "model__sp": [12],
}
gscv = ForecastingGridSearchCV(forecaster=pipe, param_grid=param_grid, cv=cv, n_jobs=-1)
gscv.fit(y)

It can look a bit confusing to have grid keys like minmax__transformer__transformer__feature_range; however, this is consistent with how nested parameters already work in sktime and sklearn. And it is very handy for tuning 😃
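As an illustration of how such a nested key resolves (using the feature_range key from the grid above; pipe is the pipeline defined earlier, and sktime pipelines follow the sklearn get_params / set_params convention that the grid search relies on):

# minmax                                           -> the TuningTransformer step in pipe
# minmax__transformer                              -> the wrapped TabularToSeriesAdaptor
# minmax__transformer__transformer                 -> the inner MinMaxScaler
# minmax__transformer__transformer__feature_range  -> MinMaxScaler's feature_range
pipe.set_params(**{"minmax__transformer__transformer__feature_range": (0, 2)})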

Additional context I might contribute this to sktime if it is appreciated; otherwise I don’t mind keeping it in our private repo. I am open to adjusting my implementation if there are other ideas on how to solve the implicit hyperparameter tuning in this case. Maybe the class can have a better name…


Martin Walter martin_friedrich.walter@daimler.com, Mercedes-Benz AG on behalf of Daimler TSS GmbH. https://github.com/Daimler/daimler-foss/blob/master/LEGAL_IMPRINT.md

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments: 15

Top GitHub Comments

2 reactions
fkiraly commented, Mar 24, 2021

Quick question, @aiwalter, what does this do? Does it simply default to the identity transformer if use=False, and to the wrapped transformer if use=True?

Seems like a really good idea to me!

The name though seems sub-optimal, because you would expect a TuningTransformer to tune (which it doesn’t). How about Optional, or OptionalTunable or similar?

PS: beware overfitting by overtuning…

1 reaction
fkiraly commented, Mar 30, 2021

Regarding implementation, it’s probably a good idea to have an OptionalPassthrough transformer.

Agreed, see my point above:

Might be worth having the OptionalPassthrough as a shorthand, but that would imply that it should perhaps be wherever the multiplexer is, and it should perhaps even be a contraction of the multiplexer?
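For illustration, a sketch of how tuning might look with such an OptionalPassthrough wrapper, assuming the signature suggested by the sktime docs result below (OptionalPassthrough(transformer, passthrough=False)); the exact import path may differ between sktime versions:

from sktime.forecasting.compose import TransformedTargetForecaster
from sktime.forecasting.tbats import TBATS
from sktime.transformations.series.compose import OptionalPassthrough
from sktime.transformations.series.detrend import Deseasonalizer

pipe = TransformedTargetForecaster(steps=[
    ("deseasonalise", OptionalPassthrough(Deseasonalizer(sp=12))),
    ("model", TBATS(sp=12)),
])
param_grid = {
    # passthrough=True skips the wrapped transformer, mirroring use=False above
    "deseasonalise__passthrough": [True, False],
    "deseasonalise__transformer__model": ["multiplicative", "additive"],
}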

Read more comments on GitHub >

Top Results From Across the Web

OptionalPassthrough — sktime documentation
Allows tuning the implicit hyperparameter of whether or not to use a particular transformer inside a pipeline (e.g. TransformedTargetForecaster).
Read more >
Hyperparameter optimization for fine-tuning pre-trained ...
In this post, we discussed hyperparameter optimization for fine-tuning pre-trained transformer models from Hugging Face based on Syne Tune. We ...
Read more >
Transformer Model — darts documentation - GitHub Pages
Transformer is a state-of-the-art deep learning model introduced in 2017. It is an encoder-decoder architecture whose core feature is the 'multi-head ...
Read more >
Hyperparameter Search with Transformers and Ray Tune
Ray Tune is a popular Python library for hyperparameter tuning that provides many state-of-the-art algorithms out of the box, along with ...
Read more >
sklearn.compose.TransformedTargetRegressor
Meta-estimator to regress on a transformed target. Useful for applying a non-linear transformation to the target y in regression problems. This transformation …
Read more >
