Tuning implicit hyperparameter of transformers inside TransformedTargetForecaster
**Is your feature request related to a problem? Please describe.**
I have locally created a class `TuningTransformer`, inspired by `TabularToSeriesAdaptor`, in order to tune the implicit hyperparameter of whether or not to use a transformer inside a `TransformedTargetForecaster`, so that everything can be tuned together in a `ForecastingGridSearchCV`. To that end, the class `TuningTransformer` has an argument `use: bool` to turn the use of a transformer on and off.
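For illustration, a minimal, hypothetical sketch of the idea (not the actual implementation; the real class would subclass sktime's transformer base class and handle cloning, tags, and input checks):

```python
class TuningTransformer:
    """Wrap a transformer and expose `use` as a tunable hyperparameter."""

    def __init__(self, transformer, use=True):
        self.transformer = transformer
        self.use = use  # if False, the wrapper behaves as the identity

    def fit(self, Z):
        # only fit the wrapped transformer if it is switched on
        if self.use:
            self.transformer.fit(Z)
        return self

    def transform(self, Z):
        return self.transformer.transform(Z) if self.use else Z

    def inverse_transform(self, Z):
        return self.transformer.inverse_transform(Z) if self.use else Z
```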
**Describe the solution you'd like**

All together it looks as follows:

```python
TuningTransformer(BoxCoxTransformer(method="pearsonr"))
```

Or, more complicated, with `TabularToSeriesAdaptor`:

```python
TuningTransformer(TabularToSeriesAdaptor(MinMaxScaler()))
```
So in the end I will put each transformer inside a `TuningTransformer`, and all together with grid search it looks like this:
```python
from sklearn.preprocessing import MinMaxScaler, PowerTransformer
from sktime.forecasting.compose import TransformedTargetForecaster
from sktime.forecasting.model_selection import (
    ForecastingGridSearchCV,
    SlidingWindowSplitter,
)
from sktime.forecasting.tbats import TBATS
from sktime.transformations.series.adapt import TabularToSeriesAdaptor
from sktime.transformations.series.detrend import Deseasonalizer

# y and y_train are the target series (assumed defined elsewhere)
pipe = TransformedTargetForecaster(
    steps=[
        ("deseasonalise", TuningTransformer(transformer=Deseasonalizer())),
        ("minmax", TuningTransformer(transformer=TabularToSeriesAdaptor(MinMaxScaler()))),
        ("power", TuningTransformer(transformer=TabularToSeriesAdaptor(PowerTransformer()))),
        ("model", TBATS()),
    ]
)
cv = SlidingWindowSplitter(
    initial_window=int(len(y_train) * 0.5),
    window_length=24,
    start_with_window=True,
    step_length=24,
)
param_grid = {
    "deseasonalise__transformer__model": ["multiplicative", "additive"],
    "deseasonalise__transformer__sp": [12],
    "deseasonalise__use": [True, False],
    "power__transformer__transformer__method": ["yeo-johnson", "box-cox"],
    "power__transformer__transformer__standardize": [True, False],
    "power__use": [True, False],
    # MinMaxScaler's feature_range takes tuples like (0, 1); copy takes a bool
    "minmax__transformer__transformer__feature_range": [(0, 1), (0, 2)],
    "minmax__use": [True, False],
    "model__sp": [12],
}
gscv = ForecastingGridSearchCV(forecaster=pipe, param_grid=param_grid, cv=cv, n_jobs=-1)
gscv.fit(y)
```
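After fitting, the tuned configuration can be inspected via the usual attributes of sktime's tuner interface:

```python
# Best parameter combination found by the grid search,
# including the use flags of each TuningTransformer
print(gscv.best_params_)

# The refitted pipeline with the winning configuration
y_pred = gscv.best_forecaster_.predict(fh=[1, 2, 3])
```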
It can look a bit confusing to have grid arguments like `minmax__transformer__transformer__feature_range`, however this is all consistent with the existing nested-parameter conventions in sktime and sklearn. And it is very handy for tuning 😃
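If the nesting is hard to keep track of, the valid parameter names can be listed directly via the `get_params` interface inherited from scikit-learn:

```python
# Print all nested parameter names that param_grid keys may reference
for name in sorted(pipe.get_params().keys()):
    print(name)
```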
**Additional context**
I might contribute this to sktime in case it is appreciated; otherwise I don't mind keeping it in our private repo. I am open to adjusting my implementation in case there are other ideas on how to solve the implicit hyperparameter tuning in this case. Maybe the class can have a better name…
Martin Walter martin_friedrich.walter@daimler.com, Mercedes-Benz AG on behalf of Daimler TSS GmbH. https://github.com/Daimler/daimler-foss/blob/master/LEGAL_IMPRINT.md
Quick question, @aiwalter, what does this do? Does it simply default to the identity transformer if `use=False`, and to the wrapped transformer if `use=True`? Seems like a really good idea to me!
The name though seems sub-optimal, because you would expect a `TuningTransformer` to tune (which it doesn't). How about `Optional`, or `OptionalTunable`, or similar?

PS: beware overfitting by overtuning…
Agreed, see my point above: