Tuning the batch size in the Tuner component
Hello,
I'm currently using TFX to build a pipeline on Google AI Platform with the Kubeflow engine. I have a model for which the batch size is an important hyper-parameter to tune, and I would like to search over it in the Tuner component.
Is that even possible?
I followed the TFX example with the Penguin dataset, more precisely its Tuner component implementation (found here).
The `_get_hyperparameters` function returns the search space for the model hyperparameters (see line 139). However, the batch size used to train the model is fixed and specified at the end of `tuner_fn` (see line 246).
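To make the structure concrete, here is a simplified sketch of the shape described above. The names `_get_hyperparameters`, `tuner_fn`, and `_TRAIN_BATCH_SIZE` mirror the penguin example, but the bodies below are stdlib-only stand-ins (the real code returns `keras_tuner.HyperParameters` and a `TunerFnResult`), so treat every value here as illustrative, not as the example's actual code:

```python
# Stand-ins for the penguin Tuner example's structure: the search space is
# defined in one place, while the batch size is a fixed module-level constant
# applied when the datasets are built -- so the tuner never sees it.

_TRAIN_BATCH_SIZE = 20  # hypothetical fixed value, set outside the search space


def _get_hyperparameters():
    """Returns the search space; note that batch size is NOT part of it."""
    return {
        "learning_rate": [1e-4, 1e-3, 1e-2],
        "num_layers": [1, 2, 3],
    }


def tuner_fn():
    """Mimics the example's tuner_fn: the data is batched with the fixed
    constant *before* tuning starts, in the fit kwargs handed to the tuner."""
    search_space = _get_hyperparameters()
    fit_kwargs = {
        # Stands in for _input_fn(..., batch_size=_TRAIN_BATCH_SIZE)
        "train_batch_size": _TRAIN_BATCH_SIZE,
        "eval_batch_size": _TRAIN_BATCH_SIZE,
    }
    return search_space, fit_kwargs
```

This is why tuning the batch size requires more than adding an entry to the search space: the batching happens before any trial runs.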
Is there a way to dynamically change the batch size based on a sample from the hyper-parameter space?
Thanks for your help!
Issue Analytics
- State:
- Created: 2 years ago
- Reactions: 3
- Comments: 5 (1 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Customizing `kerastuner.BaseTuner` should work for batch-size tuning: you'd pass an unbatched dataset to the Tuner, and in the `run_trial` method of the custom tuner, you'd batch the dataset with a variable size drawn from `hp`.
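A minimal sketch of that approach. A real implementation would subclass `keras_tuner.Tuner` (or `BaseTuner`) and call `hp.Choice("batch_size", ...)` inside `run_trial`, then batch with `tf.data.Dataset.batch`; the stand-in classes below (`StubHyperParameters`, `BatchSizeTuner`) only mimic that flow with the standard library and are not part of any library:

```python
import random


class StubHyperParameters:
    """Stand-in for keras_tuner.HyperParameters: samples each named
    hyper-parameter once per trial and remembers the value."""

    def __init__(self, seed=None):
        self._rng = random.Random(seed)
        self.values = {}

    def Choice(self, name, options):
        if name not in self.values:
            self.values[name] = self._rng.choice(options)
        return self.values[name]


def batch(dataset, batch_size):
    """Plain-list equivalent of tf.data.Dataset.batch."""
    return [dataset[i:i + batch_size] for i in range(0, len(dataset), batch_size)]


class BatchSizeTuner:
    """Mimics overriding run_trial: the tuner holds the *unbatched* data
    and batches it per trial with a size drawn from hp."""

    def __init__(self, dataset):
        self.dataset = dataset  # unbatched examples

    def run_trial(self, hp):
        batch_size = hp.Choice("batch_size", [16, 32, 64])
        batches = batch(self.dataset, batch_size)
        # A real tuner would now build the model from `hp` and train on `batches`.
        return batch_size, batches


tuner = BatchSizeTuner(dataset=list(range(100)))
bs, batches = tuner.run_trial(StubHyperParameters(seed=0))
```

The key point carries over directly: because batching happens inside `run_trial`, each trial can see the same data at a different batch size.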