
Tuning the batch size in the Tuner component

See original GitHub issue

Hello,

I’m currently using TFX to build a pipeline on the Google AI platform with the Kubeflow engine. I have a model where the batch size is an important hyper-parameter to tune.

I would like to search this hyper-parameter in the Tuner component.

Is it even possible?

I'm following the TFX example with the Penguin dataset, more precisely its Tuner component implementation.

The _get_hyperparameters function returns the search space for the model's hyperparameters (see line 139). However, the batch size used to train the model is fixed, specified at the end of tuner_fn (see line 246).
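
For context, the search space that _get_hyperparameters builds in that example looks roughly like this (paraphrased from the penguin utilities; exact values are in the linked file). Note that batch_size is absent from it:

    import kerastuner

    def _get_hyperparameters() -> kerastuner.HyperParameters:
      """Returns the search space; batch_size is not part of it."""
      hp = kerastuner.HyperParameters()
      # Values paraphrased from the penguin example; check the source for
      # the exact choices and defaults.
      hp.Choice('learning_rate', [1e-2, 1e-3, 1e-4], default=1e-2)
      hp.Int('num_layers', 1, 3, default=2)
      return hp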

Is there a way to dynamically change the batch size based on a sample from the hyper-parameter space?

Thanks for your help!

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Reactions: 3
  • Comments: 5 (1 by maintainers)

Top GitHub Comments

2 reactions
1025KB commented, Apr 21, 2021

Customizing kerastuner.BaseTuner should work for batch-size tuning.

You’d pass an unbatched dataset to the Tuner, and in the run_trial method of the CustomTuner, you’d batch the dataset with a variable size drawn from hp:

  1. Create a CustomTuner that overrides run_trial.
  2. In run_trial, read the batch_size from the trial's hyperparameters.
  3. Re-batch the dataset with that batch_size.
  4. Train the model on the re-batched dataset.
  5. You might also need to change how metrics are computed, for example if you want execution time as one of the evaluation criteria for a trial.
  6. In your tuner_fn, change the fit_args to match the run_trial parameters (pass in the unbatched dataset). A minimal sketch follows this list.
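
Here is a minimal sketch of that recipe, assuming the kerastuner 1.0-style API that TFX used around that time. The search-space bounds, dataset arguments, and the 'val_accuracy' objective are illustrative assumptions, not part of the original comment:

    import kerastuner

    class CustomTuner(kerastuner.Tuner):
      """Tunes batch_size as a hyperparameter; expects unbatched datasets."""

      def run_trial(self, trial, train_dataset, validation_dataset, **fit_kwargs):
        hp = trial.hyperparameters

        # Step 2: draw the batch size from the search space (bounds assumed).
        batch_size = hp.Int('batch_size', min_value=32, max_value=256, step=32)

        # Step 3: batch the unbatched datasets with the sampled size.
        train_dataset = train_dataset.batch(batch_size)
        validation_dataset = validation_dataset.batch(batch_size)

        # Step 4: build and train the model for this trial.
        model = self.hypermodel.build(hp)
        history = model.fit(
            train_dataset, validation_data=validation_dataset, **fit_kwargs)

        # Step 5: report the objective back to the oracle. 'val_accuracy'
        # must match the objective the tuner's oracle was constructed with.
        self.oracle.update_trial(
            trial.trial_id,
            {'val_accuracy': max(history.history['val_accuracy'])})
        self.save_model(trial.trial_id, model)

For step 6, tuner_fn would build its tf.data datasets without calling .batch(...) and pass them through the TunerFnResult fit_kwargs so they arrive in run_trial as above. Newer keras_tuner releases simplify this: run_trial can just return the metrics dict instead of calling oracle.update_trial and save_model.
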
0 reactions
google-ml-butler[bot] commented, May 21, 2021

Are you satisfied with the resolution of your issue?

Read more comments on GitHub >

Top Results From Across the Web

How to tune the number of epochs and batch_size in Keras ...
This can be done by subclassing the Tuner class you are using and overriding run_trial. ... Don't pass epochs or batch_size here, let...
Read more >
Tuner Component: Is it possible to tune the batch size?
I'm currently using TFX to build a pipeline on the Google AI platform with the Kubeflow engine. I have a model where the...
Read more >
Inquiry on batch_size in keras-tuner - Google Groups
When I apply keras-tuner to train my model, I don't know how to set 'batch_size' in ... In this case, since you want...
Read more >
Tune hyperparameters in your custom training loop - Keras
In the custom training loop, we tune the batch size of the dataset as we wrap the NumPy data into a tf.data.Dataset ....
Read more >
Easy Hyperparameter Tuning with Keras Tuner and TensorFlow
To learn how to tune hyperparameters with Keras Tuner, ... define the total number of epochs to train, batch size, and the #...
Read more >
