
Documentation on Tensorboard HParams plugin

See original GitHub issue

I am looking for documentation on the TensorBoard HParams plugin. In particular, I want to know about hp.Discrete and hp.RealInterval, and whether other domain types like these are available. I tried to find documentation on these but could not find anything. I would be thankful if someone could share a link to the documentation.
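For reference, a minimal sketch (assuming the `tensorboard` package is installed) of the two domain types asked about, plus a third one, hp.IntInterval, which also exists in the same module:

```python
from tensorboard.plugins.hparams import api as hp

# Discrete: an explicit set of allowed values.
HP_OPTIMIZER = hp.HParam('optimizer', hp.Discrete(['adam', 'sgd']))
# RealInterval: a closed range of floats.
HP_DROPOUT = hp.HParam('dropout', hp.RealInterval(0.1, 0.5))
# IntInterval: a closed range of integers.
HP_UNITS = hp.HParam('num_units', hp.IntInterval(16, 256))

# Each domain exposes its bounds or values for you to iterate over.
print(HP_DROPOUT.domain.min_value)  # 0.1
print(HP_DROPOUT.domain.max_value)  # 0.5
print(sorted(HP_OPTIMIZER.domain.values))
```

The hyperparameter names and values above are only illustrative.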

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 17 (6 by maintainers)

Top GitHub Comments

10 reactions
thisismygitrepo commented, Jun 22, 2019

I highly recommend that these details be added to the notebook, because it leaves much to be desired.

6 reactions
wchargin commented, Jun 19, 2019

This is described in the text and code of section 3:

For simplicity, use a grid search: try all combinations of the discrete parameters and just the lower and upper bounds of the real-valued parameter. For more complex scenarios, it might be more effective to choose each hyperparameter value randomly (this is called a random search).

and

  for dropout_rate in (HP_DROPOUT.domain.min_value, HP_DROPOUT.domain.max_value):
    ...

You can see that we’re explicitly picking the two endpoints of the domain only. You could instead use something like

  for dropout_rate in tf.linspace(
      HP_DROPOUT.domain.min_value,
      HP_DROPOUT.domain.max_value,
      11,
  ):
    ...

for a higher-resolution grid search, or

  for dropout_rate_iteration in range(10):
    dropout_rate = HP_DROPOUT.domain.sample_uniform()
    ...

for a random search.
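To make the random-search branch concrete, here is a small stdlib-only sketch of what sampling uniformly from a real-valued interval amounts to (the `sample_uniform` helper below is a hypothetical stand-in written for illustration, not the plugin's own implementation):

```python
import random

def sample_uniform(min_value, max_value, rng=random):
    """Draw one value uniformly from the closed interval [min_value, max_value]."""
    return rng.uniform(min_value, max_value)

# Ten random-search trials over a dropout range of [0.1, 0.5].
for _ in range(10):
    dropout_rate = sample_uniform(0.1, 0.5)
    assert 0.1 <= dropout_rate <= 0.5
```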

More generally, the TensorBoard hparams plugin doesn’t tune the model for you. You bring your own tuner, or write a simple one inline, as in the tutorial; the hparams dashboard lets you visualize the results of tuning along with the rest of your metrics.
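To show what "visualizing the results of tuning" looks like in code, here is a hedged sketch (assuming `tensorflow` and `tensorboard` are installed) of logging a single trial so it shows up in the HParams dashboard; the log directory, hyperparameter, and metric value are placeholders:

```python
import tensorflow as tf
from tensorboard.plugins.hparams import api as hp

HP_DROPOUT = hp.HParam('dropout', hp.RealInterval(0.1, 0.5))

def log_trial(logdir, dropout_rate, accuracy):
    # One writer per trial; each logdir becomes one row in the dashboard.
    with tf.summary.create_file_writer(logdir).as_default():
        hp.hparams({HP_DROPOUT: dropout_rate})  # record the hyperparameter values
        tf.summary.scalar('accuracy', accuracy, step=1)  # record the metric

log_trial('logs/hparam_tuning/run-0', dropout_rate=0.2, accuracy=0.91)
```

Whatever tuner you bring, the pattern is the same: run a trial, then write its hyperparameters and resulting metrics under a fresh log directory.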

