
TensorBoardLogger should be able to add metric names in hparams

See original GitHub issue

🚀 Feature

TensorBoard allows investigating the effect of hyperparameters in the hparams tab. Unfortunately, the log_hyperparams function in TensorBoardLogger has no way to indicate which of the logged values should be treated as a metric for such a comparison.
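
For context, TensorBoard's hparams dashboard only lists metrics for a run if their names are registered together with the hyperparameters. With plain PyTorch this is done through SummaryWriter.add_hparams; the minimal sketch below (no Lightning involved, values chosen purely for illustration) shows the information a logger would need to pass along:

from torch.utils.tensorboard import SummaryWriter

# Plain PyTorch: the hparams tab treats a scalar as a "metric" only because
# its name appears in metric_dict alongside the hyperparameters.
with SummaryWriter("runs/hparams_demo") as writer:
    writer.add_hparams(
        hparam_dict={"lr": 1e-3, "batch_size": 32},
        metric_dict={"hparam/val_loss": 0.42, "hparam/val_acc": 0.91},
    )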

Motivation

I would like to use TensorBoard's built-in hparams module to evaluate my training runs.

Pitch

PyTorch Lightning should let me define my model's metrics in a way that allows any logger to derive which of them can be used for hyperparameter comparison, along with any other characteristics defined for them.
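
One hypothetical shape for such an abstraction is sketched below; nothing in it exists in PyTorch Lightning, and the hook name and returned characteristics are made up purely to illustrate the pitch:

import pytorch_lightning as pl

class MyModel(pl.LightningModule):
    # Hypothetical hook: declare which logged scalars are hyperparameter
    # metrics and any additional characteristics (e.g. optimization goal),
    # so that any logger implementation could pick them up.
    def hparam_metrics(self):
        return {
            "val_loss": {"goal": "minimize"},
            "val_acc": {"goal": "maximize"},
        }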

Additional context

The hparams helper in TensorBoard's summary module (torch.utils.tensorboard.summary) takes the following parameters:

def hparams(hparam_dict=None, metric_dict=None):

metric_dict is a dictionary mapping metric names to values, although the values themselves are discarded inside the function; only the names are used to register the metrics.
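
A common workaround at the time was to call this helper directly, write the returned summaries with a SummaryWriter, and then log the metric values as ordinary scalars under the same names. The sketch below relies on SummaryWriter internals (file_writer) and on the exp/ssi/sei return order of hparams, so verify it against your installed torch version:

from torch.utils.tensorboard import SummaryWriter
from torch.utils.tensorboard.summary import hparams

hparam_dict = {"lr": 1e-3, "batch_size": 32}
metric_dict = {"val_loss": 0.0}  # only the metric *names* are used here

writer = SummaryWriter("runs/manual_hparams")
exp, ssi, sei = hparams(hparam_dict, metric_dict)
writer.file_writer.add_summary(exp)
writer.file_writer.add_summary(ssi)
writer.file_writer.add_summary(sei)

# The values shown in the hparams tab come from regular scalar logs that
# reuse the registered metric names.
writer.add_scalar("val_loss", 0.37)
writer.close()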

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Reactions: 18
  • Comments: 9 (7 by maintainers)

Top GitHub Comments

2 reactions
tstumm commented on Mar 11, 2020

I think if Lightning offers such a logger mechanism, it should offer an abstraction to enable this functionality. I’d be fine with having a register_metric function in TensorBoardLogger, but I don’t want to rely on implementation details of the underlying logging mechanism.
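
For reference, later PyTorch Lightning releases moved in this direction: TensorBoardLogger.log_hyperparams accepts an optional metrics dictionary, and the constructor's default_hp_metric flag controls the automatic hp_metric placeholder. A hedged sketch, to be checked against the documentation of the installed version:

from pytorch_lightning.loggers import TensorBoardLogger

logger = TensorBoardLogger("tb_logs", name="my_model", default_hp_metric=False)
# Registering initial metric values together with the hyperparameters makes
# the metric names appear in TensorBoard's hparams tab.
logger.log_hyperparams({"lr": 1e-3, "batch_size": 32}, metrics={"val_loss": 0.0})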

0 reactions
stale[bot] commented on Jul 3, 2020

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
