
Lightning no longer works with non-primitive types in hparams


🐛 Bug

I often pass non-primitive objects in hparams, such as a layer definition, for example to switch from batch norm to group norm or to use a custom layer. Now that Lightning uses the TensorBoard hparams feature, it errors out when these appear in hparams.

Code sample

from pytorch_lightning.loggers import TensorBoardLogger
from argparse import Namespace

# range(10) is not a primitive type, so logging it as an hparam now fails
params = Namespace(foo=range(10))
logger = TensorBoardLogger(save_dir='.')
logger.log_hyperparams(params)

Expected behavior

Lightning should convert non-primitive types to strings before passing them to the summary writer. This worked fine when hparams were logged as a text object.
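
A minimal sketch of what that conversion could look like, assuming a flat Namespace of hparams; sanitize_hparams and the set of allowed types are illustrative assumptions, not Lightning's actual API:

from argparse import Namespace

def sanitize_hparams(params):
    # Keep TensorBoard-friendly primitives as-is and stringify everything else.
    allowed = (int, float, str, bool)
    hparams = vars(params) if isinstance(params, Namespace) else dict(params)
    return {k: v if isinstance(v, allowed) else repr(v) for k, v in hparams.items()}

params = Namespace(foo=range(10), lr=1e-3)
print(sanitize_hparams(params))  # {'foo': 'range(0, 10)', 'lr': 0.001}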

Environment

  • PyTorch 1.4
  • Ubuntu 16
  • pip install pytorch_lightning
  • Python 3.7

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Reactions: 4
  • Comments: 6 (6 by maintainers)

Top GitHub Comments

1 reaction
Borda commented, Mar 12, 2020

Sounds good, would you mind sending a PR with these suggestions?

1 reaction
monney commented, Mar 12, 2020

@Borda Use cases for various non-primitives:

Sequences: number of layers as one hparam, or per-layer filter size or stride for each conv layer. I use this to test topology changes across different resolutions.

Current Alternative: passing in a string to be decoded in the model, something like ‘[3,5,3,5,3,5,5]’

Namespaces: Better organization of parameters

Current Alternative: prefix on names like “backbone_init”

Functions: Swapping out compatible layers such as BatchNorm and InstanceNorm or changing activation functions. BigGAN’s repo is a good example of doing this kind of thing: https://github.com/ajbrock/BigGAN-PyTorch

Current Alternative: a mapping within the model, like “gn” to use GroupNorm or “bn” to use BatchNorm (sketched below, after this list)

I try to make hparams as complete a description of the experiment as possible, so non-primitives are helpful for this, as the alternatives are messier.
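
For example, a rough sketch of the norm-layer mapping alternative; the key names, the fixed group count, and make_norm are assumptions for illustration, not anything from Lightning or the issue code:

import functools
import torch.nn as nn

# The hparam stays a primitive string; the model resolves it to a layer constructor.
NORM_LAYERS = {
    "bn": nn.BatchNorm2d,
    "in": nn.InstanceNorm2d,
    "gn": functools.partial(nn.GroupNorm, 8),  # GroupNorm(num_groups=8, num_channels=C)
}

def make_norm(kind, num_channels):
    if kind not in NORM_LAYERS:
        raise ValueError(f"Unknown norm layer key: {kind!r}")
    return NORM_LAYERS[kind](num_channels)

norm = make_norm("gn", 64)  # equivalent to nn.GroupNorm(8, 64)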

Decoding:

I think it’s reasonable to use eval and assume an appropriate repr for what is passed in, with a warning if something fails. Currently I manually reconstruct hparams and construct the model from there.

Pickling the whole hparams or each non-primitive also seems reasonable to me for full reproducibility.
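
A rough sketch of that decoding step, assuming hparams were stored as repr() strings; ast.literal_eval is used here as a safer stand-in for bare eval, and decode_hparam is an illustrative helper, not Lightning's behaviour:

import ast
import warnings

def decode_hparam(value):
    # Try to rebuild the original object from its repr(); warn and keep the
    # string if it cannot be parsed as a Python literal.
    if not isinstance(value, str):
        return value
    try:
        return ast.literal_eval(value)
    except (ValueError, SyntaxError):
        warnings.warn(f"Could not decode hparam value {value!r}; keeping it as a string")
        return value

print(decode_hparam("[3, 5, 3, 5, 3, 5, 5]"))  # [3, 5, 3, 5, 3, 5, 5]
print(decode_hparam("range(0, 10)"))           # warns, returns 'range(0, 10)'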
