wandb.watch doesn't work with modules that return dicts
See original GitHub issue.
Output of wandb --version && python --version && uname:
- Weights and Biases version: 0.9.1
- Python version: 3.7.7
- Operating System: Manjaro Linux 20.0.3
Description
When I use wandb.watch(model) on a model of mine, I get a StopIteration error and my program crashes. When I comment that line out, everything works fine.
Two things my model does that could be considered non-standard are the use of some neural network layers from PyTorch Geometric and the output of a dictionary instead of a tensor. In the traceback I provide below, the error occurs exactly in a layer that returns a dict, so I believe the dict is the culprit.
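For context, a dict-returning forward looks roughly like the sketch below. PyTorch is replaced with a trivial stand-in class so the example runs anywhere; the point is the shape of the output, not the layers. All names here are illustrative and not taken from the reporter's code.

```python
class FakeTensor:
    """Stand-in for torch.Tensor; only the type matters in this sketch."""
    def __init__(self, data):
        self.data = data

class ActionLayer:
    """Sketch of a module whose forward returns a dict instead of a tensor."""
    def forward(self, obs):
        # One entry per action class rather than a single stacked tensor.
        return {name: FakeTensor(x) for name, x in obs.items()}

layer = ActionLayer()
out = layer.forward({"move": [0.1], "attack": [0.9]})
print(sorted(out))  # ['attack', 'move']
```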
What I Did
Traceback (most recent call last):
File "/home/dodo/Code/dodonet/dodonet/training/run.py", line 434, in <module>
OffPolicySMACRunner(trainer).run()
File "/home/dodo/Code/dodonet/dodonet/training/run.py", line 109, in run
output_dict = self.model.policy_net(current_state)
File "/home/dodo/.anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py", line 550, in __call__
result = self.forward(*input, **kwargs)
File "/home/dodo/Code/dodonet/dodonet/nn/nets.py", line 215, in forward
xdict = self.action_layer(obs_by_class, enc_node_types)
File "/home/dodo/.anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py", line 559, in __call__
var = next((v for v in var.values() if isinstance(v, torch.Tensor)))
StopIteration
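The crash line in the traceback is a next() call with no default: when the generator over the output dict's values finds no value that is a torch.Tensor (for example, because the tensors are nested inside inner dicts or lists), the generator is exhausted and next() raises StopIteration instead of returning a fallback. A torch-free sketch of the mechanism, using a stand-in tensor class and an illustrative helper name:

```python
class FakeTensor:
    """Stand-in for torch.Tensor so the example runs without PyTorch."""

def first_tensor_value(output: dict):
    # Mirrors the failing line in the traceback: next() with no default
    # raises StopIteration if nothing in the generator matches.
    return next(v for v in output.values() if isinstance(v, FakeTensor))

# Tensors nested one level down: the top-level scan finds nothing.
nested = {"per_action": {"move": FakeTensor()}}

try:
    first_tensor_value(nested)
except StopIteration:
    print("StopIteration reproduced")

# Supplying a default is the usual guard against this:
safe = next((v for v in nested.values() if isinstance(v, FakeTensor)), None)
print(safe)  # None
```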
Issue Analytics
- State:
- Created 3 years ago
- Comments: 18 (5 by maintainers)
Top GitHub Comments
@cola-nicolas Thanks so much for the repro! I was able to implement a fix in the PR listed above. If you want to try it yourself you can install wandb with:
This should make it into the next release likely early next week.
@cola-nicolas the new branch for this fix can be installed with: