
Weights of monkeypatched module get reset by a forward pass after being changed manually

See original GitHub issue

Any computed update to the weights of a monkeypatched module (i.e. an update not applied through a differentiable optimizer) is discarded by the next forward pass:

import torch
import higher

data = torch.randn(32, 10)
module = torch.nn.Linear(10, 10)
fmodel = higher.monkeypatch(module, copy_initial_weights=True)
# Some manual update to the weights; for a simple example:
fmodel.weight = fmodel.weight * 2
fmodel(data)  # weights are now reset, as if the update above never happened

Edit: I’ve noticed that fmodel.named_parameters() and fmodel._parameters contain the correct updated parameters, whereas fmodel.parameters() and fmodel.fast_params contain the old parameter values.

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 6 (4 by maintainers)

Top GitHub Comments

1 reaction
egrefen commented on Mar 4, 2020

Regarding a workaround, you can do fmodel.parameters() to get the current state of the “fast” parameters. If you want to apply a (non in-place) function to a subset of them, do so on the returned iterable, forming a new iterable new_params with the same number of parameter tensors (i.e. including the unmodified ones), and then update the parameters by calling fmodel.update_params(new_params).

This isn’t really a pattern I would encourage or will guarantee to support, but it should work for now.

0 reactions
egrefen commented on Mar 12, 2020

Addressed in #43. Please re-open issue if it’s not working for your use case.
