
plot_weights hook: plot unlimited weights; access weights directly from Executor.model not external files

See original GitHub issue

Now, I have managed to plot the convolutional weights with the following kurfile:

hooks:
    - plot_weights:
        weight_file: cifar.best.valid.w
        weight_keywords1: ["convolution.0", "kernel"]
        weight_keywords2: ["convolution.1", "kernel"]

In my plot_weights_hook.py, I receive weight_keywords1 and weight_keywords2 in the hook through __init__():

def __init__(self, weight_file, weight_keywords1, weight_keywords2, *args, **kwargs):
    """ Creates a new plotting hook; gets plot filenames and matplotlib ready.
    """

My question:

If I want to plot more convolutional weights, say weight_keywords3, weight_keywords4, and weight_keywords5, do I have to change the source code by adding them to __init__ as above?

Can **kwargs somehow help me avoid changing the source every time I want to plot more weights? If so, how?

Thanks!
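For reference, **kwargs can indeed collect an open-ended set of keyword arguments. Here is a minimal, hypothetical sketch of that approach (the class name and attribute handling are illustrative assumptions, not Kur's actual hook API):

```python
class PlotWeightsHook:
    """Hypothetical hook accepting any number of weight_keywordsN arguments."""

    PREFIX = 'weight_keywords'

    def __init__(self, weight_file, **kwargs):
        self.weight_file = weight_file
        # Collect every weight_keywordsN argument, ordered by numeric suffix
        # (a plain lexical sort would put weight_keywords10 before
        # weight_keywords2).
        keys = sorted(
            (k for k in kwargs if k.startswith(self.PREFIX)),
            key=lambda k: int(k[len(self.PREFIX):]),
        )
        self.keyword_sets = [kwargs[k] for k in keys]

hook = PlotWeightsHook(
    'cifar.best.valid.w',
    weight_keywords1=['convolution.0', 'kernel'],
    weight_keywords2=['convolution.1', 'kernel'],
    weight_keywords3=['convolution.2', 'kernel'],
)
```

This works, but as the accepted answer notes, a single list-valued parameter is easier to maintain than pattern-matching on keyword names.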

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 11 (6 by maintainers)

Top GitHub Comments

2 reactions
ajsyp commented, May 4, 2017

You can do it in two ways. First, yes, you could use **kwargs, but then you need to be careful to pass the correct pieces of kwargs through to the base class constructor (there are many ways to arrange this, but all are fragile or hard to maintain). A better way is to simply add another layer of indirection:

hooks:
  - plot_weights:
      weight_file: cifar.best.valid.w
      with_weights:
        - ['convolution.0', 'kernel']
        - ['convolution.1', 'kernel']
        - ['convolution.2', 'kernel']
        - ...

And then your constructor signature looks like this: def __init__(self, weight_file, with_weights, *args, **kwargs)
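A minimal sketch of that signature (the class body is assumed for illustration; the real Kur hook would also forward *args and **kwargs to its base class constructor):

```python
class PlotWeightsHook:
    def __init__(self, weight_file, with_weights, *args, **kwargs):
        # `with_weights` is a list of keyword lists, one per weight tensor.
        # Plotting another tensor means adding one YAML entry, not editing
        # the constructor signature.
        self.weight_file = weight_file
        self.with_weights = with_weights

hook = PlotWeightsHook(
    'cifar.best.valid.w',
    [['convolution.0', 'kernel'], ['convolution.1', 'kernel']],
)
```

Each inner list maps directly to one `- ['convolution.N', 'kernel']` entry in the kurfile, so the YAML and the constructor stay in sync with no per-weight parameters.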

1 reaction
ajsyp commented, May 4, 2017

That’s it: kur.model.Model.save(). But it only calls keras_backend._save_keras() if the Keras backend is selected; if the PyTorch backend is in use, it calls pytorch_backend.save() instead. That is the reason for using inheritance (base/derived classes), and the whole point of designing an API.
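The dispatch described here can be sketched with a toy base/derived pair. Everything below besides the idea of Model.save() delegating to a backend is an illustrative assumption (the string returns stand in for real serialization code):

```python
class Backend:
    """Abstract interface: each backend must know how to save a model."""
    def save(self, model, filename):
        raise NotImplementedError

class KerasBackend(Backend):
    def save(self, model, filename):
        # Stand-in for the real _save_keras() serialization logic.
        return 'keras:' + filename

class PyTorchBackend(Backend):
    def save(self, model, filename):
        return 'pytorch:' + filename

class Model:
    def __init__(self, backend):
        self.backend = backend

    def save(self, filename):
        # Model.save() never checks which backend is active; the derived
        # class's override is selected automatically.
        return self.backend.save(self, filename)
```

Calling Model(KerasBackend()).save('cifar.best.valid.w') routes through the Keras path, and swapping in PyTorchBackend() changes the behavior without touching Model itself, which is the point of the base/derived design the comment describes.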


