
Accessing layers and their weights/parameters inside Sequential

See original GitHub issue

Hello,

First, thanks a lot for your work on this framework, it’s a very powerful tool!

I had a concern about your customized nn.Sequential. It does not allow access to the layers inside it:

m = Sequential(
  (0): BatchNorm(200)
  (1): GCNConv(200, 10)
  (2): ELU(alpha=1.0, inplace=True)
  (3): <function global_mean_pool at 0x0000026F7CA36AF0>
  (4): BatchNorm(10)
  (5): Dropout(p=0.2, inplace=False)
  (6): Linear(in_features=10, out_features=2, bias=True)
)
m[1]     
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'Sequential_dbe74c' object is not subscriptable

It would be useful to be able to access them, for example to check the weights and parameters of the layers.
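(For checking weights today, the generic torch.nn.Module introspection methods do still work on such a model; a rough sketch, assuming the PyG Sequential registers its layers as ordinary submodules, with m being the model printed above:)

# named_children() yields the registered submodules in order; plain functions
# such as global_mean_pool are not nn.Modules, so they will not appear here
for name, layer in m.named_children():
    print(name, layer)

# named_parameters() lists every learnable tensor of the whole model
for name, param in m.named_parameters():
    print(name, tuple(param.shape))

(This only gives name-based iteration, though, not the direct positional access that m[1] would provide.)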

As a point of comparison, the original nn.Sequential allows this kind of access, and lets us name our layers like so:

from collections import OrderedDict
import torch.nn as nn

# Using Sequential with OrderedDict, which gives each layer a name
model = nn.Sequential(OrderedDict([
          ('conv1', nn.Conv2d(1, 20, 5)),
          ('relu1', nn.ReLU()),
          ('conv2', nn.Conv2d(20, 64, 5)),
          ('relu2', nn.ReLU())
        ]))

(from https://pytorch.org/docs/stable/generated/torch.nn.Sequential.html)
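For reference, with named submodules the weights of each layer can then be reached directly; a minimal sketch using the model defined just above:

# Named submodules become attributes, and nn.Sequential also supports integer indexing
print(model.conv1.weight.shape)    # parameters of the first convolution
print(model[0] is model.conv1)     # True: index 0 and the name 'conv1' are the same module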

Would it be possible to add a similar feature, to get direct access to our layers?

Thanks for your attention,

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 9 (5 by maintainers)

Top GitHub Comments

1 reaction
rusty1s commented, Jul 12, 2022

Every layer is different: some layers may utilize multiple linear layers, some may use MLPs, etc. It is best to look up the code of a GNN layer in order to see how to access its parameters.
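For example, a quick way to see what a given layer holds, without digging into its internals, is to enumerate its parameters; a small sketch with GCNConv (named_parameters() is available on any torch.nn.Module):

from torch_geometric.nn import GCNConv

conv = GCNConv(200, 10)

# Enumerates every learnable tensor the layer registers, regardless of whether
# it uses a single Linear, several of them, or an MLP internally
for name, param in conv.named_parameters():
    print(name, tuple(param.shape))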

1 reaction
rusty1s commented, Feb 15, 2022

Added support for OrderedDict as well, see https://github.com/pyg-team/pytorch_geometric/pull/4075.
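A rough sketch of what the named variant could look like after that change (the exact call signature should be checked against the linked PR; the layer names and the 'x, edge_index -> x' signature string follow the usual PyG Sequential convention and are only illustrative):

from collections import OrderedDict
import torch
from torch_geometric.nn import GCNConv, Sequential

model = Sequential('x, edge_index', OrderedDict([
    ('conv1', (GCNConv(200, 10), 'x, edge_index -> x')),
    ('act1', torch.nn.ReLU(inplace=True)),
]))

# With named entries, each layer should be reachable as an attribute
print(model.conv1)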
