Couple of suggestions for `nn.Sequential`.
Hi @patrick-kidger,

I was wondering if the following modifications are worth adding to `nn.Sequential`:

- Supporting `OrderedDict`s. They have an ordering, but also make it easier to access specific indices by name (sketched below). Possible bug: at the moment it is possible to initialize a `Sequential` with an `OrderedDict`, but only to initialize it (and presumably it shouldn't be allowed?).
- Supporting `delete`, `insert`, `update`, and `append` to edit the state of `layers`. Given that `__getitem__` already exists, these would also be quality-of-life additions.
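To make the first suggestion concrete, here is a hypothetical sketch of the proposed usage. None of this exists in Equinox today; the names and layer sizes are purely illustrative:

```python
# Hypothetical sketch -- proposed API only, NOT current Equinox behavior.
from collections import OrderedDict

import equinox as eqx
import jax.random as jr

k1, k2, k3 = jr.split(jr.PRNGKey(0), 3)
seq = eqx.nn.Sequential(OrderedDict([
    ("encode", eqx.nn.Linear(2, 8, key=k1)),
    ("decode", eqx.nn.Linear(8, 2, key=k2)),
]))

seq["decode"]  # proposed: name-based access alongside the existing seq[1]
seq = seq.append(eqx.nn.Linear(2, 2, key=k3))  # proposed editing methods would
                                               # return a new (immutable) module
```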
Thanks.
Top GitHub Comments
Yeah, that’d make sense.
For performing model surgery - or indeed any kind of PyTree manipulation - I'd recommend `eqx.tree_at`. For example, a call like the one sketched below replaces the last layer of the MLP.
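(The exact snippet from the original comment isn't preserved here; the following is a minimal reconstruction of such a `tree_at` call against an `eqx.nn.MLP`, with layer sizes chosen purely for illustration.)

```python
import equinox as eqx
import jax.random as jr

mlp_key, layer_key = jr.split(jr.PRNGKey(0))
mlp = eqx.nn.MLP(in_size=2, out_size=3, width_size=8, depth=2, key=mlp_key)

# Build a replacement final layer and swap it in. tree_at is out-of-place:
# `mlp` itself is untouched and `new_mlp` is a new pytree.
new_final = eqx.nn.Linear(8, 4, key=layer_key)
new_mlp = eqx.tree_at(lambda m: m.layers[-1], mlp, new_final)
```

Does this generally solve your use cases here?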
I’ve been thinking of adding an “advanced tricks” section to the docs describing this kind of thing.
This is pretty common with transfer learning – i.e. doing "model surgery" on a pretrained model to adapt it to a new task. This is where the mutability of PyTorch Modules is really convenient. It would be pretty cool if Equinox Modules could be unfrozen and refrozen to allow PyTorch-like manipulation.
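For reference, a sketch of how this transfer-learning surgery can be done in current (immutable) Equinox, combining `eqx.tree_at` with the parameter-freezing pattern built on `eqx.partition`/`eqx.combine`. The model, sizes, and loss here are illustrative stand-ins, not anything from the original thread:

```python
import equinox as eqx
import jax
import jax.numpy as jnp
import jax.random as jr
import jax.tree_util as jtu

key, head_key = jr.split(jr.PRNGKey(0))
# Stand-in for a pretrained model.
model = eqx.nn.MLP(in_size=4, out_size=10, width_size=32, depth=2, key=key)

# "Model surgery": swap in a new head for a 3-class task, out-of-place.
new_head = eqx.nn.Linear(32, 3, key=head_key)
model = eqx.tree_at(lambda m: m.layers[-1], model, new_head)

# Freeze everything except the new head by marking only its parameters
# as differentiable in a boolean filter spec with the same structure.
filter_spec = jtu.tree_map(lambda _: False, model)
filter_spec = eqx.tree_at(
    lambda m: (m.layers[-1].weight, m.layers[-1].bias),
    filter_spec,
    replace=(True, True),
)

@eqx.filter_grad
def loss_fn(diff_model, static_model, x, y):
    m = eqx.combine(diff_model, static_model)
    pred = jax.vmap(m)(x)
    return jnp.mean((pred - y) ** 2)

x, y = jnp.ones((8, 4)), jnp.ones((8, 3))
diff_model, static_model = eqx.partition(model, filter_spec)
grads = loss_fn(diff_model, static_model, x, y)  # gradients only for the head
```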