einops does not support torch.jit.script?
Describe the bug
Thanks for this great work. Recent release notes mention that einops supports torch.jit.script for PyTorch layers. @yiheng-wang-nv and I have been looking into this to support TorchScript for a number of models. However, we are not able to use this functionality even for simple operations such as Rearrange.
Reproduction steps
The following snippet reproduces the issue concisely:
```python
import torch
import torch.nn as nn
from einops.layers.torch import Rearrange

class SimpleRearrange(nn.Module):
    def __init__(self):
        super().__init__()
        self.layer = nn.Sequential(Rearrange('b c h w -> b h w c'))

    def forward(self, x):
        result = self.layer(x)
        return result

net = SimpleRearrange()
net.eval()
with torch.no_grad():
    torch.jit.script(net)
```
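For comparison (this check is not part of the original report), an equivalent module written with a plain `permute` should script cleanly, which suggests the failure is specific to the einops layer rather than to scripting `nn.Module`s in general:

```python
import torch
import torch.nn as nn

class PermuteRearrange(nn.Module):
    """Scriptable stand-in for Rearrange('b c h w -> b h w c')."""
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # 'b c h w -> b h w c' is a plain axis permutation
        return x.permute(0, 2, 3, 1)

net = PermuteRearrange()
net.eval()
scripted = torch.jit.script(net)  # compiles without error
```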
Expected behavior
`torch.jit.script(net)` should compile the module without raising an error; instead, scripting fails.
Your platform
- einops version: 0.3.2
- Python version: 3.8.12
- PyTorch version: 1.6.0
- CUDA version: 10.2
Based on this, I believe einops does not support torch.jit.script, unless we are missing something. We would appreciate your input here.
Issue Analytics
- Created: 2 years ago
- Comments: 5 (3 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@arogozhnikov Thanks for the kind tips. My mind was not in a proper state yesterday. The bug is indeed fixed in another issue. It is now so embarrassing to recall my moment yesterday.
Also regarding this deleted comment:
If that is true (again, strange, as torch should do that for you), you should be able to use hot-patching:
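The snippet that accompanied this suggestion is not shown above. As a purely hypothetical sketch of the idea (the names and structure here are illustrative, not the maintainer's actual code), "hot-patching" would mean swapping the non-scriptable submodule for a scriptable equivalent before calling `torch.jit.script`:

```python
import torch
import torch.nn as nn

class Permute(nn.Module):
    # Scriptable replacement for Rearrange('b c h w -> b h w c')
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x.permute(0, 2, 3, 1)

# Hypothetical model whose first submodule stands in for the
# non-scriptable einops layer
model = nn.Sequential(nn.Identity())

# Hot-patch: overwrite the offending submodule in place, then script
model[0] = Permute()
scripted = torch.jit.script(model)
```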