support LSTM
See original GitHub issue

I use PyTorch to build my LSTM network; it looks like this:
```python
import torch
import torch.nn as nn


class TestNet(nn.Module):
    def __init__(self):
        super(TestNet, self).__init__()
        self.lstm = nn.LSTM(256, 128, 2,
                            batch_first=True, bidirectional=True)

    def forward(self, x):
        self.lstm.flatten_parameters()
        res = self.lstm(x)
        return res


input_tensor = torch.randn(30, 61, 256)
```
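Before converting, it helps to pin down the tensor layout. A minimal sketch (the helper below is hypothetical, not part of the issue) of the shape arithmetic for this `batch_first` bidirectional LSTM, where the output feature size is `num_directions * hidden_size`:

```python
# Hypothetical helper: compute the output shape of a batch_first
# bidirectional LSTM, to make the tensor layout explicit before
# mapping it onto a TensorRT RNN layer.
def lstm_output_shape(input_shape, hidden_size, bidirectional=True):
    batch, seq_len, _ = input_shape              # batch_first=True layout
    num_directions = 2 if bidirectional else 1   # forward + backward
    return (batch, seq_len, num_directions * hidden_size)

print(lstm_output_shape((30, 61, 256), 128))  # (30, 61, 256)
```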
and the LSTM converter:

```python
op = trt.RNNOperation.LSTM
ctx.network.add_rnn_v2(input_tensor._trt, layer_count, hidden_size, max_seq_length, op)
```

Should `max_seq_length` be `input_tensor.shape[0]`?
I got this error:
[TensorRT] ERROR: Parameter check failed at: ../builder/Network.cpp::addRNNCommon::397, condition: input.getDimensions().d[di.seqLen()] == maxSeqLen
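The failed condition `input.getDimensions().d[di.seqLen()] == maxSeqLen` compares `max_seq_length` against the sequence axis of the input. A small sketch of that check, assuming the network input keeps PyTorch's `batch_first` layout of `(batch, seq, input)` (the function name is illustrative, not a TensorRT API):

```python
# Sketch of the dimension check behind the error: TensorRT's
# addRNNCommon asserts input.d[seqLen] == maxSeqLen, so max_seq_length
# must come from the sequence axis, not the batch axis.
def max_seq_length_for(input_shape, batch_first=True):
    # dim 1 holds the sequence length when batch_first, dim 0 otherwise
    return input_shape[1] if batch_first else input_shape[0]

shape = (30, 61, 256)  # the issue's input_tensor.shape
print(max_seq_length_for(shape))  # 61, i.e. input_tensor.shape[1]
```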
Also, how can I set the reverse-direction weights? Any help would be appreciated.
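On the reverse weights: a sketch, assuming TensorRT's `IRNNv2Layer` convention that a bidirectional layer exposes `2 * layer_count` pseudo layers, with even indices holding the forward direction and odd indices the backward (reverse) direction; `set_weights_for_gate` / `set_bias_for_gate` take this pseudo-layer index:

```python
# Map (stacked-LSTM layer, direction) to the IRNNv2Layer pseudo-layer
# index used by set_weights_for_gate / set_bias_for_gate (assumed
# even = forward, odd = backward, per the TensorRT RNNv2 convention).
def pseudo_layer_index(layer, backward):
    return 2 * layer + (1 if backward else 0)

print(pseudo_layer_index(0, backward=True))  # 1
print(pseudo_layer_index(1, backward=True))  # 3
```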
Issue Analytics
- State:
- Created 4 years ago
- Comments: 24
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
If this is your first time converting an LSTM, I found it helpful to use the raw TensorRT API instead to make sure you understand what’s happening underneath the surface. The torch2trt repo is good for basic stuff, but for anything non-trivial, you’ll have to use the TensorRT API directly. I would start there.
https://docs.nvidia.com/deeplearning/sdk/tensorrt-api/python_api/index.html
It will take more time initially, but it will save you tons of time in the future.
A small update to forward fixed it for me:
From this:
To this: