
single timestep prediction using LSTM

See original GitHub issue

I am not sure how prediction works with an LSTM. I can train a model correctly using this code:

from keras.models import Sequential
from keras.layers import Masking, LSTM, Dropout, Dense, TimeDistributed

model = Sequential()
model.add(Masking(mask_value=-1.0, input_shape=(None, 5)))  # mask padded timesteps marked with -1.0
model.add(LSTM(units=nr_units, return_sequences=True, activation='relu'))
model.add(Dropout(0.2))
model.add(TimeDistributed(Dense(6, activation='sigmoid')))
model.compile(loss="categorical_crossentropy", optimizer='adam', metrics=['accuracy'])
model.fit(x_train, y_train, epochs=nr_epochs, batch_size=nr_batch, validation_split=0.2)

My question is: can I predict the output of every timestep in a sequence by providing only one timestep as input? I can’t do otherwise, because only the first element of the sequence is known, while elements 2, …, n depend on the output of the prediction.

I am making the prediction inside a loop like this: pred = model.predict(x_test), where x_test is a single ‘frame’ of a sequence. Does the model retain its internal state between calls like this, or do I need to provide the full input sequence?

Thanks
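
For reference, here is a minimal sketch of the closed-loop prediction described above. The n_steps value, the x_test slicing, and the mapping from a prediction back to the next input frame are illustrative assumptions, not taken from the issue. Note that with the default stateful=False, Keras does not carry LSTM state between predict() calls, so each call only sees the frame it is given; the comments below describe a stateful setup that does carry state.

import numpy as np

# Closed-loop prediction: feed one frame at a time and build the next
# input frame from the previous prediction.
# NOTE: with the default stateful=False, the LSTM state is reset on every
# predict() call, so the model only "sees" the frame passed in.
n_steps = 10                      # number of future timesteps to generate (assumed)
frame = x_test[:1, :1, :]         # shape (1, 1, 5): one sequence, one timestep, 5 features (assumed layout)

predictions = []
for _ in range(n_steps):
    pred = model.predict(frame)   # shape (1, 1, 6) with return_sequences=True
    predictions.append(pred[0, -1])
    # Problem-specific step: here we assume the first 5 outputs map back
    # onto the 5 input features of the next frame.
    frame = pred[:, :, :5]

predictions = np.array(predictions)   # shape (n_steps, 6)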

Issue Analytics

  • State: closed
  • Created 6 years ago
  • Comments: 8

Top GitHub Comments

1 reaction
mturnshek commented, Dec 13, 2017

To train in batches but predict sequentially, you can make two models that are exactly the same but have different batch_input_shape values.

After training the first model in batch, you transfer the weights to the second model to predict.

Make the training model with the first dimension in batch_input_shape as your training batch size. Then, make a new prediction model which is the same, except the first dimension in batch_input_shape is 1.

In both models, set stateful=True, and return_sequences=True.

You can then call model.reset_states() to reset the recurrent state at the appropriate points during training or prediction.

Here’s some code that might help. https://github.com/mturnshek/deep-learning/blob/master/realtime_rnn_predictions/batch_train_realtime_predict.py
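
For concreteness, here is a minimal sketch of that recipe. The feature count, number of classes, unit count, batch size, and timestep length are illustrative assumptions, and the way the next input frame is built from a prediction is problem-specific; the linked script is the reference version.

from keras.models import Sequential
from keras.layers import LSTM, Dense, TimeDistributed

n_features, n_classes, nr_units = 5, 6, 64     # illustrative values
train_batch_size, timesteps = 32, 20           # illustrative values

def build_model(batch_size, n_timesteps):
    # Same architecture for both models; only batch_input_shape differs.
    m = Sequential()
    m.add(LSTM(nr_units, return_sequences=True, stateful=True,
               batch_input_shape=(batch_size, n_timesteps, n_features)))
    m.add(TimeDistributed(Dense(n_classes, activation='sigmoid')))
    m.compile(loss='categorical_crossentropy', optimizer='adam')
    return m

# Train in batches with a stateful model (sample count must be a multiple
# of the batch size; shuffle=False keeps sequences in order).
train_model = build_model(train_batch_size, timesteps)
train_model.fit(x_train, y_train, epochs=10,
                batch_size=train_batch_size, shuffle=False)

# Copy the weights into an identical model with batch size 1 that is fed
# one timestep at a time.
predict_model = build_model(1, 1)
predict_model.set_weights(train_model.get_weights())

predict_model.reset_states()                   # start of a new sequence
frame = x_test[:1, :1, :]                      # shape (1, 1, n_features)
for _ in range(20):
    pred = predict_model.predict(frame)        # hidden state carries over between calls
    frame = pred[:, :, :n_features]            # assumed mapping from outputs to next inputs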

0 reactions
lmxhappy commented, Mar 21, 2018

Suppose the past 7 days’ history data influences the current data, and a model is trained with timestep = 3. When predicting, if a single timestep of x1, x2, x3 is passed to the model, the result is not satisfying. @mturnshek hi, do you mean that the state produced by one predict call is used by the next call? If yes, then to get the targeted prediction, more sequences and more timesteps have to be passed to the model, producing more predictions, even though only the last prediction is the one we want.
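
On the question of whether the state produced by one predict call is used by the next: with stateful=True it is, until reset_states() is called. A small self-contained check of that behavior (the layer size and feature count are illustrative, not from the issue):

import numpy as np
from keras.models import Sequential
from keras.layers import LSTM

# A stateful LSTM with batch size 1 gives the same final output whether a
# sequence is fed in one call or one timestep at a time, because the hidden
# state persists between predict() calls.
n_features = 5                                   # illustrative
check = Sequential()
check.add(LSTM(8, return_sequences=True, stateful=True,
               batch_input_shape=(1, None, n_features)))

x = np.random.rand(1, 3, n_features)             # one sequence of 3 timesteps

check.reset_states()
full = check.predict(x)                          # all 3 timesteps at once

check.reset_states()
step = None
for t in range(3):
    step = check.predict(x[:, t:t+1, :])         # one timestep per call

assert np.allclose(full[:, -1], step[:, -1], atol=1e-5)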

Read more comments on GitHub >

Top Results From Across the Web

Why LSTM still works with only 1 time step - Cross Validated
The only valid way to use LSTM is when you have multiple timepoints per each sample or use stateful=True where the LSTM passes...
Read more >
How to Use Timesteps in LSTM Networks for Time Series ...
With one time step, you have no temporal correlation into the network. Hence LSTM cannot learn it. However, MLP and LSTM differ because...
Read more >
Single and Multi-Step Temperature Time Series Forecasting ...
Single and Multi-Step Temperature Time Series Forecasting for Vilnius Using LSTM Deep Learning Model. Weather time series forecasting using deep ...
Read more >
Time Series Prediction with LSTM
A simple recurrent neural network works well only for a short-term memory. We will see that it suffers from a fundamental problem (vanishing...
Read more >
LSTM with different timestep when predicting - Stack Overflow
If you want a variable number of timesteps, simply set that size in the input shape to None ; that is, inputs =...
Read more >
