
LSTM fully connected architecture

See original GitHub issue

Hi everyone! First, I would like to express my gratitude to everyone who works daily to improve Keras and its documentation. Thank you, guys 😉

After reading many posts trying to sort out all my questions… I still have some doubts about LSTM recurrent networks, so I hope you can help me.

Input shape: (nb_samples, timesteps, input_dim). I have 11200 samples; each sample contains 3000 timesteps, and each timestep contains 22 values. Therefore, my input shape is (11200, 3000, 22).
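As a sanity check, here is a tiny sketch (assuming NumPy; the batch size of 16 is a placeholder, not the full 11200-sample dataset) of what a batch with this layout looks like:

```python
import numpy as np

timesteps, input_dim = 3000, 22

# Placeholder batch of 16 samples; the full dataset described above
# would have shape (11200, 3000, 22).
batch = np.zeros((16, timesteps, input_dim), dtype='float32')

print(batch.shape)     # → (16, 3000, 22)
print(batch[0].shape)  # → (3000, 22): one sample is a (timesteps, input_dim) sequence
```

Each sample is thus a 2-D sequence of 3000 feature vectors of length 22, which is exactly the 3-D layout Keras recurrent layers expect.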

Output: every sample must be classified into one class (‘0’ or ‘1’).

Goal: classify every sample into one of the two possible classes (‘0’ or ‘1’) using an LSTM fully-connected network.

Architecture to follow:

[Figure: diagram of the intended LSTM + fully-connected architecture]

In the following posts I found very useful information related to my problem: #2673 and #2496. However, I still have many doubts:

  1. As far as I know, an LSTM layer at the beginning of the model is not fully connected, as @carlthome and @fchollet explained in #2673.
  2. Since the goal is to classify each sample into one class (‘0’ or ‘1’), TimeDistributed(Dense(...)) shouldn’t be used because, as far as I know, that layer produces one output per timestep, whereas I want to classify each sample as a whole as ‘0’ or ‘1’.
  3. In this simple architecture there is only one LSTM layer, so return_sequences doesn’t matter. However, with two LSTM layers, should return_sequences be True or False? I think in my model the first layer should use return_sequences=True, as explained in #2496, but I’m not entirely sure about it.
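To illustrate the return_sequences question, here is a minimal sketch (assuming TensorFlow's Keras API; the layer widths 32 and 16 are arbitrary placeholders, not values from the issue) of stacking two LSTM layers:

```python
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

timesteps, input_dim = 3000, 22

stacked = Sequential([
    Input(shape=(timesteps, input_dim)),
    # return_sequences=True: emit the hidden state at every timestep,
    # shape (batch, timesteps, 32), so the next LSTM receives a sequence.
    LSTM(32, return_sequences=True),
    # Default return_sequences=False: emit only the final hidden state,
    # shape (batch, 16), summarizing the whole sequence.
    LSTM(16),
    # One sigmoid unit -> one probability per sample, not per timestep.
    Dense(1, activation='sigmoid'),
])
```

The rule of thumb: any LSTM that feeds another LSTM needs return_sequences=True; the last LSTM before a plain Dense classifier keeps the default False, so the whole sequence collapses to a single vector per sample.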

Let’s start with the first approach of the model (although I know it is wrong).

from keras.models import Sequential
from keras.layers import LSTM, Dense

timesteps = 3000
input_dim = 22

model = Sequential()
# Note: the original snippet passed input_shape=(timesteps, data_dim),
# but data_dim is undefined; the variable defined above is input_dim.
model.add(LSTM(22, input_shape=(timesteps, input_dim)))
model.add(Dense(1, activation='sigmoid'))
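For reference, a hedged sketch (assuming TensorFlow's Keras API; the optimizer choice and the toy data are placeholders, not part of the original issue) of how this first approach could be compiled and trained for binary classification:

```python
import numpy as np
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

timesteps, input_dim = 3000, 22

model = Sequential([
    Input(shape=(timesteps, input_dim)),
    LSTM(22),                        # final hidden state only: (batch, 22)
    Dense(1, activation='sigmoid'),  # probability of class '1'
])

# One sigmoid output per sample -> binary cross-entropy is the natural loss.
model.compile(optimizer='adam', loss='binary_crossentropy',
              metrics=['accuracy'])

# Toy stand-in for the real (11200, 3000, 22) dataset.
x = np.random.rand(4, timesteps, input_dim).astype('float32')
y = np.random.randint(0, 2, size=(4, 1))
model.fit(x, y, epochs=1, batch_size=2, verbose=0)
```

Because the single LSTM feeds directly into the final Dense classifier, the default return_sequences=False is correct here: the model emits one probability per sample, matching the per-sample labels.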

Can anyone help me build my model and resolve these questions? Thank you very much in advance!

Issue Analytics

  • State: closed
  • Created: 7 years ago
  • Comments: 20 (8 by maintainers)

Top GitHub Comments

4 reactions · carlthome commented, Nov 18, 2016

No, that doesn’t help.

1 reaction · aisopos commented, May 17, 2017

OK, will go through these, thanks again 😃

Read more comments on GitHub >

