LSTM fully connected architecture
Hi everyone! First, I would like to express my gratitude to all the people who work daily to improve the Keras software and its documentation. Thank you guys 😉
After reading many posts trying to sort out all my questions, I still have some doubts related to LSTM recurrent networks, so I hope you can help me.
Input shape: (nb_samples, timesteps, input_dim). I have 11200 samples; each sample contains 3000 timesteps, and each timestep holds 22 values. Therefore, my input shape is (11200, 3000, 22).
Output: every sample must be classified into one class (‘0’ or ‘1’).
Goal: classify every sample into one of the two possible classes (‘0’ or ‘1’) using an LSTM fully-connected network.
Architecture to follow:
I found very useful information related to my problem in the following posts: #2673 and #2496. However, I still have some doubts:
- As far as I know, an LSTM layer at the beginning of the model is not fully connected, as @carlthome and @fchollet explained in #2673.
- As the goal is to classify each sample into one class (‘0’ or ‘1’), `TimeDistributed(Dense(...))` shouldn't be used, because (as far as I know) that layer produces one output per timestep, whereas I want to classify each sample as a whole as ‘0’ or ‘1’.
- In this simple architecture there is only one LSTM layer, so `return_sequences` doesn't matter. However, with two stacked LSTM layers, should `return_sequences` be `True` or `False`? I think in my model the first LSTM should have `return_sequences=True`, as explained in #2496, but I'm not entirely sure about it (see the sketch just after this list).
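To make the `return_sequences` point concrete, here is a minimal sketch (not from the issue itself; the layer width of 32 is an arbitrary assumption) of how two stacked LSTM layers would typically be wired for this kind of whole-sequence classification:

```python
from keras.models import Sequential
from keras.layers import LSTM, Dense

timesteps = 3000
input_dim = 22

model = Sequential()
# First LSTM returns the full sequence (one vector per timestep)
# so the second LSTM has a sequence to consume.
model.add(LSTM(32, return_sequences=True, input_shape=(timesteps, input_dim)))
# Second LSTM keeps the default return_sequences=False and emits
# only its final state, which summarizes the whole sequence.
model.add(LSTM(32))
# Single sigmoid unit for the binary ‘0’/‘1’ decision.
model.add(Dense(1, activation='sigmoid'))
```

The design intuition: every LSTM layer that feeds another LSTM needs `return_sequences=True`; the last LSTM before the classifier does not, since only one vector per sample should reach the final `Dense` layer.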
Let's start with a first approach to the model (although I know it is wrong).
```python
from keras.models import Sequential
from keras.layers import LSTM, Dense

timesteps = 3000
input_dim = 22

model = Sequential()
# Only the final LSTM state reaches the classifier (return_sequences defaults to False).
model.add(LSTM(22, input_shape=(timesteps, input_dim)))
model.add(Dense(1, activation='sigmoid'))
```
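For completeness, here is a hedged sketch of how this model might be compiled and trained for the binary task; the optimizer, batch size, epoch count, and validation split are placeholder assumptions, and the random arrays only stand in for the real (11200, 3000, 22) inputs and (11200,) labels (assuming a Keras 2-style `fit` signature):

```python
import numpy as np

# Small placeholder batch with the right timestep/feature shape;
# substitute the real (11200, 3000, 22) inputs and (11200,) labels.
X = np.random.rand(128, 3000, 22).astype('float32')
y = np.random.randint(0, 2, size=(128,))

# binary_crossentropy matches the single sigmoid output for 0/1 classification.
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X, y, batch_size=32, epochs=2, validation_split=0.2)
```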
Can anyone help me build my model and resolve these questions? Thank you very much in advance!
Follow-up comments from the thread:

> No, that doesn't help.

> OK, will go through these, thanks again 😃