
Problem with TimeDistributed() and Learning Phase

See original GitHub issue

(EDIT: The following is only a minimal example that reproduces the error. My actual goal is to wrap a more complicated model than Dropout() here.)

When executing the following script, a MissingInputError occurs:

from keras.models import Model
from keras.layers import Input, TimeDistributed, Dropout

# Wrap a layer that uses the learning phase (Dropout) in TimeDistributed
in1 = Input(batch_shape=(10, 8, 6), name="in1")
out1 = TimeDistributed(Dropout(0.5))(in1)

model = Model(input=in1, output=out1)
model.compile("adam", "mse")
model._make_predict_function()  # raises MissingInputError here

This is the simplest model that produces the error (in my original architecture, I tried to distribute a more complex model). The same issue occurs when the Dropout() layer is replaced with, e.g., GaussianNoise() or GRU(dropout_W=0.5), but not with, e.g., Dense(). I think the error boils down to the combination of TimeDistributed() and any layer (or model) that uses the learning phase.

Maybe there is a conceptual problem with TimeDistributed() and the learning phase input?
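
For what it's worth, a quick way to check whether the wrapper drops the learning-phase flag is to compare the internal _uses_learning_phase attribute of a directly applied Dropout output with that of a wrapped one. This is only an illustrative sketch against the Keras 1.x API used above; the attribute is internal and may behave differently across versions:

from keras.layers import Input, TimeDistributed, Dropout

x = Input(batch_shape=(10, 8, 6))
direct = Dropout(0.5)(x)                     # plain Dropout output
wrapped = TimeDistributed(Dropout(0.5))(x)   # same layer behind the wrapper

# If the wrapper propagated the flag correctly, both tensors would carry it.
print(getattr(direct, '_uses_learning_phase', False))   # expected: True
print(getattr(wrapped, '_uses_learning_phase', False))  # reportedly: False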

These issues seem to be somewhat related: #3834, #2609, #3686, #2391

The full stack trace is this:

... 
  File "/homes/sjebbara/git/keras-original/keras/engine/training.py", line 752, in _make_predict_function
    **kwargs)
  File "/homes/sjebbara/git/keras-original/keras/backend/theano_backend.py", line 787, in function
    return Function(inputs, outputs, updates=updates, **kwargs)
  File "/homes/sjebbara/git/keras-original/keras/backend/theano_backend.py", line 773, in __init__
    **kwargs)
  File "/homes/sjebbara/.local/lib/python2.7/site-packages/Theano-0.9.0.dev3-py2.7.egg/theano/compile/function.py", line 326, in function
    output_keys=output_keys)
  File "/homes/sjebbara/.local/lib/python2.7/site-packages/Theano-0.9.0.dev3-py2.7.egg/theano/compile/pfunc.py", line 486, in pfunc
    output_keys=output_keys)
  File "/homes/sjebbara/.local/lib/python2.7/site-packages/Theano-0.9.0.dev3-py2.7.egg/theano/compile/function_module.py", line 1776, in orig_function
    output_keys=output_keys).create(
  File "/homes/sjebbara/.local/lib/python2.7/site-packages/Theano-0.9.0.dev3-py2.7.egg/theano/compile/function_module.py", line 1430, in __init__
    accept_inplace)
  File "/homes/sjebbara/.local/lib/python2.7/site-packages/Theano-0.9.0.dev3-py2.7.egg/theano/compile/function_module.py", line 176, in std_fgraph
    update_mapping=update_mapping)
  File "/homes/sjebbara/.local/lib/python2.7/site-packages/Theano-0.9.0.dev3-py2.7.egg/theano/gof/fg.py", line 180, in __init__
    self.__import_r__(output, reason="init")
  File "/homes/sjebbara/.local/lib/python2.7/site-packages/Theano-0.9.0.dev3-py2.7.egg/theano/gof/fg.py", line 351, in __import_r__
    self.__import__(variable.owner, reason=reason)
  File "/homes/sjebbara/.local/lib/python2.7/site-packages/Theano-0.9.0.dev3-py2.7.egg/theano/gof/fg.py", line 396, in __import__
    variable=r)
theano.gof.fg.MissingInputError: An input of the graph, used to compute Shape(<TensorType(float32, matrix)>), was not provided and not given a value. Use the Theano flag exception_verbosity='high' for more information on this error.

Backtrace when the variable is created:
  File "/homes/sjebbara/PyCharmProjects/NeuralSentiment/src/Test2.py", line 5, in <module>
    out1 = TimeDistributed(Dropout(0.5))(in1)
  File "/homes/sjebbara/git/keras-original/keras/engine/topology.py", line 514, in __call__
    self.add_inbound_node(inbound_layers, node_indices, tensor_indices)
  File "/homes/sjebbara/git/keras-original/keras/engine/topology.py", line 572, in add_inbound_node
    Node.create_node(self, inbound_layers, node_indices, tensor_indices)
  File "/homes/sjebbara/git/keras-original/keras/engine/topology.py", line 149, in create_node
    output_tensors = to_list(outbound_layer.call(input_tensors[0], mask=input_masks[0]))
  File "/homes/sjebbara/git/keras-original/keras/layers/wrappers.py", line 131, in call
    initial_states=[], input_length=input_length, unroll=unroll)
  File "/homes/sjebbara/git/keras-original/keras/backend/theano_backend.py", line 947, in rnn
    go_backwards=go_backwards)

Please make sure that the boxes below are checked before you submit your issue. Thank you!

  • [x] Check that you are up-to-date with the master branch of Keras. You can update with: pip install git+git://github.com/fchollet/keras.git --upgrade --no-deps
  • [x] If running on Theano, check that you are up-to-date with the master branch of Theano. You can update with: pip install git+git://github.com/Theano/Theano.git --upgrade --no-deps
  • [x] Provide a link to a GitHub Gist of a Python script that can reproduce your issue (or just copy the script here if it is short).

Issue Analytics

  • State: closed
  • Created 7 years ago
  • Comments: 16 (5 by maintainers)

Top GitHub Comments

3 reactions
brayan07 commented, Oct 4, 2017

I was having a similar issue with Tensorflow. Whenever I used the TimeDistributed wrapper on a model containing layers that used the learning phase, the resulting tensor would have the property _uses_learning_phase = False. This meant that when I created a final model containing that tensor, the model’s _uses_learning_phase would incorrectly be set to False.

In the case below, my intermediate_model had a Dropout layer; before passing it through the wrapper, intermediate_model.uses_learning_phase = True.

input_scan = Input(shape=(ANGLES, FINAL_WIDTH, FINAL_HEIGHT // 2, CHANNELS))
# Time-distributed model
sequenced_model = TimeDistributed(intermediate_model)(input_scan)

sequenced_model._uses_learning_phase = True  # Manually setting the tensor's property fixed the issue.

out = GlobalAveragePooling1D()(sequenced_model)
# Complete model
model = Model(input_scan, out)
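
For reference, the same workaround could be sketched against the minimal Theano example from the issue above. It is untested here and forces an internal attribute, so treat it as a band-aid rather than a proper fix:

from keras.models import Model
from keras.layers import Input, TimeDistributed, Dropout

in1 = Input(batch_shape=(10, 8, 6), name="in1")
out1 = TimeDistributed(Dropout(0.5))(in1)
out1._uses_learning_phase = True  # restore the flag the wrapper dropped

model = Model(input=in1, output=out1)
model.compile("adam", "mse")
model._make_predict_function()  # should now receive the learning phase as an input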

0 reactions
davideboschetto commented, Sep 17, 2018

sequenced_model._uses_learning_phase = True  # Manually setting the tensor's property fixed the issue.

This was the key to solving this for me, too. The model wrapped in TimeDistributed was indeed not training without it.
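
To double-check that the wrapped dropout is actually active once the flag is set, one option is to evaluate the model under both learning phases and compare the outputs. A minimal sketch, assuming the single-input model and the (10, 8, 6) batch shape from the example above:

import numpy as np
from keras import backend as K

# Backend function that takes the learning phase as an explicit extra input.
f = K.function([model.input, K.learning_phase()], [model.output])

x = np.random.rand(10, 8, 6).astype("float32")
out_test = f([x, 0])[0]   # learning phase off: dropout is a no-op
out_train = f([x, 1])[0]  # learning phase on: dropout should zero some entries

print(np.allclose(out_test, out_train))  # expected: False if dropout is active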
