"Cannot compute the outputs" for dynamic ops (TensorArrayStack)

See original GitHub issue

TensorFlow.js version

1.2.11

Browser version

Google Chrome Version 77.0.3865.120 (Official Build) (64-bit)

Describe the problem or feature request

I have a graph model (built using tf.contrib.seq2seq) which uses control flow ops for dynamic RNN decoding. I exported it with TF 1.15 (using simple_save) and converted with TF.js 1.2.11 (command: tensorflowjs_converter --input_format=tf_saved_model --output_format=tfjs_graph_model --saved_model_tags=serve).

When trying to run the model with executeAsync, I get the following error:

Uncaught (in promise) Error: Cannot compute the outputs [decoder/decode_sample/decoder_1/transpose_1] from the provided inputs [inputs,softmax_temperature]. Consider providing the following inputs: []. Alternatively, to avoid the dynamic ops, use model.execute() and specify the inputs [decoder/decode_sample/decoder_1/TensorArrayStack_1/TensorArrayGatherV3]
    at t.<anonymous> (graph_executor.ts:318)
    at callbacks.ts:253
    at Object.next (callbacks.ts:253)
    at o (callbacks.ts:253)

Does this mean TF.js currently cannot handle these ops? Is it the TensorArrayStack/TensorArrayGatherV3 operation or something else?
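
For readers less familiar with the API in question, running a converted graph model with executeAsync looks roughly like the sketch below. This is a minimal illustration, not the code from the issue: the input names 'inputs' and 'softmax_temperature' are taken from the error message, while the model path, input shapes, and dtypes are assumptions.

    import * as tf from '@tensorflow/tfjs';

    async function run() {
      // Load the converted graph model (the path is a placeholder).
      const model = await tf.loadGraphModel('web_model/model.json');

      // Input names come from the error message; shapes and dtypes are guesses.
      const feeds = {
        'inputs': tf.tensor2d([[5, 12, 7]], [1, 3], 'int32'),
        'softmax_temperature': tf.scalar(1.0)
      };

      // executeAsync() is required because the graph contains dynamic
      // control-flow ops (the seq2seq decoding loop and TensorArray ops);
      // execute() only supports graphs without dynamic ops.
      const out = await model.executeAsync(feeds);

      // out is a tf.Tensor, or an array of Tensors for multi-output models.
      console.log(await (Array.isArray(out) ? out[0] : out).data());
    }

    run();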

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 13

Top GitHub Comments

1 reaction
pyu10055 commented, Nov 2, 2019

Yes, that is the exact result I got: [37, 37, 162, 73, 162, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55, 55] (length: 1, proto: Array(0))

0 reactions
pyu10055 commented, Nov 4, 2019

@cifkao In that case you can rely on our converter to do that, and the fix I made will resolve your issue.
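
The fix referred to here is on the tensorflowjs_converter side, so re-converting the model with an updated converter is presumably what resolves the issue. Purely as an illustration of the runtime API involved (not the converter fix itself), a GraphModel exposes its input and output node names and lets executeAsync return a specific node by name; the node name below is copied from the error message, and everything else is assumed.

    // Illustrative only, assuming `model` and `feeds` from the earlier sketch
    // and an async context; this is not the converter fix being discussed.
    console.log(model.inputNodes);   // e.g. ['inputs', 'softmax_temperature']
    console.log(model.outputNodes);  // output node names chosen by the converter

    // executeAsync also accepts explicit output node names, e.g. the
    // TensorArray gather node mentioned in the error message.
    const gathered = await model.executeAsync(
      feeds,
      'decoder/decode_sample/decoder_1/TensorArrayStack_1/TensorArrayGatherV3'
    );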


Top Results From Across the Web

  • Using a Custom tensorflow Model for Object Detection in ...
    Uncaught (in promise) Error: Cannot compute the outputs [Identity ... Alternatively, to avoid the dynamic ops, use model.execute() and ...

  • TensorArray GlobalVar and GlobalTypeVar Confusion
    I am working with the new TensorArray functionality from the TensorFlow frontend, and am getting into an area that I'm a little confused...

  • Tensorflow RetinaNet Object Detector on Deepstream
    I have a trained RetinaNet Object Detector network that I have been using for some time with good success.

  • tensorflow.python.ops.tensor_array_ops source code
    """TensorArray: a dynamically sized array of Tensors.""" # Mixture of pep8 and non-pep8 ... _handle): with ops.name_scope(name, "TensorArrayStack", [self.

  • Learn Hands-On Machine Learning with Scikit-Learn and ...
    You cannot use multiple running of session to simulate time step ... Each cell computes the outputs at one time step taking the...
