
Issues running exported SavedModel


After using the Colab notebook to export a SavedModel, I attempted to run inference on it locally with the following code:

from __future__ import absolute_import, division, print_function, unicode_literals
import tensorflow as tf
import tensorflow_text  # registers the text ops the SavedModel depends on
import numpy as np

# Let the GPU allocate memory on demand instead of grabbing it all up front.
physical_devices = tf.config.experimental.list_physical_devices('GPU')
if physical_devices:
    tf.config.experimental.set_memory_growth(physical_devices[0], True)

# %%
tf.compat.v1.enable_resource_variables()
path = "PATH_TO_SAVEDMODEL"
loaded = tf.saved_model.load(path, tags='serve')
print(list(loaded.signatures.keys()))

# %%
x = tf.constant(["translate English to German: this is a test.",
                 "translate English to German: this is a test."])
print(x)
print(loaded.signatures['serving_default'](x))

and got the following error:

InvalidArgumentError:  Could not parse example input, value: 'translate English to German: this is a test.'
	 [[{{node ParseSingleExample/ParseSingleExample}}]]
	 [[DatasetToSingleElement]] [Op:__inference_pruned_275843]

Function call stack:
pruned

For inference I am using:

  • Ubuntu 18.04
  • CUDA 10.2
  • TensorFlow 2.1

I’m sure that I’m missing something simple, but any advice on how to solve this would be greatly appreciated.

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 5

Top GitHub Comments

1 reaction
adarob commented, Feb 18, 2020

I believe it is expecting a serialized tf.train.Example proto, not a raw string.

You should try:

feature = {"input": tf.train.Feature(bytes_list=tf.train.BytesList(value=[b"test"]))}
input_str = tf.train.Example(features=tf.train.Features(feature=feature)).SerializeToString()

Clearly this is not a great API. We should look into being able to pass strings directly.
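Putting the suggestion above together, here is a minimal sketch of wrapping the raw query in a serialized tf.train.Example before calling the signature. The feature key "input" follows the comment above, but the exact key the exported model expects is an assumption; if it doesn't match, inspect `loaded.signatures['serving_default'].structured_input_signature` to find the real one.

```python
import tensorflow as tf


def make_serialized_example(text: str, key: str = "input") -> bytes:
    """Wrap a raw string in a serialized tf.train.Example proto.

    The feature key defaults to "input" per the comment above; the
    actual key depends on how the model was exported.
    """
    feature = {key: tf.train.Feature(
        bytes_list=tf.train.BytesList(value=[text.encode("utf-8")]))}
    return tf.train.Example(
        features=tf.train.Features(feature=feature)).SerializeToString()


serialized = make_serialized_example(
    "translate English to German: this is a test.")

# Round-trip check: parsing the proto back recovers the original string.
parsed = tf.train.Example.FromString(serialized)
assert parsed.features.feature["input"].bytes_list.value[0] == \
    b"translate English to German: this is a test."

# Then feed serialized protos to the signature instead of raw strings
# (path and signature name as in the original snippet):
# loaded = tf.saved_model.load("PATH_TO_SAVEDMODEL", tags="serve")
# out = loaded.signatures["serving_default"](
#     tf.constant([serialized, serialized]))
```

This matches the `ParseSingleExample` node in the error trace: the signature runs each input string through Example parsing, so a raw sentence fails while a serialized proto parses cleanly.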
