Issues running exported SavedModel
After using the Colab notebook to export a SavedModel, I attempted to run inference on it locally with the following code:
from __future__ import absolute_import, division, print_function, unicode_literals
import tensorflow as tf
import tensorflow_text  # registers the custom text ops the model needs
import numpy as np

# Allow GPU memory to grow instead of pre-allocating all of it.
physical_devices = tf.config.experimental.list_physical_devices('GPU')
if physical_devices:
    tf.config.experimental.set_memory_growth(physical_devices[0], True)

# %%
tf.compat.v1.enable_resource_variables()
path = "PATH_TO_SAVEDMODEL"
loaded = tf.saved_model.load(path, tags='serve')
print(list(loaded.signatures.keys()))

# %%
x = tf.constant(["translate English to German: this is a test.",
                 "translate English to German: this is a test."])
print(x)
print(loaded.signatures['serving_default'](x))
and got the following error:
InvalidArgumentError: Could not parse example input, value: 'translate English to German: this is a test.'
[[{{node ParseSingleExample/ParseSingleExample}}]]
[[DatasetToSingleElement]] [Op:__inference_pruned_275843]
Function call stack:
pruned
For inference I am using Ubuntu 18.04, CUDA 10.2, and TensorFlow 2.1.
I’m sure that I’m missing something simple, but any advice on how to solve this would be greatly appreciated.
Check out https://github.com/google-research/text-to-text-transfer-transformer/pull/100
I believe it is expecting a serialized tf.train.Example proto, not a raw string.
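One way to confirm this is to inspect what the signature declares as its inputs; a loaded signature is a concrete function, so its structured_input_signature can be printed directly:

```python
# Print the dtype/shape of each input the serving signature accepts.
infer = loaded.signatures['serving_default']
print(infer.structured_input_signature)
```

A lone string tensor input, together with the ParseSingleExample node in the stack trace above, points to the graph deserializing tf.train.Example protos internally rather than consuming raw text.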
You should try:
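Here is a minimal sketch of that, wrapping each query in a serialized tf.train.Example before calling the signature. The feature key "inputs" is an assumption; confirm it against what your exported signature actually parses:

```python
import tensorflow as tf

def to_example(text):
    # Pack one raw query string into a serialized tf.train.Example.
    # NOTE: the feature key "inputs" is assumed, not confirmed --
    # check it against your model's parsing spec.
    return tf.train.Example(features=tf.train.Features(feature={
        "inputs": tf.train.Feature(
            bytes_list=tf.train.BytesList(value=[text.encode("utf-8")])),
    })).SerializeToString()

x = tf.constant([
    to_example("translate English to German: this is a test."),
    to_example("translate English to German: this is a test."),
])
print(loaded.signatures['serving_default'](x))
```

The call returns a dict of output tensors; any byte-string outputs can be decoded with .numpy() and .decode('utf-8').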
Clearly this is not a great API. We should look into being able to pass strings directly.