
SavedModel export for ML Cloud serving

See original GitHub issue

I have fine-tuned the 3B model in the Colab notebook provided in this repo as notebooks/t5-trivia.ipynb. After fine-tuning as recommended in that notebook, I would like to export my model as a SavedModel to be served on ML Cloud.

To do this I use the following code fragment, placed as the last cell in the notebook (so the model object already exists):

import os

import tensorflow.compat.v1 as tf  # TF 1.x APIs (tf.placeholder, tf.parse_example)
import t5

vocabulary = t5.data.SentencePieceVocabulary(t5.data.DEFAULT_SPM_PATH)
estimator = model.estimator(vocabulary)

your_feature_spec = {
    # "inputs": tf.FixedLenFeature([], dtype=tf.string, default_value=""),
    "inputs": tf.VarLenFeature(dtype=tf.string),
}

def _serving_input_receiver_fn():
    serialized_tf_example = tf.placeholder(dtype=tf.string, shape=None,
                                           name='inputs')
    # The key (e.g. 'inputs') must match the input key used when
    # building the prediction request.
    receiver_tensors = {'inputs': serialized_tf_example}
    features = tf.parse_example(serialized_tf_example, your_feature_spec)
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

estimator.export_savedmodel(os.path.join(MODEL_DIR, "saved_model/"),
                            _serving_input_receiver_fn)
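For reference, an online-prediction request to a model exported with the "inputs" receiver key above would be built roughly like this (a sketch; the exact instance format depends on the serving input receiver, and the question text is made up):

```python
import json

# Hypothetical online-prediction request body for a model whose serving
# input receiver exposes a single "inputs" key, as in the export code above.
request_body = {
    "instances": [
        {"inputs": "trivia question: who invented the telephone?"},
    ]
}
payload = json.dumps(request_body)
print(payload)
```

This JSON body is what Cloud ML's online prediction endpoint would receive for each batch of instances.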

I get the following error when executing this code:

[error screenshot omitted]

Complete stack-trace is here: stack-trace-SavedModel.txt

I assume the error means that this model is set up for TPU inference only and is not supposed to work on ML Cloud (where only CPU instances are available).

Is it feasible to serve this model on ML Cloud, and if so, what steps should I follow to accomplish that?

Thank you!

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 5

Top GitHub Comments

1 reaction
adarob commented, Jan 6, 2020

I have pushed some changes to add an mtf_model.export call. However, I believe I will need to update both the mesh and t5 packages to get everything working for you. I’ll try to test this today and update the packages.
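A minimal sketch of how that export call might be used once the packages are updated (the method name mtf_model.export comes from the comment above; the argument name and the guard here are assumptions, not the confirmed t5 API):

```python
def export_finetuned_model(model, export_dir):
    """Export a fine-tuned MtfModel via its export() method, if present.

    `model` is assumed to be an MtfModel instance; the export() signature
    used here is an assumption based on the comment above.
    """
    if not hasattr(model, "export"):
        return "export() not available in this t5 version"
    model.export(export_dir)
    return "exported to " + export_dir
```

With an updated t5 package, a call like this would replace the manual estimator.export_savedmodel cell from the question.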

0 reactions
adarob commented, Mar 2, 2020

Ah, it’s because we call export_estimator_savedmodel (https://github.com/tensorflow/estimator/blob/08dd1e6ca94248691dfe00aadafc65e8b875e44f/tensorflow_estimator/python/estimator/tpu/tpu_estimator.py#L4219), which creates a new TPUEstimator with export_to_tpu set to True by default. Should be fairly easy to fix. I’ll check it out tomorrow.
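The failure mode described here can be illustrated with a small stand-in (the function and names below are illustrative, not the actual tensorflow_estimator internals): the re-created estimator defaults export_to_tpu to True, so CPU serving only works if the caller overrides it.

```python
# Illustrative stand-in for export_estimator_savedmodel's estimator
# re-creation: export_to_tpu defaults to True, which yields a TPU-only
# serving graph unless explicitly overridden.
def recreate_estimator_kwargs(export_to_tpu=True):
    """Return the kwargs the helper would pass to the new TPUEstimator."""
    return {"export_to_tpu": export_to_tpu}

default_kwargs = recreate_estimator_kwargs()                   # TPU-only export
cpu_kwargs = recreate_estimator_kwargs(export_to_tpu=False)    # fix for CPU serving
```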

Read more comments on GitHub >

Top Results From Across the Web

Exporting a SavedModel for prediction | AI Platform Prediction
To deploy your trained models to AI Platform Prediction and use them to serve predictions, you must first export them in the TensorFlow...

Using the SavedModel format | TensorFlow Core
Creating a SavedModel from Keras. For a quick introduction, this section exports a pre-trained Keras model and serves image classification requests with it...

Generate SavedModel from Tensorflow model to serve it on ...
To deploy your model to Google Cloud ML, you need a SavedModel which can be produced from tf.saved_model api.

Export TensorFlow models in the SavedModel format
Export TensorFlow models in the SavedModel format, Machine Learning Platform for AI: This topic describes how to export TensorFlow models in ...

How to export a BigQuery ML model and deploy it for online ...
This is the TensorFlow SavedModel format. You can deploy it to any TF Serving environment for online prediction. On GCP, the fully managed ...
