SavedModel export for ML Cloud serving
I have fine-tuned the 3B model in the Colab notebook provided in this repo as `notebooks/t5-trivia.ipynb`. After fine-tuning as recommended in that notebook, I would like to export my model as a SavedModel to be served by ML Cloud.
To do this I use the following code fragment, placed as the last cell in the notebook (so `model` is already created):
```python
import os

import t5
import tensorflow as tf  # TF 1.x API

vocabulary = t5.data.SentencePieceVocabulary(t5.data.DEFAULT_SPM_PATH)
estimator = model.estimator(vocabulary)

your_feature_spec = {
    # "inputs": tf.FixedLenFeature([], dtype=tf.string, default_value=""),
    "inputs": tf.VarLenFeature(dtype=tf.string),
}

def _serving_input_receiver_fn():
  # A batch of serialized tf.Example protos.
  serialized_tf_example = tf.placeholder(dtype=tf.string, shape=[None],
                                         name="inputs")
  # The key (e.g. "inputs") should match the input key used when
  # building the request for prediction.
  receiver_tensors = {"inputs": serialized_tf_example}
  features = tf.parse_example(serialized_tf_example, your_feature_spec)
  return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

estimator.export_savedmodel(os.path.join(MODEL_DIR, "saved_model/"),
                            _serving_input_receiver_fn)
```
Executing this code raises an error. The complete stack trace is attached here: stack-trace-SavedModel.txt
I assume the error means that this model is only set up for TPU inference and is not supposed to work on ML Cloud (where only CPU instances are available).
Is it feasible to have this model served by ML Cloud, and if so, what steps should I follow to accomplish that?
Thank you!
Issue Analytics
- Created 4 years ago
- Comments: 5
I have pushed some changes to add an `mtf_model.export` call. However, I believe I will need to update both the `mesh` and `t5` packages to get everything working for you. I’ll try to test this today and update the packages.
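For reference, once the updated packages land, the new export path might look roughly like this. This is a sketch only: the helper name `export_for_serving`, the `checkpoint_step` keyword, and the exact `MtfModel.export` signature are assumptions, not the confirmed API.

```python
import os


def export_for_serving(model, model_dir, step=-1):
    """Export a fine-tuned model as a SavedModel via the new
    mtf_model.export call mentioned above.

    `model` is assumed to be a t5.models.MtfModel from the notebook;
    `checkpoint_step=-1` is assumed to mean "latest checkpoint".
    Returns whatever the export call returns (e.g. the export path).
    """
    export_dir = os.path.join(model_dir, "export")
    return model.export(export_dir, checkpoint_step=step)
```

From the notebook this would then be a one-liner like `export_for_serving(model, MODEL_DIR)`.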
Ah, it’s because we call `export_estimator_savedmodel` (https://github.com/tensorflow/estimator/blob/08dd1e6ca94248691dfe00aadafc65e8b875e44f/tensorflow_estimator/python/estimator/tpu/tpu_estimator.py#L4219), which creates a new `TPUEstimator` with `export_to_tpu` set to `True` by default. Should be fairly easy to fix. I’ll check it out tomorrow.
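Until that fix lands, a possible workaround is to construct the `TPUEstimator` yourself with `export_to_tpu=False`, so the exported serving graph is not rewritten for TPU. A minimal sketch, assuming you already have the `model_fn`, `RunConfig`, and `params` used by the notebook’s estimator (the batch-size arguments here are assumptions and may be ignored when `use_tpu=False`):

```python
def make_cpu_exportable_estimator(model_fn, run_config, params, batch_size=1):
    """Build a TPUEstimator whose exported SavedModel can run on CPU.

    The key is export_to_tpu=False: by default TPUEstimator rewrites the
    serving graph for TPU, which is what breaks CPU serving here. The
    TensorFlow import is deferred so the helper can be defined without
    TensorFlow installed.
    """
    import tensorflow as tf  # TF 1.x-era estimator API

    return tf.estimator.tpu.TPUEstimator(
        model_fn=model_fn,
        config=run_config,
        params=params,
        use_tpu=False,        # run and export on CPU
        export_to_tpu=False,  # do not rewrite the serving graph for TPU
        train_batch_size=batch_size,
        predict_batch_size=batch_size,
    )
```

The resulting estimator’s `export_saved_model` (with the same `_serving_input_receiver_fn` as above) should then write a CPU-servable SavedModel.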