Keras Tutorial produces a model that does not integrate with the TF Serving REST API
Apologies if I am missing something obvious, but I’ve been struggling with this for about 5 days and it looks like critical documentation is missing. I’m building a text classifier based on the Keras TFX tutorial. As far as I can tell, the `serving_default` SignatureDef created by `serve_tf_examples_fn` literally only parses tf.Examples:
```python
@tf.function
def serve_tf_examples_fn(serialized_tf_examples):
  """Returns the output to be used in the serving signature."""
  feature_spec = tf_transform_output.raw_feature_spec()
  feature_spec.pop(_LABEL_KEY)
  # Causes the graph to require tf.Example input. What about raw string input?
  parsed_features = tf.io.parse_example(serialized_tf_examples, feature_spec)
  transformed_features = model.tft_layer(parsed_features)
  transformed_features.pop(_transformed_name(_LABEL_KEY))
  return model(transformed_features)
```
This makes perfect sense as TFMA input for the Evaluator component, which works beautifully, but makes no sense for running the model in Serving: it requires the REST client to import the necessary TensorFlow libraries in order to format and serialize a tf.Example for every request, as expected by this line:

```python
parsed_features = tf.io.parse_example(serialized_tf_examples, feature_spec)
```
Surely this is not intended to be acceptable overhead for clients?
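To make the overhead concrete, here is a minimal sketch of what a REST client currently has to do for every single request (the feature key `text` is hypothetical; the real key depends on the pipeline’s schema):

```python
import base64
import json

import tensorflow as tf

# Build a tf.Example holding the raw text under the expected feature key.
example = tf.train.Example(features=tf.train.Features(feature={
    'text': tf.train.Feature(
        bytes_list=tf.train.BytesList(value=[b'some raw text to classify'])),
}))

# TF Serving's REST API expects binary inputs base64-encoded under a "b64" key.
payload = json.dumps({
    'instances': [
        {'b64': base64.b64encode(example.SerializeToString()).decode('utf-8')}
    ]
})
```

So every client needs the TensorFlow (or at least protobuf) dependency just to construct the request body.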
I’ve successfully built an entire TFX pipeline to classify text as a tf.string and am ready to deploy to production but for this one roadblock. I’d be happy to submit a pull request with a documentation update if any experts could point me in the right direction toward documenting the appropriate way to parse raw text vs. a serialized tf.Example.
Thank you very much.
Hi @jason-brian-anderson,
I came across your question. Someone asked a similar question regarding the TFX BERT pipeline example, so I created an example export tf.function that avoids the tf.Example dependency. Maybe the Colab example below can be helpful.
Previously, in the Estimator world, we had methods like `build_raw_serving_input_receiver_fn` to avoid the tf.Example dependency. As far as I understand, we can now define the inputs through the `TensorSpec`s passed to `get_concrete_function()` when we export the model. TF is looking for the `serve_tf_examples_fn()` function, but we can remove the tf.Example parsing and add our own input handling, as shown in the Colab notebook below. `model.tft_layer` expects a `dict` of the inputs. Because our input signature of `text` has the shape `[None]`, we need to reshape the tensor to match the Keras input requirements.

Colab version of the BERT pipeline, which exports a model for simple REST requests: https://colab.research.google.com/gist/hanneshapke/f0980b7422d367808dae409536fe9b46/tfx_pipeline_for_bert_preprocessing_wo_tf-example.ipynb
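For reference, here is a minimal sketch of that approach (the helper name, the feature key `text`, and `serving_model_dir` are illustrative and may not match the Colab exactly):

```python
def _get_serve_raw_text_fn(model, tf_transform_output):
  # Attach the Transform graph so preprocessing runs inside the SavedModel.
  model.tft_layer = tf_transform_output.transform_features_layer()

  @tf.function
  def serve_raw_text_fn(text):
    # model.tft_layer expects a dict of raw features; the [None]-shaped
    # string tensor is reshaped to [None, 1] to match the Keras inputs.
    reshaped_text = tf.reshape(text, [-1, 1])
    transformed_features = model.tft_layer({'text': reshaped_text})
    return model(transformed_features)

  return serve_raw_text_fn


# Export with a raw-string input declared via a TensorSpec on the concrete
# function, instead of the serialized-tf.Example input from the tutorial.
signatures = {
    'serving_default':
        _get_serve_raw_text_fn(model, tf_transform_output).get_concrete_function(
            tf.TensorSpec(shape=[None], dtype=tf.string, name='text')),
}
model.save(serving_model_dir, save_format='tf', signatures=signatures)
```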
Please note: I am not working for Google and this answer does not represent a reply from a TensorFlower. It just shows how I solved the same problem.
@jason-brian-anderson Let me know if this solution works for you.
Hi @jason-brian-anderson, you can try changing the signature to take tensors instead of serialized tf.Examples and see if that works with the TF Serving RESTful API. Here is an example.
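A minimal sketch of the idea on the client side (building on the export sketch above; the model name `text_classifier` and TF Serving’s default REST port 8501 are assumptions):

```python
import json

import requests

# With a serving signature that takes a [None]-shaped tf.string tensor,
# the client no longer needs TensorFlow at all: it just posts raw strings.
response = requests.post(
    'http://localhost:8501/v1/models/text_classifier:predict',
    data=json.dumps({'instances': ['some raw text to classify']}),
)
print(response.json())
```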