
Keras Tutorial produces a model that does not integrate with the TF Serving REST API


Apologies if I am missing something obvious, but I've been struggling with this for about 5 days and it looks like critical documentation is missing. I'm building a text classifier based on the Keras TFX tutorial. As far as I can tell, the serving_default SignatureDef created by serve_tf_examples_fn literally only parses serialized tf.Example records:

  @tf.function
  def serve_tf_examples_fn(serialized_tf_examples):
    """Returns the output to be used in the serving signature."""
    feature_spec = tf_transform_output.raw_feature_spec()
    feature_spec.pop(_LABEL_KEY)
    # Causes the graph to require tf.Example input. What about raw string input?
    parsed_features = tf.io.parse_example(serialized_tf_examples, feature_spec)

    transformed_features = model.tft_layer(parsed_features)
    transformed_features.pop(_transformed_name(_LABEL_KEY))

    return model(transformed_features)

This makes perfect sense as TFMA input for the Evaluator component, which works beautifully, but makes no sense for running the model in Serving: it requires the REST client to import the necessary TensorFlow libraries in order to format and serialize a tf.Example for every request, as expected by the line above:

parsed_features = tf.io.parse_example(serialized_tf_examples, feature_spec)

Surely this is not intended to be acceptable overhead for clients?
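To make the overhead concrete, here is a rough sketch of what a REST client currently has to do for every request, assuming a single raw-text feature named "text" and a model served at TF Serving's default REST port (the feature name, model name, and host are illustrative, not from the tutorial):

    import base64
    import json

    import requests
    import tensorflow as tf

    # Build a tf.Example for one piece of raw text, exactly as the
    # serving signature's tf.io.parse_example call expects.
    example = tf.train.Example(features=tf.train.Features(feature={
        "text": tf.train.Feature(
            bytes_list=tf.train.BytesList(value=[b"some raw text to classify"])),
    }))

    # TF Serving's REST API requires binary inputs to be base64-encoded
    # and wrapped in an object with a "b64" key.
    payload = json.dumps({
        "instances": [
            {"b64": base64.b64encode(example.SerializeToString()).decode("utf-8")}
        ],
    })
    response = requests.post(
        "http://localhost:8501/v1/models/my_model:predict", data=payload)

In other words, every client needs TensorFlow installed just to build the request payload.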

I’ve successfully built an entire TFX pipeline to classify text as a tf.string and am ready to deploy to production but for this one roadblock. I’d be happy to submit a pull request with a documentation update if any experts could point me in the right direction toward documenting the appropriate way to parse raw text versus a serialized tf.Example.

Thank you very much.

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 7
  • Comments: 8 (1 by maintainers)

Top GitHub Comments

2 reactions
hanneshapke commented, Jun 4, 2020

Hi @jason-brian-anderson,

I came across your question. Someone asked a similar question regarding the TFX BERT pipeline example, so I created an example export tf.function to avoid the tf.Example dependency. Maybe the Colab example below can be helpful.

Previously, in the Estimator world, we had methods like build_raw_serving_input_receiver_fn to avoid the tf.Example dependency. As far as I understand, we can now define the inputs through TensorSpecs in get_concrete_function() when we export the model. TF is looking for the serve_tf_examples_fn() function, but we can remove the tf.Example parsing and add our own input handling, as shown in the Colab notebook below:

    @tf.function
    def serve_tf_examples_fn(text):
        """Returns the output to be used in the serving signature.
        TF is currently looking for a function serve_tf_examples_fn(),
        therefore the function name can't be changed, even though we aren't 
        parsing tf.Example records.
        """
        reshaped_text = tf.reshape(text, [-1, 1])
        transformed_features = model.tft_layer({"text": reshaped_text})
        outputs = model(transformed_features)
        return {'outputs': outputs}

model.tft_layer expects a dict of inputs. Because our input signature for text has the shape [None], we need to reshape the tensor to match the Keras input requirements.
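For completeness, a minimal sketch of how this function could then be attached as the serving signature at export time, assuming a single string input named "text" and a placeholder export directory (the exact wiring is in the Colab below):

    signatures = {
        "serving_default": serve_tf_examples_fn.get_concrete_function(
            tf.TensorSpec(shape=[None], dtype=tf.string, name="text")),
    }
    # Attach the raw-text signature to the exported SavedModel.
    model.save(serving_model_dir, save_format="tf", signatures=signatures)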

Colab version of the BERT Pipeline which exports a model for simple REST requests: https://colab.research.google.com/gist/hanneshapke/f0980b7422d367808dae409536fe9b46/tfx_pipeline_for_bert_preprocessing_wo_tf-example.ipynb
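With a signature like that, a REST request (sketched here with a placeholder model name and host) no longer needs TensorFlow on the client side:

    import json
    import requests

    # Plain JSON with raw strings; no tf.Example serialization required.
    payload = json.dumps({"instances": ["some raw text to classify"]})
    response = requests.post(
        "http://localhost:8501/v1/models/my_model:predict", data=payload)
    print(response.json())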

Please note: I am not working for Google and this answer doesn’t represent the reply of a TensorFlower. It just shows one way I solved the same problem.

@jason-brian-anderson Let me know if this solution works for you.

1 reaction
1025KB commented, Jun 5, 2020

Hi @jason-brian-anderson, you can try changing the signature to take tensors instead of serialized tf.Example records and see if that works with the TF Serving RESTful API.

Here is an example.
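One hypothetical way to do that (function and input names here are illustrative) is to declare an input_signature on the tf.function so it accepts raw string tensors directly:

    # Hypothetical serving function: the input_signature pins the signature
    # to a batch of raw strings, so no tf.Example parsing is involved.
    @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.string)])
    def serve_raw_text_fn(text):
        transformed_features = model.tft_layer({"text": tf.reshape(text, [-1, 1])})
        return {"outputs": model(transformed_features)}

    # Pass it as the serving signature when saving the model.
    model.save(serving_model_dir, save_format="tf",
               signatures={"serving_default": serve_raw_text_fn})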


