
Could you please give a code example of how to export a model for TensorFlow Serving? No luck with estimator.export_saved_model or tf.estimator.BestExporter. I must be doing something wrong with feature_spec.

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Reactions: 1
  • Comments: 21 (7 by maintainers)

Top GitHub Comments

1 reaction
eggie5 commented, Apr 3, 2019

I ran the exporter (above) on my ranker. (My model has only one input, a sparse id used for an embedding lookup, and it has group size 1.)

I understand the input shape (-1) is a batch of serialized protobuf strings. However, I found the output shape interesting. I would score my documents using the predict signature, right? It has a (-1, -1) shape. What does this mean? I would expect [-1, 1] for a batch of scalars, where each scalar is a document score… @ramakumar1729 ?

$ saved_model_cli show --dir . --tag_set serve --signature_def predict
The given SavedModel SignatureDef contains the following input(s):
  inputs['examples'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: input_example_tensor:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['output'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, -1)
      name: groupwise_dnn_v2/accumulate_scores/truediv:0
Method name is: tensorflow/serving/predict
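One way to read the (-1, -1) output shape (an assumption based on TF-Ranking's groupwise scoring, where scores come back per query list): dimension 0 is the batch size and dimension 1 is the list size, so a single one-document list yields a 1x1 result, matching the [[0.8916328]] output shown below. A plain-Python sketch with illustrative values:

```python
# Hypothetical score matrices as the predict signature would return them.
# Shape (-1, -1) is assumed to mean (batch_size, list_size).
scores = [[0.8916328]]            # (1, 1): one query list, one document
batch = [[0.9, 0.1, 0.5],         # (2, 3): two query lists,
         [0.3, 0.8, 0.2]]         # three candidate documents each
print(len(scores), len(scores[0]))  # 1 1
print(len(batch), len(batch[0]))    # 2 3
```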

The example below will score 1 document. How can I score 2, 3, or a batch of docs?

saved_model_cli run \
    --dir . \
    --tag_set serve \
    --signature_def predict \
    --input_examples 'examples=[{"1":[534]}]'
Result for output key output:
[[0.8916328]]
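One plausible way to score a batch with the same tool: the --input_examples flag takes a list of dicts, one per serialized tf.Example, so adding more dicts should build a larger batch. A sketch (the feature key "1" and id values are illustrative, carried over from the single-document example above):

```shell
# Hedged sketch: one dict per document builds a batch of tf.Examples.
# Substitute your own model directory and feature values.
saved_model_cli run \
    --dir . \
    --tag_set serve \
    --signature_def predict \
    --input_examples 'examples=[{"1":[534]},{"1":[535]},{"1":[536]}]'
# With the (-1, -1) output shape, a batch of three group-size-1 inputs
# should come back as a 3x1 matrix of scores.
```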
1 reaction
ramakumar1729 commented, Mar 26, 2019

Thanks for the quick reaction. I think I’ve figured it out (assuming this is executed in examples/tf_ranking_libsvm.py):

def serving_input_receiver_fn():
    feature_names = ["{}".format(i + 1) for i in range(FLAGS.num_features)]
    feature_columns = [tf.feature_column.numeric_column(
        name, shape=(1,), default_value=0.0) for name in feature_names]
    feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)
    return tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)()

I’m not sure if the last brackets are needed. The rest of the code looks fine.

estimator.export_savedmodel(FLAGS.output_dir + '/export', serving_input_receiver_fn)

Does it make sense?
