
How to use saved_model.pb to make predictions in Python?


Running the code through the method provided in Running Script produces a saved_model.pb file and a variables folder. I was able to make a prediction using:

saved_model_cli run \
    --dir ./ \
    --tag_set serve \
    --signature_def predict \
    --input_examples 'examples=[{"2":[0.5348837209302325],"3":[0.75],"4":[33.68298368298368],"5":[14.4054],"6":[1707.0],"8":[1.0],"11":[1.0],"14":[1.0],"15":[1.0]},{"2":[0.3409090909090909],"4":[50.0],"5":[0.8047000000000001],"6":[833.0],"12":[1.0]}]'

The above CLI command contains 2 test cases and returns 2 scores. I want to know how to load the saved model in Python, or host it with Docker on TF Serving, and then make predictions on multiple inputs. My data is in LibSVM format with multiple features per entry, as shown below.

1 qid:0 2:0.6842105263157895 4:29.1951604418727 5:1.4356 6:1346 8:1 11:1 14:1 15:1
0 qid:0 2:0.8125 3:0.8571428571428571 4:32.55459624174708 5:35.69730000000001 6:1328 8:1 11:1 14:1 15:1
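A minimal sketch of doing the same thing from Python, assuming TensorFlow 2.x, the same `predict` signature and `examples` input key used in the saved_model_cli call above, and an export directory placeholder; the feature keys are taken from the CLI example and would need to match your own model:

```python
import tensorflow as tf

def make_example(features):
    """Serialize {feature_name: [float values]} into a tf.train.Example."""
    feature = {
        name: tf.train.Feature(float_list=tf.train.FloatList(value=values))
        for name, values in features.items()
    }
    return tf.train.Example(
        features=tf.train.Features(feature=feature)).SerializeToString()

def predict(export_dir, feature_dicts):
    """Load the SavedModel and score a batch of serialized Examples.

    Mirrors the saved_model_cli invocation: the export's `predict`
    signature takes a batch of serialized tf.train.Examples under the
    input key `examples`.
    """
    loaded = tf.saved_model.load(export_dir)
    predict_fn = loaded.signatures["predict"]
    serialized = [make_example(d) for d in feature_dicts]
    return predict_fn(examples=tf.constant(serialized))

# Usage (feature keys copied from the CLI command above):
# scores = predict("./", [
#     {"2": [0.5348837209302325], "3": [0.75], "4": [33.68298368298368]},
#     {"2": [0.3409090909090909], "4": [50.0], "5": [0.8047000000000001]},
# ])
```

On TF 1.x (current when this issue was filed), the equivalent one-liner was `tf.contrib.predictor.from_saved_model(export_dir)`, which returns a callable taking the same `examples` dict.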

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 13 (6 by maintainers)

Top GitHub Comments

4 reactions
hdatteln commented, Aug 18, 2019

I am also a bit stuck on this. However, I am using TFRecords as training/test input data; it would be great to have a bit more information in this example notebook on how to save and load the trained model and make predictions: https://github.com/tensorflow/ranking/blob/master/tensorflow_ranking/examples/handling_sparse_features.ipynb
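For the TFRecord case, here is a hedged sketch: assuming TensorFlow 2.x, that each record in the file is already a serialized tf.train.Example, and the same `predict` signature with an `examples` input key as the saved_model_cli call in the issue, the records can be fed to the signature directly without re-parsing:

```python
import tensorflow as tf

def predict_from_tfrecord(export_dir, tfrecord_path, batch_size=32):
    """Score serialized Examples read straight from a TFRecord file.

    TFRecordDataset yields the raw serialized bytes of each record,
    which is exactly the form the `predict` signature expects under
    the `examples` key, so no decoding step is needed in between.
    """
    loaded = tf.saved_model.load(export_dir)
    predict_fn = loaded.signatures["predict"]
    dataset = tf.data.TFRecordDataset(tfrecord_path).batch(batch_size)
    for batch in dataset:  # batch: 1-D string tensor of serialized Examples
        yield predict_fn(examples=batch)
```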

2 reactions
ramakumar1729 commented, Aug 19, 2019

> I am also a bit stuck on this. However, I am using TFRecords as training/test input data; It would be great to have a bit more information in this example notebook on how to save & load the trained model and make predictions: https://github.com/tensorflow/ranking/blob/master/tensorflow_ranking/examples/handling_sparse_features.ipynb

@hdatteln: Created a separate issue for this: #104


Top Results From Across the Web

Using the SavedModel format | TensorFlow Core
For a quick introduction, this section exports a pre-trained Keras model and serves image classification requests with it. The rest of the guide...
Exporting a SavedModel for prediction | AI Platform Prediction
To deploy your trained models to AI Platform Prediction and use them to serve predictions, you must first export them in the TensorFlow...
Simple prediction from frozen .pb saved model - Stack Overflow
I found the solution to pass a dict in wrapped model. This is a slightly modified synthesis of these given solutions with modifications...
How to use a saved model in Tensorflow 2.x | by Rahul Bhadani
The saved model can be used to make predictions using a brand new data set. model.predict(X_test). A more descriptive example is given in...
How to convert trained Keras model to a single TensorFlow ...
How to convert trained Keras model to a single TensorFlow .pb file and make prediction ; import os os ; from keras import...
