
tf.saved_model.save and predict a single value


To save the model, I added this line after the training loop: `tf.saved_model.save(model, os.path.join(FLAGS.output_dir, "1"))`. It produces the `assets/` and `variables/` directories plus `saved_model.pb`.
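As a quick sanity check (a sketch reusing the path above), you can confirm the three artifacts landed on disk before trying to load the export:

```python
import os

export_dir = os.path.join(FLAGS.output_dir, "1")
# Expect something like: ['assets', 'saved_model.pb', 'variables']
print(sorted(os.listdir(export_dir)))
```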

From there, I load the model and predict a single value:

```python
import os

import tensorflow as tf

# tokenization and classifier_data_lib come from the BERT/ALBERT model code
# used for training; the exact import path depends on which repo you trained with.
import tokenization
import classifier_data_lib

# model_dir and spm_model_file are defined elsewhere in the script.
# Load the exported SavedModel.
loaded = tf.saved_model.load(os.path.join(model_dir, "1"))

# Rebuild the tokenizer used during training (SentencePiece model here).
tokenizer = tokenization.FullTokenizer(
    vocab_file=None, spm_model_file=spm_model_file, do_lower_case=True)

text_a = "the movie was not good"
example = classifier_data_lib.InputExample(
    guid=0, text_a=text_a, text_b=None, label=0)

labels = [0, 1]
max_seq_length = 128

# Convert the raw text into input_ids / input_mask / segment_ids.
feature = classifier_data_lib.convert_single_example(
    ex_index=0, example=example, label_list=labels,
    max_seq_length=max_seq_length, tokenizer=tokenizer)

# Wrap each feature in a list to add the batch dimension (batch size 1).
test_input_word_ids = tf.convert_to_tensor(
    [feature.input_ids], dtype=tf.int32, name='input_word_ids')
test_input_mask = tf.convert_to_tensor(
    [feature.input_mask], dtype=tf.int32, name='input_mask')
test_input_type_ids = tf.convert_to_tensor(
    [feature.segment_ids], dtype=tf.int32, name='input_type_ids')

# Call the serving signature; keyword names must match the exported inputs.
logit = loaded.signatures["serving_default"](
    input_mask=test_input_mask,
    input_type_ids=test_input_type_ids,
    input_word_ids=test_input_word_ids)

pred = tf.argmax(logit['output'], axis=-1, output_type=tf.int32)
prob = tf.nn.softmax(logit['output'], axis=-1)

print(f'Prediction: {pred} Probabilities: {prob}')
```
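One caveat: the keyword names (`input_word_ids`, `input_mask`, `input_type_ids`) and the `'output'` key above depend on how the model was exported. If yours differ, you can read them off the signature itself instead of guessing (reusing `loaded` from the snippet above):

```python
infer = loaded.signatures["serving_default"]
# (args, kwargs) of TensorSpecs; the kwargs dict holds the expected input names.
print(infer.structured_input_signature)
# Dict of output TensorSpecs; its keys are what you index the result with.
print(infer.structured_outputs)
```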

This solution works for a single value. Thanks
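A follow-up note for anyone adapting this: the signature takes a leading batch dimension, so the same call scores several sentences at once. A minimal sketch, assuming `feature_1` and `feature_2` were produced by `convert_single_example` as above:

```python
features = [feature_1, feature_2]  # hypothetical pre-converted examples

batch_word_ids = tf.convert_to_tensor([f.input_ids for f in features], dtype=tf.int32)
batch_mask = tf.convert_to_tensor([f.input_mask for f in features], dtype=tf.int32)
batch_type_ids = tf.convert_to_tensor([f.segment_ids for f in features], dtype=tf.int32)

logits = loaded.signatures["serving_default"](
    input_word_ids=batch_word_ids,
    input_mask=batch_mask,
    input_type_ids=batch_type_ids)
preds = tf.argmax(logits['output'], axis=-1, output_type=tf.int32)  # one label per row
```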

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 9

Top GitHub Comments

2 reactions
birdmw commented, Jan 3, 2020

We're in the same boat. We are doing it on Databricks. We had some extra errors with the callbacks, so we commented them out, borrowed your code, and put it in a custom callback, and it works now. So cheers.

```python
class MyCustomCallback(tf.keras.callbacks.Callback):

    def on_train_batch_begin(self, batch, logs=None):
        print('Training: batch {} begins at {}'.format(
            batch, datetime.datetime.now().time()))

    def on_train_batch_end(self, batch, logs=None):
        print('Training: batch {} ends at {}'.format(
            batch, datetime.datetime.now().time()))
        print("saving model as per callback to:",
              os.path.join(FLAGS.output_dir, "1"))
        tf.saved_model.save(model, os.path.join(FLAGS.output_dir, "1"))
        print("model saved")

    def on_test_batch_begin(self, batch, logs=None):
        print('Evaluating: batch {} begins at {}'.format(
            batch, datetime.datetime.now().time()))

    def on_test_batch_end(self, batch, logs=None):
        print('Evaluating: batch {} ends at {}'.format(
            batch, datetime.datetime.now().time()))
```

a la https://www.tensorflow.org/guide/keras/custom_callback
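To actually activate it (a sketch; `model` and `train_dataset` are placeholders from your own training script), pass an instance in the `callbacks` list of `model.fit`:

```python
model.fit(
    train_dataset,           # hypothetical tf.data pipeline from your script
    epochs=3,
    callbacks=[MyCustomCallback()],  # hooks now fire on every train/test batch
)
```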

0 reactions
008karan commented, Mar 3, 2020

@birdmw Can you tell me what changes you made to add your custom callbacks? When I run training, it only prints results at the end of each epoch; I want to see them at every step. I can't figure it out. Can you help?


