
[Question] How to serve a model with TensorFlow Serving

See original GitHub issue

The problem has been solved. Thanks to BrikerMan for the enthusiastic help. I hope this solution can save others some exploration time so they can better enjoy the convenience this project brings!

I tried to save the trained model in the saved_model format; the code is as follows:

import tensorflow as tf
from kashgari.tasks.seq_labeling import BLSTMCRFModel
from keras import backend as K

# K.set_learning_phase(1)
# The key change: switch Keras to inference mode (learning phase 0)
# before the model is loaded and the graph is exported.
K.set_learning_phase(0)

model = BLSTMCRFModel.load_model('./model')
legacy_init_op = tf.group(tf.tables_initializer())

# The underlying Keras model wrapped by Kashgari
xmodel = model.model

with K.get_session() as sess:
    export_path = './saved_model/14'
    builder = tf.saved_model.builder.SavedModelBuilder(export_path)

    # The two BERT inputs: token ids and segment ids
    signature_inputs = {
        'token_input': tf.saved_model.utils.build_tensor_info(xmodel.input[0]),
        'seg_input': tf.saved_model.utils.build_tensor_info(xmodel.input[1]),
    }

    signature_outputs = {
        tf.saved_model.signature_constants.CLASSIFY_OUTPUT_CLASSES: tf.saved_model.utils.build_tensor_info(
            xmodel.output)
    }

    classification_signature_def = tf.saved_model.signature_def_utils.build_signature_def(
        inputs=signature_inputs,
        outputs=signature_outputs,
        method_name=tf.saved_model.signature_constants.CLASSIFY_METHOD_NAME)

    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            'predict_webshell_php': classification_signature_def
        },
        legacy_init_op=legacy_init_op
    )

    builder.save()
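
As a quick sanity check (not shown in the thread), TensorFlow's saved_model_cli tool can print the exported signatures, confirming the input and output tensor names before serving:

saved_model_cli show --dir ./saved_model/14 --all

This should list the predict_webshell_php signature with its token_input and seg_input inputs.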

After saving successfully, I call the saved_model for prediction, but the results are all 0. What could be the cause? Calling code:

import json

import tensorflow as tf
from tensorflow.python.saved_model import signature_constants
from tensorflow.python.saved_model import tag_constants

export_dir = './saved_model/14/'

# Vocabulary mapping characters to token ids, saved alongside the model
# (renamed from `dict` to avoid shadowing the Python builtin)
with open('./model/words.json', 'r', encoding='utf-8') as f:
    vocab = json.load(f)

s = ['[CLS]', '国', '正', '学', '长', '的', '文', '章', '与', '诗', '词', ',', '早', '就', '读', '过', '一', '些', ',', '很', '是', '喜',
     '欢', '。', '[CLS]']
s1 = [vocab[x] for x in s]
# Pad the token ids to the fixed sequence length of 100
if len(s1) < 100:
    s1 += [0] * (100 - len(s1))
print(s1)
s2 = [0] * 100  # segment ids: a single sentence, so all zeros

with tf.Session() as sess:
    meta_graph_def = tf.saved_model.loader.load(sess, [tag_constants.SERVING], export_dir)
    signature = meta_graph_def.signature_def

    # Look up the input and output tensor names from the exported signature
    x1_tensor_name = signature['predict_webshell_php'].inputs['token_input'].name
    x2_tensor_name = signature['predict_webshell_php'].inputs['seg_input'].name
    y_tensor_name = signature['predict_webshell_php'].outputs[
        signature_constants.CLASSIFY_OUTPUT_CLASSES].name

    x1 = sess.graph.get_tensor_by_name(x1_tensor_name)
    x2 = sess.graph.get_tensor_by_name(x2_tensor_name)
    y = sess.graph.get_tensor_by_name(y_tensor_name)
    result = sess.run(y, feed_dict={x1: [s1], x2: [s2]})  # predicted values
    print(result.argmax(-1))
    print(result.shape)
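
For completeness, here is a minimal sketch of actually serving the exported directory with TensorFlow Serving via Docker; the model name kashgari is a hypothetical choice, not something from the thread (TensorFlow Serving picks the highest version folder, 14, automatically):

# Serve ./saved_model (containing version folder 14); "kashgari" is a
# hypothetical model name chosen for this example.
docker run -p 8501:8501 \
  -v "$(pwd)/saved_model:/models/kashgari" \
  -e MODEL_NAME=kashgari \
  tensorflow/serving

# Check that the model loaded:
curl http://localhost:8501/v1/models/kashgari

# Predict through the REST API, naming the non-default signature explicitly
# (the token/segment id lists are elided here):
curl -X POST http://localhost:8501/v1/models/kashgari:predict \
  -d '{"signature_name": "predict_webshell_php", "inputs": {"token_input": [...], "seg_input": [...]}}'

Note that the signature above was exported with CLASSIFY_METHOD_NAME; if the :predict endpoint rejects it, re-exporting the same signature with PREDICT_METHOD_NAME is the usual fix.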

Issue Analytics

  • State: closed
  • Created 5 years ago
  • Comments:18 (6 by maintainers)

Top GitHub Comments

5 reactions
BrikerMan commented, Mar 11, 2019

I think we should add an API to save and load the model for TensorFlow Serving.
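
Such an API did not exist at the time; purely as a sketch, the export steps from the question could be wrapped into a helper along these lines (the function name and defaults are invented for illustration, and the same K.set_learning_phase(0) caveat applies before loading the model):

import tensorflow as tf
from keras import backend as K

def export_for_tf_serving(kashgari_model, export_path,
                          signature_name='serving_default'):
    """Sketch: export a loaded Kashgari model's Keras graph as a SavedModel."""
    keras_model = kashgari_model.model
    builder = tf.saved_model.builder.SavedModelBuilder(export_path)

    # Build tensor infos keyed by the graph's own input names
    inputs = {t.name.split(':')[0]: tf.saved_model.utils.build_tensor_info(t)
              for t in keras_model.inputs}
    outputs = {'output': tf.saved_model.utils.build_tensor_info(keras_model.output)}

    signature = tf.saved_model.signature_def_utils.build_signature_def(
        inputs=inputs,
        outputs=outputs,
        method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME)

    builder.add_meta_graph_and_variables(
        K.get_session(),
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={signature_name: signature})
    builder.save()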

2 reactions
alexwwang commented, Mar 11, 2019

Actually the model is in the standard Keras/TF h5 data format, just renamed. In detail: since custom layers and other objects may be used, the save/load process automates saving and loading those custom objects to simplify the user's workload.

On Mon, 11 Mar 2019 at 18:24, Ilya Kuznetsov notifications@github.com wrote:

@phoenixkillerli https://github.com/phoenixkillerli thank you, with this code saving is working. I’ll try to figure out how to load and use that saved model now.

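To illustrate the point above that the saved file is an ordinary Keras h5 model under a different name, here is a minimal loading sketch; the file name model.h5 and the keras_contrib CRF layer are assumptions for illustration (in practice BLSTMCRFModel.load_model('./model') resolves the custom objects for you):

from keras.models import load_model
from keras_contrib.layers import CRF  # assumed source of the custom CRF layer

# Hypothetical file name: the thread only says the saved weights are a
# renamed h5 file inside the model directory. compile=False skips
# restoring the CRF loss function.
keras_model = load_model('./model/model.h5',
                         custom_objects={'CRF': CRF},
                         compile=False)
keras_model.summary()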

Read more comments on GitHub >

Top Results From Across the Web

  • Train and serve a TensorFlow model with TensorFlow Serving: Create your model. Import the Fashion MNIST dataset; Train and evaluate your model...
  • How to Serve Machine Learning Models With TensorFlow ...: You can install Tensorflow Serving without Docker, but using Docker is recommended and is certainly the easiest. In your terminal run the following...
  • Deploy a Servable Question Answering Model Using ...: Now this model object is servable. You will see standard file structures that TensorFlow serving requires: a pb file and a folder named...
  • How to deploy Machine Learning models with TensorFlow ...: Basically, there are three steps: export your model for serving, create a Docker container with your model and deploy it with Kubernetes...
  • Python - Model Deployment Using TensorFlow Serving: The other way is to deploy a model using TensorFlow serving. Since it also provides API (in form of REST and gRPC), so...
