
Serving problem: "Op type not registered 'PyFunc'"

See original GitHub issue

Hello!

I’ve successfully trained a model and want to use the TensorFlow Serving components to export it and serve it with the standard tensorflow_model_server. Using the following code, I can export the model for serving:

import tensorflow as tf
import tensorflow.contrib.slim as slim
from tensorflow.contrib.session_bundle import exporter

net = resnetv1(batch_size=1, num_layers=101)
net.create_architecture("TEST", 2, tag='default', anchor_scales=[8, 16, 32])

variables_to_restore = slim.get_variables_to_restore()
init_fn = slim.assign_from_checkpoint_fn(checkpoint_path, variables_to_restore)

saver = tf.train.Saver(sharded=True)
model_exporter = exporter.Exporter(saver)

with tf.Session() as sess:
    init_fn(sess)
    print('Exporting trained model to', export_path)
    model_exporter.init(
        sess.graph.as_graph_def(),
        named_graph_signatures={
            'inputs': exporter.generic_signature(
                {'image': net._image, 'size': net._im_info}),
            'outputs': exporter.generic_signature(
                {'scores': net._predictions['cls_prob'],
                 'bbox_pred': net._predictions['bbox_pred'],
                 'rois': net._predictions['rois']})})

    model_exporter.export(export_path, tf.constant(export_version), sess)
    print('model', export_version, 'exported')

But when I try to serve the model, the server shows the error: “Op type not registered ‘PyFunc’”.

This is because the graph contains these tf.py_func calls:

anchors, anchor_length = tf.py_func(generate_anchors_pre,
                                    [height, width,
                                     self._feat_stride, self._anchor_scales,
                                     self._anchor_ratios],
                                    [tf.float32, tf.int32],
                                    name="generate_anchors")
rois, rpn_scores = tf.py_func(proposal_layer,
                              [rpn_cls_prob, rpn_bbox_pred, self._im_info,
                               self._mode, self._feat_stride, self._anchors,
                               self._num_anchors],
                              [tf.float32, tf.float32])

How can I rewrite these functions using plain TensorFlow ops so that the model can be served?

Grateful for any thoughts!

Issue Analytics

  • State: closed
  • Created 6 years ago
  • Comments:6 (2 by maintainers)

Top GitHub Comments

7 reactions
markusnagel commented, Jun 6, 2017

Hi @vaklyuenkov,

I have this model running in TF Serving. As you already pointed out, the issue is the use of tf.py_func in some of the layers. TF Serving is written in C++ and therefore does not support custom Python layers. As far as I’m aware, there is only one way to solve it: replace all Python layers with equivalent TensorFlow operations or layers.

In my own fork I made basic implementations of all the layers required for inference. It also contains an example script (in tools/export_tf_serving.py) which should work out of the box if you have the demo running. I have all the layer implementations (without the example) in a separate branch and plan to open a PR soon so they can be merged into the main repository. I hope that helps.
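As a rough illustration of what “replacing a Python layer with equivalent TensorFlow operations” can look like: the shift-grid part of generate_anchors_pre (normally computed with NumPy inside tf.py_func) maps almost one-to-one onto graph ops. The sketch below uses NumPy so it runs standalone, with the corresponding TF op named in each comment; the exact internals of generate_anchors_pre are an assumption, so treat this as a shape-of-the-solution sketch, not the repository's implementation.

```python
import numpy as np

def shift_grid(height, width, feat_stride):
    """Build the (height*width, 4) grid of anchor shifts.

    Every NumPy call here has a direct TensorFlow equivalent, so the same
    logic can live inside the graph and be served without PyFunc:
      np.arange    -> tf.range
      np.meshgrid  -> tf.meshgrid
      np.vstack / .ravel() / .transpose() -> tf.stack / tf.reshape / tf.transpose
    """
    shift_x = np.arange(0, width) * feat_stride    # tf.range(width) * feat_stride
    shift_y = np.arange(0, height) * feat_stride   # tf.range(height) * feat_stride
    shift_x, shift_y = np.meshgrid(shift_x, shift_y)  # tf.meshgrid
    # One (x, y, x, y) shift row per feature-map location.
    shifts = np.vstack((shift_x.ravel(), shift_y.ravel(),
                        shift_x.ravel(), shift_y.ravel())).transpose()
    return shifts

# A 2x3 feature map with stride 16 gives 6 shift rows of 4 coordinates each.
print(shift_grid(height=2, width=3, feat_stride=16).shape)  # (6, 4)
```

Broadcasting these shifts against the base anchors (and similarly rewriting proposal_layer with ops like tf.nn.top_k and tf.image.non_max_suppression) keeps the whole forward pass inside the graph.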

1 reaction
rishabhmalhotra commented, Mar 7, 2018

Is there any parallel implementation or workaround for this? I’m interested in whether I can deploy this with TF Serving (the official TensorFlow docs say that a graph containing py_func cannot be serialized).


Top Results From Across the Web

  • Op type not registered 'PyFunc' · Issue #54 · tensorflow/serving
  • NotFoundError: No registered 'PyFunc' OpKernel for 'CPU ...
  • tft.apply_pyfunc | TFX - TensorFlow
  • Building Python function-based components - Kubeflow
  • Dataset map function error: TypeError: Expected list for 'input ...
