
How to deploy models where the shape of output tensor is not known

See original GitHub issue

I have a TensorFlow frozen graph of an object detection model. I am unclear about creating a config.pbtxt file for this model, since I cannot determine the output shapes beforehand and I cannot start the inference server without the “dims” specified. I wanted to know how I can create a config file for this.

name: "NF1"
    platform: "tensorflow_graphdef"
    max_batch_size: 16
    
    input [
      {
        name: "image_tensor"
        data_type: TYPE_UINT8
        format: FORMAT_NHWC
        dims: [ 1024, 800, 3 ]
      }
    ]
    
    output [
      {
        name: "num_detections"
        data_type: TYPE_FP32
        dims: [ 300 ]
      },

      {
        name: "detection_boxes"
        data_type: TYPE_FP32
        dims: [ 300, 4  ]
      },

      {
        name: "detection_scores"
        data_type: TYPE_FP32
        dims: [ 300 ]        
      },

      {
        name: "detection_classes"
        data_type: TYPE_FP32
        dims: [ 300 ]        
      }
    ]
    instance_group [    
      {
        gpus: [ 0 ]
      },
      {
        gpus: [ 1 ]
      },
      {
        gpus: [ 2 ]
      },
      {
        gpus: [ 3 ]
      }                  
    ]    
    dynamic_batching {
      preferred_batch_size: [ 16 ]
      max_queue_delay_microseconds: 100
    }

This is my config, which does not work. I tried fixing the output shapes to the maximum number of proposals, i.e. 300, which I knew wouldn’t work.
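
Before writing the config it can help to confirm which output dimensions the frozen graph actually leaves undefined. Below is a minimal Python sketch (not from the original issue) that loads the frozen graph and prints the static shape of each output tensor; the file name frozen_inference_graph.pb is an assumption, and the tensor names are taken from the config above. Dimensions reported as None are the variable ones.

    import tensorflow as tf

    # Hypothetical path to the exported frozen graph.
    GRAPH_PATH = "frozen_inference_graph.pb"

    # Parse the serialized GraphDef from disk.
    graph_def = tf.compat.v1.GraphDef()
    with tf.io.gfile.GFile(GRAPH_PATH, "rb") as f:
        graph_def.ParseFromString(f.read())

    # Import it into a fresh graph and inspect the output tensors.
    with tf.Graph().as_default() as graph:
        tf.compat.v1.import_graph_def(graph_def, name="")
        for name in ["num_detections", "detection_boxes",
                     "detection_scores", "detection_classes"]:
            tensor = graph.get_tensor_by_name(name + ":0")
            # Dimensions shown as None are not fixed at graph-build time.
            print(name, tensor.shape)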

Issue Analytics

  • State: closed
  • Created 5 years ago
  • Reactions: 1
  • Comments: 6 (1 by maintainers)

Top GitHub Comments

4 reactions
deadeyegoodwin commented, Nov 30, 2018

TRTIS only supports a variable-sized dimension for batching, but this is a common request, so we are planning on fixing it. Issue #8 is tracking this request, so add upvotes there to indicate that you are interested in it.
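
For reference, in later Triton releases where this feature landed, a variable-size dimension is written as -1 in the model configuration. Under that assumption (this is a sketch, not the configuration that worked at the time of the issue), the output section could look like:

    output [
      {
        name: "num_detections"
        data_type: TYPE_FP32
        dims: [ -1 ]   # -1 marks a size only known at inference time (assumed later-release behaviour)
      },
      {
        name: "detection_boxes"
        data_type: TYPE_FP32
        dims: [ -1, 4 ]
      },
      {
        name: "detection_scores"
        data_type: TYPE_FP32
        dims: [ -1 ]
      },
      {
        name: "detection_classes"
        data_type: TYPE_FP32
        dims: [ -1 ]
      }
    ]

Here -1 stands in for the per-image detection count, while the four box coordinates remain fixed.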

0 reactions
tilaba commented, Dec 25, 2018

Hello, have you solved it?

Read more comments on GitHub >

Top Results From Across the Web

How to deploy models where the shape of output tensor is not ...
I have a tensorflow frozen graph of a objection detection model, i am unclear about creating a config.pbtxt file for this model since...
Read more >
How to get input tensor shape of an unknown PyTorch model
I am writing a python script, which converts any deep learning models from popular frameworks (TensorFlow, Keras, PyTorch) ...
Read more >
How to deploy Machine Learning models with TensorFlow ...
Export the model into Protobuf. TensorFlow Serving provides SavedModelBuild class to save the model as Protobuf. It is pretty good described here. My...
Read more >
Understanding inputs and outputs for explanation | AI Platform ...
Finding input and output tensors ... After training a TensorFlow model, export it as a SavedModel. The TensorFlow SavedModel contains your trained TensorFlow ......
Read more >
Introduction to Tensors | TensorFlow Core
You can reshape a tensor into a new shape. The tf.reshape operation is fast and cheap as the underlying data does not need...
Read more >
