
TF 2.1 SavedModel Format : unexpected input format FORMAT_NONE, expecting FORMAT_NHWC or FORMAT_NCHW

See original GitHub issue

Description
I am getting an "unexpected input format FORMAT_NONE, expecting FORMAT_NHWC or FORMAT_NCHW" error while running the Image Classification Example against my custom model.

I am using TensorFlow 2.1 for training and save the model in TensorFlow's SavedModel format. I am able to start tritonserver successfully. The model configuration does not have a format tag:

input {
        name: "input"
        data_type: TYPE_FP32
        dims: 512
        dims: 512
        dims: 3
}

Because the format tag is missing, I think I am getting that particular error when I try to run inference from the client side.

Also, as mentioned here, I believe that when using the SavedModel format a config.pbtxt file is not required, so I am not using a custom config file.

TRTIS Information
What version of TRTIS are you using? 20.02
Are you using the TRTIS container or did you build it yourself? Using the TRTIS container.

To Reproduce
Follow the quickstart steps for a custom image classification model: https://docs.nvidia.com/deeplearning/sdk/triton-inference-server-master-branch-guide/docs/quickstart.html#quickstart
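The repository layout the quickstart expects for a TF SavedModel can be sketched as follows. The paths mirror the config in this issue; saved_model.pb is only a placeholder for a real export, and the commented-out container image tag and launch command are assumptions for the 20.02 release, not taken from the issue:

```shell
# Build the model repository layout Triton expects for a TF SavedModel.
# The names below mirror this issue's config ("classifier", version 1,
# model.savedmodel); the files created are empty placeholders.
mkdir -p model_repository/classifier/1/model.savedmodel/variables
touch model_repository/classifier/1/model.savedmodel/saved_model.pb

# An explicit config would go here (otherwise one is auto-generated
# for SavedModel):
#   model_repository/classifier/config.pbtxt

# Launch the server container against the repository (image tag and
# binary name are assumptions for the 20.02 release; adjust to the
# container you actually pulled):
# docker run --rm -p8000:8000 -p8001:8001 \
#   -v $(pwd)/model_repository:/models \
#   nvcr.io/nvidia/tensorrtserver:20.02-py3 \
#   trtserver --model-repository=/models

# Show the resulting layout.
find model_repository -type d
```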

Model description (framework, inputs, outputs) and the server's reported configuration:

model_status {
  key: "classifier"
  value {
    config {
      name: "classifier"
      platform: "tensorflow_savedmodel"
      version_policy {
        latest {
          num_versions: 1
        }
      }
      max_batch_size: 1
      input {
        name: "input"
        data_type: TYPE_FP32
        dims: 512
        dims: 512
        dims: 3
      }
      output {
        name: "label_prob"
        data_type: TYPE_FP32
        dims: 1
      }
      instance_group {
        name: "classifier"
        count: 1
        kind: KIND_CPU
      }
      default_model_filename: "model.savedmodel"
      optimization {
        input_pinned_memory {
          enable: true
        }
        output_pinned_memory {
          enable: true
        }
      }
    }
    version_status {
      key: 1
      value {
        ready_state: MODEL_READY
        ready_state_reason {
        }
      }
    }
  }
}
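The image classification client rejects inputs whose format is FORMAT_NONE, which is what the auto-generated SavedModel config defaults to when no format is specified. One common fix is to supply a config.pbtxt that sets the format explicitly. A minimal sketch based on the status dump above (FORMAT_NHWC is an assumption inferred from the channels-last 512x512x3 input shape, not stated in the issue):

```
name: "classifier"
platform: "tensorflow_savedmodel"
max_batch_size: 1
input [
  {
    name: "input"
    data_type: TYPE_FP32
    format: FORMAT_NHWC
    dims: [ 512, 512, 3 ]
  }
]
output [
  {
    name: "label_prob"
    data_type: TYPE_FP32
    dims: [ 1 ]
  }
]
```

Placing this file at model_repository/classifier/config.pbtxt overrides the auto-generated configuration for that model.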


Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 5 (1 by maintainers)

Top GitHub Comments

1 reaction
Jawny commented, Aug 29, 2022

@deadeyegoodwin - how do I find the automatically generated config from the triton server?

Run curl localhost:8000/v2/models/<model_name>/config to get the config.
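The JSON returned by that endpoint mirrors the model-config protobuf, so the input format can be checked programmatically. A sketch, assuming a response of the shape below (the payload is a hand-written illustration of the auto-generated config for this issue's model, not captured server output):

```python
import json

# Illustrative response body from
#   curl localhost:8000/v2/models/classifier/config
# (hand-written here; fetch the real one from a running server).
payload = """
{
  "name": "classifier",
  "platform": "tensorflow_savedmodel",
  "max_batch_size": 1,
  "input": [
    {
      "name": "input",
      "data_type": "TYPE_FP32",
      "format": "FORMAT_NONE",
      "dims": [512, 512, 3]
    }
  ]
}
"""

config = json.loads(payload)
fmt = config["input"][0]["format"]
# FORMAT_NONE here is what trips up the image classification client,
# which needs FORMAT_NHWC or FORMAT_NCHW to know the tensor layout.
print(fmt)
```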

0 reactions
RajezMariner commented, Jul 8, 2022

@deadeyegoodwin - how do I find the automatically generated config from the triton server?

Read more comments on GitHub >

