Issue with simple_string model in the examples
Hi, when trying to run the TRTIS nvidia-docker image, I get the following error:
nvidia-docker run --rm --shm-size=1g --ulimit memlock=-1 --ulimit stack=67108864 -p8000:8000 -p8001:8001 -p8002:8002 -v/mnt/workspace/jo/trtis/tensorrt-inference-server/docs/examples/model_repository:/models nvcr.io/nvidia/tensorrtserver:19.01-py3 trtserver --model-store=/models
[libprotobuf ERROR external/protobuf_archive/src/google/protobuf/text_format.cc:307] Error parsing text-format nvidia.inferenceserver.ModelConfig: 9:5: Unknown enumeration value of "TYPE_STRING" for field "data_type".
E0203 11:52:45.284641 1 server.cc:574] Can't parse /models/simple_string/config.pbtxt as text proto
E0203 11:52:47.226951 1 metrics.cc:238] failed to get energy consumption for GPU 0, NVML_ERROR 3
This is after I downloaded the models with the script.
Thanks
Issue Analytics
- Created 5 years ago
- Comments: 8 (4 by maintainers)
Top GitHub Comments
STRING type support and the simple_string example were added in version 19.02. To try it out, you will need to wait for the release of the 19.02 container, or build TRTIS from source with
docker build --pull -t tensorrtserver .
If you just want to keep running the 19.01 container, you can check out the r19.01 branch and retrieve the models from there, so the model repository only contains configs that the 19.01 parser understands.
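For context, the parse error points at a data_type: TYPE_STRING entry in the simple_string model's config.pbtxt, an enum value the 19.01 server does not recognize. A rough sketch of what such a config looks like (the exact names, dims, and batch size here are illustrative, not copied from the repository):

```
name: "simple_string"
platform: "tensorflow_graphdef"
max_batch_size: 8
input [
  {
    name: "INPUT0"
    data_type: TYPE_STRING   # only recognized by 19.02 and later
    dims: [ 16 ]
  }
]
output [
  {
    name: "OUTPUT0"
    data_type: TYPE_STRING
    dims: [ 16 ]
  }
]
```

On 19.01, any model whose config uses TYPE_STRING will fail to load with the text-proto parse error shown above, which is why matching the model repository to the container version (r19.01 branch with the 19.01 container) resolves it.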
Thanks for your thorough answer! It helped a lot.