
Python backend stuck at TRITONBACKEND_ModelInstanceInitialize

See original GitHub issue

Description: I want to start the Python backend following the example, but the container gets stuck at:

=============================
== Triton Inference Server ==
=============================

NVIDIA Release 22.04 (build 36821869)
Triton Server Version 2.21.0

Copyright (c) 2018-2022, NVIDIA CORPORATION & AFFILIATES.  All rights reserved.

Various files include modifications (c) NVIDIA CORPORATION & AFFILIATES.  All rights reserved.

This container image and its contents are governed by the NVIDIA Deep Learning Container License.
By pulling and using the container, you accept the terms and conditions of this license:
https://developer.nvidia.com/ngc/nvidia-deep-learning-container-license

WARNING: The NVIDIA Driver was not detected.  GPU functionality will not be available.
   Use the NVIDIA Container Toolkit to start this container with GPU support; see
   https://docs.nvidia.com/datacenter/cloud-native/ .

WARNING: [Torch-TensorRT] - Unable to read CUDA capable devices. Return status: 35
I0504 12:25:52.440894 1 libtorch.cc:1381] TRITONBACKEND_Initialize: pytorch
I0504 12:25:52.441090 1 libtorch.cc:1391] Triton TRITONBACKEND API version: 1.9
I0504 12:25:52.441100 1 libtorch.cc:1397] 'pytorch' TRITONBACKEND API version: 1.9
W0504 12:25:52.441171 1 pinned_memory_manager.cc:236] Unable to allocate pinned system memory, pinned memory pool will not be available: CUDA driver version is insufficient for CUDA runtime version
I0504 12:25:52.441209 1 cuda_memory_manager.cc:115] CUDA memory pool disabled
I0504 12:25:52.442350 1 model_repository_manager.cc:1077] loading: resnet:1
I0504 12:25:52.547089 1 python.cc:1769] Using Python execution env /models/resnet/../my-pytorch.tar.gz
I0504 12:25:52.547228 1 python.cc:2054] TRITONBACKEND_ModelInstanceInitialize: resnet_0 (CPU device 0)

My machine does not have a GPU. The config:

name: "resnet"
backend: "python"

input [
  {
    name: "INPUT0"
    data_type: TYPE_FP32
    dims: [ -1, 3, 224, 224 ]
  }
]

output [
  {
    name: "OUTPUT0"
    data_type: TYPE_FP32
    dims: [ -1, 1000 ]
  }
]

instance_group [{
  count: 1
  kind: KIND_CPU
}]

parameters: {
  key: "EXECUTION_ENV_PATH",
  value: {string_value: "$$TRITON_MODEL_DIRECTORY/../my-pytorch.tar.gz"}
}
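
For context, the Python backend loads a model.py next to this config that implements the TritonPythonModel interface. Below is a minimal sketch matching the INPUT0/OUTPUT0 shapes above; the file path and the zero-filled output are illustrative placeholders, not the reporter's actual ResNet code.

# models/resnet/1/model.py -- minimal Python backend skeleton (illustrative only)
import numpy as np
import triton_python_backend_utils as pb_utils


class TritonPythonModel:
    def initialize(self, args):
        # args["model_config"] holds the serialized config shown above;
        # a real model would load its PyTorch weights here.
        pass

    def execute(self, requests):
        responses = []
        for request in requests:
            # INPUT0 arrives with shape [-1, 3, 224, 224] per the config.
            input0 = pb_utils.get_input_tensor_by_name(request, "INPUT0")
            batch = input0.as_numpy()
            # Placeholder scores with the [-1, 1000] OUTPUT0 shape.
            scores = np.zeros((batch.shape[0], 1000), dtype=np.float32)
            output0 = pb_utils.Tensor("OUTPUT0", scores)
            responses.append(pb_utils.InferenceResponse(output_tensors=[output0]))
        return responses

    def finalize(self):
        pass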

Triton Information: nvcr.io/nvidia/tritonserver:22.04-pyt-python-py3

Are you using the Triton container or did you build it yourself? Docker container.
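
On a CPU-only host, a container like this is typically started along these lines; the host model-repository path here is an assumption, not taken from the issue:

docker run --rm -p 8000:8000 -p 8001:8001 -p 8002:8002 \
  -v /path/to/model_repository:/models \
  nvcr.io/nvidia/tritonserver:22.04-pyt-python-py3 \
  tritonserver --model-repository=/models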

Issue Analytics

  • State: open
  • Created: a year ago
  • Comments: 11 (5 by maintainers)

Top GitHub Comments

2 reactions
rmccorm4 commented, May 6, 2022

@Tabrizian filed DLIS-3765

2 reactions
Tabrizian commented, May 6, 2022

I tried a small example locally and it did return an error if there wasn’t enough shared memory. @rmccorm4 Could you please file a ticket for this issue so that I can take a closer look? Thanks.
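
If the hang is indeed shared-memory related, one thing worth trying (an inference from this comment, not a fix confirmed in the thread) is giving the container a larger /dev/shm, since the Python backend communicates with the server over shared memory:

# Docker's default /dev/shm is 64 MB; raise it when starting the container.
docker run --shm-size=1g --rm -p 8000:8000 -p 8001:8001 -p 8002:8002 \
  -v /path/to/model_repository:/models \
  nvcr.io/nvidia/tritonserver:22.04-pyt-python-py3 \
  tritonserver --model-repository=/models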
