
torchserve denies access

See original GitHub issue

Please have a look at the FAQs and the Troubleshooting guide; your query may already be addressed.

Your issue may already be reported! Please search the issue tracker before creating one.

Context

I tried to follow the instructions to get TorchServe working, but was denied access to the TorchServe service.

  • torchserve version: 0.4.2
  • torch-model-archiver version: 1.0
  • torch version: 1.9.0+cpu
  • torchvision version [if any]:
  • torchtext version [if any]:
  • torchaudio version [if any]:
  • java version: openjdk 11.0.11
  • Operating System and version: Ubuntu 20.04.2 LTS

Your Environment

  • Installed using source? [yes/no]: no
  • Are you planning to deploy it using docker container? [yes/no]: yes
  • Is it a CPU or GPU environment?: CPU
  • Using a default/custom handler? [If possible upload/share custom handler/model]: default image_classifier
  • What kind of model is it e.g. vision, text, audio?: densenet161-8d451a50.pth
  • Are you planning to use local models from model-store or public url being used e.g. from S3 bucket etc.? [If public url then provide link.]: no
  • Provide config.properties, logs [ts.log] and parameters used for model registration/update APIs:
  • Link to your project [if any]:

Expected Behavior

[
  {
    "tiger_cat": 0.46933549642562866
  },
  {
    "tabby": 0.4633878469467163
  },
  {
    "Egyptian_cat": 0.06456148624420166
  },
  {
    "lynx": 0.0012828214094042778
  },
  {
    "plastic_bag": 0.00023323034110944718
  }
]

Current Behavior

<!doctype html>
<html>
<head>
<meta http-equiv="refresh" content="0;url=http://7rx80271.ibosscloud.com/ibreports/ibp/bp.html?bu=http://127.0.0.1:8080/predictions/densenet161&bc=The%20requested%20URL%20cannot%20be%20accessed.&ip=10.239.44.142&er=ERR_ACCESS_DENIED"/>
</head>
<body>
</body>
</html>
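
The HTML above is not a TorchServe response at all: the meta refresh points at an ibosscloud.com block page, which suggests an HTTP proxy is intercepting the request to 127.0.0.1 (this matches the resolution in the comments below). As a quick check (a sketch only, assuming curl is picking up http_proxy/https_proxy from the environment), the proxy can be bypassed explicitly:

# Show any proxy configured in the current shell
env | grep -i proxy

# Retry the prediction while telling curl to bypass all proxies
curl --noproxy '*' http://127.0.0.1:8080/predictions/densenet161 -T kitten_small.jpg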

Possible Solution

Steps to Reproduce

  1. python ./ts_scripts/install_dependencies.py
  2. pip install torchserve torch-model-archiver torch-workflow-archiver
  3. git clone https://github.com/pytorch/serve.git
  4. mkdir model_store
  5. wget https://download.pytorch.org/models/densenet161-8d451a50.pth
  6. torch-model-archiver --model-name densenet161 --version 1.0 --model-file ./serve/examples/image_classifier/densenet_161/model.py --serialized-file densenet161-8d451a50.pth --export-path model_store --extra-files ./serve/examples/image_classifier/index_to_name.json --handler image_classifier
  7. torchserve --start --ncs --model-store model_store --models densenet161.mar
  8. curl -O https://raw.githubusercontent.com/pytorch/serve/master/docs/images/kitten_small.jpg
  9. curl http://127.0.0.1:8080/predictions/densenet161 -T kitten_small.jpg …
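
If the prediction call fails, the server itself can be sanity-checked first with the standard ping and management endpoints (a sketch, assuming the default ports shown in the logs below):

# Health check against the inference API
curl http://127.0.0.1:8080/ping

# List the registered models via the management API
curl http://127.0.0.1:8081/models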

Failure Logs [if any]

2021-10-08 22:15:34,240 [INFO ] main org.pytorch.serve.servingsdk.impl.PluginsManager - Initializing plugins manager...
2021-10-08 22:15:34,511 [INFO ] main org.pytorch.serve.ModelServer -
Torchserve version: 0.4.2
TS Home: /root/miniconda3/envs/py38/lib/python3.8/site-packages
Current directory: /workspace/dl/pytorch/srcs/serve
Temp directory: /tmp
Number of GPUs: 0
Number of CPUs: 88
Max heap size: 30688 M
Python executable: /root/miniconda3/envs/py38/bin/python
Config file: N/A
Inference address: http://127.0.0.1:8080
Management address: http://127.0.0.1:8081
Metrics address: http://127.0.0.1:8082
Model Store: /workspace/dl/pytorch/srcs/serve/model_store
Initial Models: densenet161.mar
Log dir: /workspace/dl/pytorch/srcs/serve/logs
Metrics dir: /workspace/dl/pytorch/srcs/serve/logs
Netty threads: 0
Netty client threads: 0
Default workers per model: 88
Blacklist Regex: N/A
Maximum Response Size: 6553500
Maximum Request Size: 6553500
Prefer direct buffer: false
Allowed Urls: [file://.*|http(s)?://.*]
Custom python dependency for model allowed: false
Metrics report format: prometheus
Enable metrics API: true
Workflow Store: /workspace/dl/pytorch/srcs/serve/model_store
Model config: N/A
2021-10-08 22:15:34,520 [INFO ] main org.pytorch.serve.servingsdk.impl.PluginsManager -  Loading snapshot serializer plugin...
2021-10-08 22:15:34,542 [INFO ] main org.pytorch.serve.ModelServer - Loading initial models: densenet161.mar
2021-10-08 22:15:36,381 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Adding new version 1.0 for model densenet161
2021-10-08 22:15:36,382 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Setting default version to 1.0 for model densenet161
2021-10-08 22:15:36,382 [INFO ] main org.pytorch.serve.wlm.ModelManager - Model densenet161 loaded.
2021-10-08 22:15:36,382 [DEBUG] main org.pytorch.serve.wlm.ModelManager - updateModel: densenet161, count: 88
2021-10-08 22:15:36,557 [INFO ] main org.pytorch.serve.ModelServer - Initialize Inference server with: EpollServerSocketChannel.
2021-10-08 22:15:36,884 [INFO ] main org.pytorch.serve.ModelServer - Inference API bind to: http://127.0.0.1:8080
2021-10-08 22:15:36,885 [INFO ] main org.pytorch.serve.ModelServer - Initialize Management server with: EpollServerSocketChannel.
2021-10-08 22:15:36,902 [INFO ] main org.pytorch.serve.ModelServer - Management API bind to: http://127.0.0.1:8081
2021-10-08 22:15:36,903 [INFO ] main org.pytorch.serve.ModelServer - Initialize Metrics server with: EpollServerSocketChannel.
2021-10-08 22:15:36,939 [INFO ] main org.pytorch.serve.ModelServer - Metrics API bind to: http://127.0.0.1:8082
2021-10-08 22:15:37,930 [DEBUG] W-9009-densenet161_1.0 org.pytorch.serve.wlm.WorkerThread - W-9009-densenet161_1.0 State change null -> WORKER_STARTED
...
2021-10-08 22:16:19,018 [INFO ] W-9044-densenet161_1.0 org.pytorch.serve.wlm.WorkerThread - Backend response time: 40704
2021-10-08 22:16:19,019 [DEBUG] W-9044-densenet161_1.0 org.pytorch.serve.wlm.WorkerThread - W-9044-densenet161_1.0 State change WORKER_STARTED -> WORKER_MODEL_LOADED
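
For what it's worth, the log shows the server came up cleanly (the workers reach WORKER_MODEL_LOADED) and the inference API is bound to http://127.0.0.1:8080, so the block page is produced before the request ever reaches TorchServe. If the APIs need to be reachable from outside the loopback interface (for example from another host, or around a proxy that only exempts non-loopback addresses), the bind addresses can be changed via a config.properties passed with --ts-config. A minimal sketch, assuming the standard inference_address/management_address/metrics_address keys:

# config.properties sketch: bind the APIs on all interfaces instead of loopback only
cat > config.properties << 'EOF'
inference_address=http://0.0.0.0:8080
management_address=http://0.0.0.0:8081
metrics_address=http://0.0.0.0:8082
EOF

torchserve --start --ncs --ts-config config.properties --model-store model_store --models densenet161.mar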

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

1 reaction
jingxu10 commented on Oct 11, 2021

Hi @msaroufim, I figured out the reason. It was because of proxy settings… Unsetting the proxy settings solved the problem.
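
For anyone who hits the same block page, a sketch of the kind of clean-up that worked here (the variable names are the usual lowercase/uppercase pairs; adjust to whatever your environment actually sets):

# Option 1: drop the proxy settings for the current shell
unset http_proxy https_proxy HTTP_PROXY HTTPS_PROXY

# Option 2: keep the proxy but exempt loopback traffic
export no_proxy=localhost,127.0.0.1
export NO_PROXY=$no_proxy

# Then retry the prediction request
curl http://127.0.0.1:8080/predictions/densenet161 -T kitten_small.jpg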

0 reactions
jingxu10 commented on Oct 11, 2021

Sure. I launched a clean Ubuntu 20.04 Docker container and ran the following script inside it. Strangely, I can run this script successfully on a newly created AWS EC2 instance outside Docker. Even though a clean Docker environment should be clean enough, curl still failed with ERR_ACCESS_DENIED inside the container.

$ docker run --rm -it --privileged ubuntu:20.04 /bin/bash
$ bash run.sh
$ cat run.sh
#!/bin/bash

SUDO=''
if [[ $UID != 0 ]]; then
    SUDO='sudo '
fi

$SUDO apt update
$SUDO apt -y full-upgrade
DEBIAN_FRONTEND=noninteractive $SUDO apt install -y tzdata
$SUDO ln -fs /usr/share/zoneinfo/Asia/Tokyo /etc/localtime
$SUDO dpkg-reconfigure -f noninteractive tzdata
$SUDO apt install -y tmux vim git wget curl python3 python3-pip
$SUDO update-alternatives --install /usr/bin/python python /usr/bin/python3 100
cd
git clone https://github.com/pytorch/serve.git
cd serve
if [ -z $SUDO ]; then
    sed -i "s/sudo //" ./ts_scripts/install_dependencies.py
fi
python ./ts_scripts/install_dependencies.py
pip install torchserve torch-model-archiver torch-workflow-archiver
echo "export PATH=~/.local/bin:$PATH" >> ~/.bashrc
source ~/.bashrc
cd ..
mkdir model_store
wget https://download.pytorch.org/models/densenet161-8d451a50.pth
torch-model-archiver --model-name densenet161 --version 1.0 --model-file ./serve/examples/image_classifier/densenet_161/model.py --serialized-file densenet161-8d451a50.pth --export-path model_store --extra-files ./serve/examples/image_classifier/index_to_name.json --handler image_classifier
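
Since the root cause turned out to be proxy settings, one thing worth checking with a setup like this is whether the container inherits proxy environment variables (for example from the Docker client's config or the daemon). A sketch of doing that, assuming the same ubuntu:20.04 image:

# Inspect proxy-related variables inside a fresh container
docker run --rm ubuntu:20.04 env | grep -i proxy

# Launch with the proxy variables explicitly cleared and loopback exempted
docker run --rm -it --privileged \
    -e http_proxy= -e https_proxy= \
    -e no_proxy=localhost,127.0.0.1 \
    ubuntu:20.04 /bin/bash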