
Make TorchServe multi-framework

See original GitHub issue

We’ve been assuming so far that TorchServe can only work with PyTorch eager mode or TorchScripted models, but our current handler is general enough to make it possible to support ONNX models.

The idea is a hack one of our partners mentioned, which involves the following (a minimal handler sketch follows the list):

  1. Adding onnx as a dependency in the Dockerfile or requirements.txt
  2. Loading the ONNX model in the handler’s initialize method
  3. Running inference in the handler’s inference method
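
A minimal sketch of what such a handler could look like, assuming the model is run with onnxruntime and the .onnx file is packaged as the model’s serialized file. The class name ONNXHandler and the details below are illustrative, not the handler that eventually landed in the repo:

```python
# Illustrative ONNX handler sketch for TorchServe, assuming onnxruntime is
# installed and the .onnx file is the model's serialized file.
import os

import numpy as np
import onnxruntime as ort
from ts.torch_handler.base_handler import BaseHandler


class ONNXHandler(BaseHandler):
    def initialize(self, context):
        # Locate the .onnx file that torch-model-archiver packaged with the model.
        model_dir = context.system_properties.get("model_dir")
        serialized_file = context.manifest["model"]["serializedFile"]
        model_path = os.path.join(model_dir, serialized_file)

        # Create an onnxruntime session instead of loading an eager/TorchScript model.
        self.session = ort.InferenceSession(model_path)
        self.input_name = self.session.get_inputs()[0].name
        self.initialized = True

    def inference(self, data, *args, **kwargs):
        # `data` is the preprocessed batch; onnxruntime expects numpy arrays,
        # so convert from a torch tensor if preprocess returned one.
        batch = data.numpy() if hasattr(data, "numpy") else np.asarray(data)
        outputs = self.session.run(None, {self.input_name: batch.astype(np.float32)})
        return outputs[0]
```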

It may not necessarily be the best way to serve ONNX models, but it lets people avoid having to use a different serving infrastructure for each type of model.

This is a good level 3-4 bootcamp task - the goal would be to

  1. Get a PyTorch model such as ResNet-18
  2. Export it using the ONNX exporter (a minimal export sketch follows this list)
  3. Run inference with it in an ONNX handler and submit it as an example in this repo
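
As a rough illustration of steps 1 and 2, assuming torchvision is available (the file name, input shape, and dynamic-axes choices below are illustrative):

```python
# Illustrative export of a torchvision ResNet-18 to ONNX (steps 1-2 above).
import torch
import torchvision

# Step 1: get a PyTorch model.
model = torchvision.models.resnet18(weights="IMAGENET1K_V1")
model.eval()

# Step 2: export it with the ONNX exporter. The dummy input fixes the
# export shape; the batch dimension is marked dynamic.
dummy_input = torch.randn(1, 3, 224, 224)
torch.onnx.export(
    model,
    dummy_input,
    "resnet18.onnx",
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
)
```

The resulting resnet18.onnx could then be packaged with torch-model-archiver (pointing --serialized-file at the .onnx file and --handler at an ONNX handler like the sketch above) so TorchServe loads it like any other model; the exact packaging would depend on how the example is structured in the repo.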

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Reactions: 2
  • Comments: 7 (4 by maintainers)

Top GitHub Comments

1 reaction
msaroufim commented, Nov 14, 2022

This was just merged and will be featured in the next release, going out today.

0 reactions
joaquincabezas commented, Nov 14, 2022

Hello @msaroufim

Thanks for your initiative! Would love to see TorchServe serving ONNX “out-of-the-box”. Any feedback on these benchmarks?


Top Results From Across the Web

  • TorchServe — PyTorch/Serve master documentation: “Multi Modal Framework - Build and deploy a classifier that combines text, audio and video input data. Dual Translation Workflow - …”
  • Serving PyTorch models with TorchServe | by Álvaro Bartolomé: “TorchServe is the ML model serving framework developed by PyTorch. This post explains how to train and serve a CNN transfer learning model …”
  • Introducing TorchServe: a PyTorch model serving framework: “TorchServe makes it easy to deploy PyTorch models at scale in … With powerful TorchServe features including multi-model serving, …”
  • A Practical Guide to TorchServe - Medium: “… model serving framework for PyTorch that makes it easy to deploy … This command-line call takes in the single or multiple models …”
  • How to Serve PyTorch Models with TorchServe - YouTube: “Hamid Shojanazeri is a Partner Engineer at PyTorch, here to demonstrate the basics of using TorchServe. As the preferred model serving …”
