
Option (flag) to disable `optimized_execution` in pytorch backend

See original GitHub issue

Add an option to execute torch.jit models with optimized_execution set to false, in order to avoid the warm-up runs and optimization passes.

Python counterpart:

with torch.jit.optimized_execution(False):
    y = model(x)
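A minimal runnable sketch of the pattern above, using a small hypothetical module (the model and input here are illustrative, not from the issue). Scripting a module normally triggers profiling and optimization on the first calls; wrapping inference in `torch.jit.optimized_execution(False)` tells the TorchScript executor to skip that warm-up:

```python
import torch

# Hypothetical tiny model used only to demonstrate the context manager.
class TinyModel(torch.nn.Module):
    def forward(self, x):
        return torch.relu(x) + 1.0

model = torch.jit.script(TinyModel())
x = torch.ones(2, 3)

# Run with TorchScript graph optimization disabled, avoiding the
# warm-up passes the optimizing executor would otherwise perform.
with torch.jit.optimized_execution(False):
    y = model(x)

print(y.shape)  # torch.Size([2, 3])
```

Note that `torch.jit.optimized_execution` is a process-wide toggle on the graph executor, so it affects every scripted model invoked inside the `with` block.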

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 8 (5 by maintainers)

Top GitHub Comments

1 reaction
ioangatop commented, Jun 3, 2021

Hi @CoderHam! Will do, I’ll let you know. Thanks!

1 reaction
CoderHam commented, Jun 2, 2021

@ioangatop, can you build and test the changes from this PR: triton-inference-server/pytorch_backend#24?

Read more comments on GitHub >

Top Results From Across the Web

Performance Tuning Guide - PyTorch
General optimizations · Enable async data loading and augmentation · Disable gradient calculation for validation or inference · Disable bias for convolutions ...

TorchScript — PyTorch 1.13 documentation
Since TorchScript (scripting and tracing) is disabled with this flag, ... There are a couple of fusion backends available to optimize TorchScript execution....

torch.backends — PyTorch 1.13 documentation
torch.backends controls the behavior of various backends that PyTorch supports. ... This flag (a str) allows overriding those heuristics.

5. Advanced configuration - PyTorch
If this option is disabled, TorchServe runs in the background ... Backend workers execute an arbitrary model's custom code, which might expose a...

Distributed communication package - torch.distributed - PyTorch
This is done since CUDA execution is async and it is no longer safe to ... Options for the nccl backend, is_high_priority_stream can...
