
Replace uses of `apex.amp` with PyTorch's amp in references


Some reference scripts added support for AMP via APEX a couple of years ago; see https://github.com/pytorch/vision/blob/4bf608633f9299aeda61b08ae126961acaadec22/references/classification/train.py#L17 for one example.

PyTorch now natively supports AMP in the torch.cuda.amp namespace, so we should use PyTorch’s AMP instead of apex for simplicity.
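
For reference, the replacement pattern looks roughly like this (a minimal, self-contained sketch of the `torch.cuda.amp` API; the model and data are illustrative placeholders, not code from the reference scripts):

```python
# Minimal sketch of the native torch.cuda.amp training pattern
# (requires a CUDA device; model/data are illustrative placeholders)
import torch
import torch.nn as nn

device = "cuda"
model = nn.Linear(10, 2).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()
scaler = torch.cuda.amp.GradScaler()

for _ in range(3):  # stand-in for iterating over a data loader
    inputs = torch.randn(8, 10, device=device)
    targets = torch.randint(0, 2, (8,), device=device)

    optimizer.zero_grad()
    # autocast runs the forward pass in mixed precision
    with torch.cuda.amp.autocast():
        loss = criterion(model(inputs), targets)
    # GradScaler scales the loss so fp16 gradients do not underflow
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
```

The only moving parts are `autocast` for the forward pass and `GradScaler` for the backward pass, both shipped with PyTorch itself, so the extra apex dependency can be dropped entirely.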

The replacement needs to happen in the following scripts:

cc @datumbox

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Reactions: 2
  • Comments: 7 (7 by maintainers)

Top GitHub Comments

2 reactions
datumbox commented, Oct 4, 2021

@prabhat00155 I don’t think it’s a problem to change the name. We don’t have any models that actually use it for training and BC on the references is not our primary focus anyway. My view is that it’s fine to update it.

2 reactions
fmassa commented, Oct 1, 2021

We won’t need to install apex anymore, and everything will be handled by PyTorch now. We only used apex for AMP in the reference scripts.
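
For contrast, the apex-based pattern being removed looks roughly like this (a sketch of `apex.amp`'s documented `initialize`/`scale_loss` API with an illustrative model, not the exact reference-script code):

```python
# The apex.amp pattern being dropped (requires NVIDIA apex and a CUDA
# device; model/data are illustrative placeholders)
import torch
import torch.nn as nn
from apex import amp

device = "cuda"
model = nn.Linear(10, 2).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()

# apex patches the model and optimizer in place; "O1" is its standard
# mixed-precision mode
model, optimizer = amp.initialize(model, optimizer, opt_level="O1")

inputs = torch.randn(8, 10, device=device)
targets = torch.randint(0, 2, (8,), device=device)

optimizer.zero_grad()
loss = criterion(model(inputs), targets)
# apex scales the loss internally rather than via an explicit GradScaler
with amp.scale_loss(loss, optimizer) as scaled_loss:
    scaled_loss.backward()
optimizer.step()
```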


Top Results From Across the Web

  • torch.cuda.amp > apex.amp · Issue #818 · NVIDIA/apex - GitHub
    For a while now my main focus has been moving mixed precision functionality into PyTorch core. It was merged about a month ago…
  • Automatic Mixed Precision package - torch.amp - PyTorch
    torch.amp provides convenience methods for mixed precision, where some operations use the torch.float32 (float) datatype and other operations use lower…
  • Automatic Mixed Precision — PyTorch Tutorials 1.12.1+cu102…
    Ordinarily, “automatic mixed precision training” uses torch.autocast and torch.cuda.amp.GradScaler together. This recipe measures the performance of a simple…
  • Torch.cuda.amp vs Nvidia apex? - PyTorch Forums
    Yes, apex.amp was our first implementation of mixed-precision training, is deprecated now, and replaced with torch.cuda.amp. @seungjun posted…
  • CUDA Automatic Mixed Precision examples - PyTorch
    Autocasting automatically chooses the precision for GPU operations to improve performance while maintaining accuracy. Instances of torch.cuda.amp.GradScaler…
