Integrate PyTorch Profiler into Catalyst
🚀 Feature Request
Let's add support for profiling functions using the PyTorch profiler.
Motivation
Help users understand whether there are bottlenecks in their code.
Proposal
Possible Use Case
import torch
from torch.utils.data import DataLoader, TensorDataset
from catalyst import dl

# data
num_samples, num_features = int(1e4), int(1e1)
X, y = torch.rand(num_samples, num_features), torch.rand(num_samples)
dataset = TensorDataset(X, y)
loader = DataLoader(dataset, batch_size=32, num_workers=1)
loaders = {"train": loader, "valid": loader}

# model, criterion, optimizer, scheduler
model = torch.nn.Linear(num_features, 1)  # any model works here
criterion = torch.nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters())
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, [3, 6])

# model training
runner = dl.SupervisedRunner()
profiler = torch.profiler.profile(
    # skip the first 2 steps, warm up for 2, record the next 6, run the cycle once
    schedule=torch.profiler.schedule(wait=2, warmup=2, active=6, repeat=1),
    on_trace_ready=torch.profiler.tensorboard_trace_handler("./logdir"),
    with_stack=True,
)
runner.train(
    model=model,
    criterion=criterion,
    optimizer=optimizer,
    scheduler=scheduler,
    loaders=loaders,
    logdir="./logdir",
    valid_loader="valid",
    valid_metric="loss",
    minimize_valid_metric=True,
    num_epochs=8,
    profiler=profiler,  # proposed new argument
    verbose=True,
)
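With `tensorboard_trace_handler`, the collected traces are written to the log directory and can then be inspected with the TensorBoard profiler plugin (`pip install torch_tb_profiler`, then `tensorboard --logdir ./logdir`).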
Alternatives
We could also integrate the profiler into other parts of IRunner.
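As a rough illustration, one such alternative could be a dedicated callback that drives the profiler from runner events. The sketch below assumes Catalyst's Callback/CallbackOrder API and a runner.loader_key attribute; ProfilerCallback is a hypothetical name, not an existing Catalyst class:

import torch
from catalyst.core import Callback, CallbackOrder

class ProfilerCallback(Callback):
    """Hypothetical callback that drives a torch.profiler.profile object."""

    def __init__(self, profiler: torch.profiler.profile, loader_key: str = "train"):
        super().__init__(order=CallbackOrder.External)
        self.profiler = profiler
        self.loader_key = loader_key

    def on_loader_start(self, runner):
        if runner.loader_key == self.loader_key:
            self.profiler.__enter__()

    def on_batch_end(self, runner):
        if runner.loader_key == self.loader_key:
            self.profiler.step()  # advance the wait/warmup/active schedule

    def on_loader_end(self, runner):
        if runner.loader_key == self.loader_key:
            self.profiler.__exit__(None, None, None)

It could then be passed as runner.train(..., callbacks=[ProfilerCallback(profiler)]) instead of a dedicated profiler argument.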
Additional context
I don't know whether older versions of PyTorch (<1.8.1) allow you to pass a profiler as the context manager to be used.
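For reference, a hedged sketch of how version support could be handled, assuming the legacy torch.autograd.profiler.profile context manager as the fallback (it has no schedule/step() mechanism, so results would be read via key_averages() after the run):

import torch

try:
    import torch.profiler  # available from PyTorch 1.8.1 onwards
    profiler = torch.profiler.profile(
        schedule=torch.profiler.schedule(wait=2, warmup=2, active=6, repeat=1),
        on_trace_ready=torch.profiler.tensorboard_trace_handler("./logdir"),
    )
except ImportError:
    # legacy API: a plain context manager without schedule()/step()
    profiler = torch.autograd.profiler.profile(use_cuda=torch.cuda.is_available())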
Checklist
- feature proposal description
- motivation
- extra proposal context / proposal alternatives review
FAQ
Please review the FAQ before submitting an issue:
- I have read the documentation and FAQ
- I have reviewed the minimal examples section
- I have checked the changelog for main framework updates
- I have read the contribution guide
- I have joined Catalyst slack (#__questions channel) for issue discussion
Top GitHub Comments
That's the easy version to play with. The more annoying bit of what I envision is passing the profiler rather than a flag, in order to have it be fully configurable. This means that we validate the profiler after it's passed, but that's not too bad either.
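A minimal sketch of such a check, assuming the profiler is passed as a torch.profiler.profile instance (the helper name is hypothetical, not existing Catalyst code):

import torch

def _validate_profiler(profiler):
    """Reject anything that is not a torch.profiler.profile instance (or None)."""
    if profiler is not None and not isinstance(profiler, torch.profiler.profile):
        raise TypeError(
            "`profiler` is expected to be a torch.profiler.profile instance, "
            f"got {type(profiler).__name__}"
        )
    return profiler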
Dear @ssktotoro, thanks for the issue! Am I correct that we could do something like this? So, just make a code injection here or here?
The example is taken from the PyTorch docs https://pytorch.org/docs/master/profiler.html.
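For context, the usage pattern in those docs looks roughly like the paraphrased sketch below (toy data and model are stand-ins, mirroring the proposal above): the profiler is entered as a context manager and prof.step() is called once per batch, which is essentially the call that would need to be injected into the runner's batch loop.

import torch
from torch.utils.data import DataLoader, TensorDataset

# toy data and model
X, y = torch.rand(1000, 10), torch.rand(1000, 1)
loader = DataLoader(TensorDataset(X, y), batch_size=32)
model = torch.nn.Linear(10, 1)
criterion = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

with torch.profiler.profile(
    schedule=torch.profiler.schedule(wait=1, warmup=1, active=2),
    on_trace_ready=torch.profiler.tensorboard_trace_handler("./logdir"),
) as prof:
    for features, targets in loader:   # the runner's batch loop
        optimizer.zero_grad()
        loss = criterion(model(features), targets)
        loss.backward()
        optimizer.step()
        prof.step()                    # advance the profiler schedule each batch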