Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Pathology Transforms inherit `torch.nn.Module`

See original GitHub issue

Is your feature request related to a problem? Please describe.
While we are working on improving digital pathology pipelines, we need to chain our transforms with torchvision transforms and run them on the GPU. For some reason, implementing this with ToDevice and the TorchVision wrapper transform did not show an improvement, whereas chaining the same torchvision transforms with the model (using nn.Sequential) did show a speedup.
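For illustration, here is a minimal sketch of the nn.Sequential chaining described above, assuming the tensor-capable transforms available in torchvision >= 0.8; resnet18 and the specific transforms are stand-ins for the actual pathology model and pipeline, not the setup that was benchmarked in the issue:

```python
import torch
import torch.nn as nn
from torchvision import transforms as T
from torchvision.models import resnet18

# Chain GPU-capable torchvision transforms with the model so that both
# run on the same device over whole batches.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

preprocess = nn.Sequential(
    T.Resize(224),
    T.CenterCrop(224),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
)

model = nn.Sequential(preprocess, resnet18()).to(device).eval()

# A batch of float tensors already on the GPU (shape N, C, H, W).
batch = torch.rand(8, 3, 256, 256, device=device)
with torch.no_grad():
    out = model(batch)
print(out.shape)  # torch.Size([8, 1000])
```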

Describe the solution you’d like
Implement #2883 for all pathology transforms:

  1. Derive all transforms from `torch.nn.Module` instead of `Transform`.
  2. Implement `forward` for all transforms instead of `__call__`.
  3. Make all of them work with `torch.Tensor` inputs.

This solution should not have any impact on the rest of MONAI and should work with full backward compatibility; a minimal sketch of the proposed style is shown below.
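A toy sketch of steps 1–3 above (AddOffset is a made-up example transform, not an actual MONAI transform). The backward-compatibility point follows from the fact that `nn.Module.__call__` dispatches to `forward`, so code that calls the transform like a plain callable keeps working:

```python
import torch
import torch.nn as nn

class AddOffset(nn.Module):
    """Toy transform in the proposed style: subclass nn.Module and
    implement forward() instead of __call__()."""

    def __init__(self, offset: float = 1.0) -> None:
        super().__init__()
        self.offset = offset

    def forward(self, img: torch.Tensor) -> torch.Tensor:
        return img + self.offset

# Existing call sites remain valid because nn.Module.__call__ runs forward():
t = AddOffset(0.5)
x = torch.zeros(2, 3, 4, 4)
assert torch.allclose(t(x), x + 0.5)

# ...and the transform can now be chained with a model and moved to the GPU:
pipeline = nn.Sequential(AddOffset(0.5), nn.Conv2d(3, 8, 3))
if torch.cuda.is_available():
    pipeline = pipeline.cuda()
```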

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments: 5 (5 by maintainers)

Top GitHub Comments

1 reaction
drbeh commented, Sep 3, 2021

@Nic-Ma @wyli, as we discussed in our meeting, the main benefit that it can provide us with is to chain the transforms with the model (using nn.Sequential) and run it on the GPU.

It seems that the main reason we are getting more speedup this way is that the transforms are applied to the whole batch of images instead of to individual ones. I will look into this further to see why using ToDevice and Compose did not give us a significant speedup, and will keep you posted.
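A rough micro-benchmark illustrating the batch-vs-per-sample point made above; the tensors and the transform here are placeholders rather than the actual MONAI pipeline being profiled, and results will vary by hardware:

```python
import time
import torch
from torchvision import transforms as T

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
normalize = T.Normalize(mean=[0.5] * 3, std=[0.5] * 3)
batch = torch.rand(64, 3, 224, 224, device=device)

def timed(fn):
    # Synchronize around the call so GPU kernels are fully counted.
    if device.type == "cuda":
        torch.cuda.synchronize()
    start = time.perf_counter()
    fn()
    if device.type == "cuda":
        torch.cuda.synchronize()
    return time.perf_counter() - start

# Per-sample application (roughly what a per-item Compose pipeline does).
per_sample = timed(lambda: [normalize(img) for img in batch])

# Batched application (what chaining the transform with the model enables).
batched = timed(lambda: normalize(batch))

print(f"per-sample: {per_sample:.4f}s  batched: {batched:.4f}s")
```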

0 reactions
Nic-Ma commented, Sep 3, 2021

Hi @drbeh ,

I think we also need to check the speedup when achieving the same target metrics.

Thanks.

Read more comments on GitHub >

Top Results From Across the Web

Why most torchvision.transforms inherit nn.Module? - vision
They inherit from nn.Module for two key reasons: it makes the transforms runnable on GPUs, and it makes them compatible with TorchScript. Unless ...
Read more >
[feature request] Derive all transforms classes from nn.Module?
I think there is a lot of merit to make all classes inherit from nn.Module. The new Transforms API is the place to...
Read more >
Applications — MONAI 1.1.0 Documentation
MedNISTDataset(root_dir, section, transform=(), download=False, seed=0, ... See also: https://pytorch.org/docs/stable/generated/torch.nn.functional.
Read more >
torch.nn.modules.module — transformers 4.4.2 documentation
Source code for torch.nn.modules.module. from collections import OrderedDict, namedtuple import ...
Read more >
Custom Dataset and Dataloader in PyTorch - DebuggerCafe
import torch.nn as nn ... as F import torch.optim as optim from torchvision.transforms import transforms from torch.utils.data ... Module):.
Read more >
