
Reproduce a given transform

🚀 Feature

Given the history information saved in the sample for a given transform, how can I apply the exact same transform to a new volume?

Motivation

After doing a lot of data augmentation for training and validation, I would like to visually check some specific volumes (the one with the worst loss, for instance). Since saving all the transformed volumes can become very costly in terms of disk space, one prefers to save the transform's parameters and keep the possibility of re-creating the exact same transformed volume.

Pitch

I implemented this once for RandomElasticDeformation:

    def apply_given_transform(self, sample: Subject, bspline_params) -> Subject:
        # Re-apply previously drawn B-spline parameters to every image.
        for image_dict in sample.get_images(intensity_only=False):
            # Label maps must be resampled with nearest-neighbor interpolation.
            if image_dict[TYPE] == LABEL:
                interpolation = Interpolation.NEAREST
            else:
                interpolation = self.interpolation
            image_dict[DATA] = self.apply_bspline_transform(
                image_dict[DATA],
                image_dict[AFFINE],
                bspline_params,
                interpolation,
            )
        return sample
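
A hypothetical usage sketch of the method above (how the parameters get recorded in the subject's history is illustrative, not actual torchio API):

    import copy

    transform = RandomElasticDeformation()
    augmented = transform(subject)  # ordinary random augmentation
    # Assumption: the drawn parameters were recorded during augmentation.
    bspline_params = augmented.history[-1]['bspline_params']

    # Replay the exact same deformation on a fresh copy of the subject.
    reproduced = transform.apply_given_transform(
        copy.deepcopy(subject), bspline_params)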

Alternatives

Maybe a more generic way would be to separate the apply_transform code into two distinct parts, one to get the random parameters and one to apply them, so that the apply code could be reused in both cases …
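
A minimal sketch of that split, using a toy transform rather than the real RandomElasticDeformation (the method names are suggestions, not existing torchio API):

    import torch

    class RandomScale:
        def get_params(self):
            # Part 1: draw the random parameters only.
            return {'factor': torch.empty(1).uniform_(0.5, 2.0).item()}

        def apply_params(self, tensor, params):
            # Part 2: apply given parameters deterministically, so the
            # same params always reproduce the same output.
            return tensor * params['factor']

        def __call__(self, tensor):
            # The random path reuses the deterministic path.
            params = self.get_params()
            return self.apply_params(tensor, params), params

    transform = RandomScale()
    out, params = transform(torch.ones(3))
    same = transform.apply_params(torch.ones(3), params)  # exact reproduction
    assert torch.equal(out, same)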

Additional context

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 1
  • Comments: 55 (55 by maintainers)

Top GitHub Comments

1 reaction
fepegar commented, Jul 13, 2020

Yes, we discussed the difficulty of saving a large number of parameters; this was the reason why you removed them from the sample dictionary (arguing that it would make torch collate's job too complicated, although it was working …)

The parameters were not being saved in the batch! That has been added in #226.

I think there are two distinct objectives overlapping in this discussion:

1. having a way to reproduce any given transform;
2. having a way to log / access each transform's parameters for further use: for instance to train a regression task, to log them in order to review a given experiment, or to study how prediction bias correlates with a transform's parameters.

I thought it was more convenient to have functionality 2 directly in torchio (as a user option), but having a way to program it outside torchio is fine with me (get_parameters_from_seed should make it work). I agree reproducibility is important (though rarely used), but I am not sure most users will care about getting the transform parameters. My point 2 is not about reproducibility; it is needed to look at the bias a transform induces on model predictions. (OK, nobody is doing that today; people only want the data augmentation and trust the deep model to handle it, but I think it is important for better understanding model generalization …)

I think the current implementation is a good compromise between how much information must be stored to reproduce/extract parameters and the amount of work required from the user.
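
As an aside, a tiny sketch of the analysis described in point 2 above, with made-up numbers standing in for values that would in practice be read from each subject's history:

    import torch

    # Made-up (transform parameter, per-subject loss) pairs.
    params = torch.tensor([0.1, 0.5, 0.9, 1.3, 1.7])
    losses = torch.tensor([0.20, 0.24, 0.31, 0.35, 0.42])

    # Pearson correlation between the parameter and the loss.
    corr = torch.corrcoef(torch.stack([params, losses]))[0, 1]
    print(f'Parameter/loss correlation: {corr.item():.2f}')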

By the way, is having the option to control different random distributions for the parameters (normal, uniform, Poisson, etc.) worth a new issue?

Yes, please.
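
For context, a minimal sketch of what user-selectable parameter distributions could look like, built on torch.distributions (the make_sampler interface is hypothetical, not torchio API):

    import torch
    from torch import distributions as D

    def make_sampler(name, **kwargs):
        # Build a distribution for a transform parameter (hypothetical API).
        if name == 'uniform':
            return D.Uniform(kwargs['low'], kwargs['high'])
        if name == 'normal':
            return D.Normal(kwargs['mean'], kwargs['std'])
        if name == 'poisson':
            return D.Poisson(kwargs['rate'])
        raise ValueError(f'Unknown distribution: {name}')

    # A transform could then draw, e.g., its displacement magnitude:
    sampler = make_sampler('normal', mean=0.0, std=2.0)
    magnitude = sampler.sample().item()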

1 reaction
GReguig commented, Jul 9, 2020

How about randomly generating a seed for a given transform (and saving it), then setting it specifically for this transform and the given subject? This would only require saving the generated seed and resetting it at the beginning of the transform.

I would imagine something like this (the key point is to systematically reset the seed manually):

    import torch

    torch.manual_seed(25074390)


    def gen_seed():
        """Generate a random seed, kept small to avoid overflow."""
        return torch.randint(0, 10**8, (1,)).item()


    class RandomMultiply:

        def get_params(self):
            # Reset the RNG so the drawn number is fully determined by the seed.
            torch.manual_seed(self.seed)
            random_number = torch.rand(1).item()
            return self.seed, random_number

        def __call__(self, x, seed=None):
            # Draw a fresh seed per call; a default argument of gen_seed()
            # would be evaluated only once, making every call identical.
            self.seed = gen_seed() if seed is None else seed
            seed, random_number = self.get_params()
            x.n = random_number
            random_params = {
                'seed': seed,
                'random_number': random_number,
            }
            x.history.append(random_params)
            return x


    class Number:
        def __init__(self):
            self.n = 100
            self.history = []


    def main():
        number = Number()
        num_iterations = 1000
        transform = RandomMultiply()
        results = []
        for _ in range(num_iterations):
            number = transform(number)
            results.append(number.n)

        print('I love this result:', results[600])
        random_params = number.history[600]
        print('The number was {random_number} and the seed was {seed}'.format(**random_params))

        # Re-create the transform output from the saved seed alone.
        new_number = Number()
        reproducer = RandomMultiply()
        reproduction = reproducer(new_number, seed=random_params['seed'])
        print('\nReproduced result:', reproduction.n)
        print('Random parameters:', reproduction.history[0])
        torch.manual_seed(reproduction.history[0]['seed'])
        print('A reproducible random number:', torch.rand(1).item())


    if __name__ == '__main__':
        main()

Would it work?
