
RandomAffine with 2 values for scale specified

See original GitHub issue

šŸ› Bug: RandomAffine and scale

I'm getting the following error when using RandomAffine initialized with a tuple of two values for scale:

TypeError: scale_y should be a float number or a tuple with length 2 whose values between (-inf, inf).Got tensor([]).

I haven't quite pinned down the cause of the bug, but I think it happens when exactly 4 entries of a batch are selected to have the scaling operation applied to them. We then hit this line and enter the _joint_range_check function, which fails because I only specified a tuple of two values for scale. I can confirm I don't run into this issue when I specify 4 values for scale.
This looks related to #714, but I'm running Kornia 0.4.1, which includes that fix, so it may still be relevant.
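The failure mode is visible in the `len(_scale) == 4` check shown in the traceback below: it measures the number of *sampled* values (one per selected batch entry) rather than the length of the user-supplied `scale` tuple. This is a minimal stdlib-only sketch of that logic, not Kornia's actual code:

```python
import random

def buggy_scale_y_range(batch_size, scale):
    # Mimics the shape of kornia 0.4.1's random_affine_generator: _scale
    # holds one sampled value per selected batch entry, but the guard
    # below checks len(_scale) (the samples) instead of len(scale)
    # (the user-supplied range tuple).
    _scale = [random.uniform(scale[0], scale[1]) for _ in range(batch_size)]
    if len(_scale) == 4:            # bug: should be len(scale) == 4
        return scale[2:]            # with a 2-tuple, this slice is empty
    return None

print(buggy_scale_y_range(3, (0.8, 1.4)))  # None: branch skipped
print(buggy_scale_y_range(4, (0.8, 1.4)))  # (): empty slice, rejected downstream
```

With a 2-tuple `scale`, `scale[2:]` is empty, which is exactly what `_joint_range_check` rejects as `Got tensor([])`; with a 4-tuple the slice happens to be a valid `(min, max)` pair, which matches the report that 4 values avoid the error.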

To Reproduce

Essentially, I instantiated a RandomAffine augmentation inside a transform pipeline:

# The only transformation I'm applying to my image prior to stuffing them into the kornia 'function'
transform = transforms.Compose([transforms.ToTensor()])

transform_fcn = torch.nn.Sequential(
    K.augmentation.RandomAffine(degrees=(-45., 45.), 
                                scale=(0.8, 1.4), 
                                shear=(0., 0.15), 
                                return_transform=False, p=1),   # for sake of argument
    K.augmentation.RandomHorizontalFlip(),
    K.augmentation.Normalize(mean=0.5, std=0.5)
)

To emphasize the issue, I instantiated a dataloader with a batch size of 4 and then tried to pass a batch through the augmentation:

loader = DataLoader(train_ds, batch_size=4)
x = next(iter(loader))
out = transform_fcn(x[0])

This resulted in the stack trace below.

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-30-cf0a915755df> in <module>
----> 1 out = transform_fcn(x[0])

~/default-env/lib/python3.8/site-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
    725             result = self._slow_forward(*input, **kwargs)
    726         else:
--> 727             result = self.forward(*input, **kwargs)
    728         for hook in itertools.chain(
    729                 _global_forward_hooks.values(),

~/default-env/lib/python3.8/site-packages/torch/nn/modules/container.py in forward(self, input)
    115     def forward(self, input):
    116         for module in self:
--> 117             input = module(input)
    118         return input
    119 

~/default-env/lib/python3.8/site-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
    725             result = self._slow_forward(*input, **kwargs)
    726         else:
--> 727             result = self.forward(*input, **kwargs)
    728         for hook in itertools.chain(
    729                 _global_forward_hooks.values(),

~/default-env/lib/python3.8/site-packages/kornia/augmentation/base.py in forward(self, input, params, return_transform)
    196             return_transform = self.return_transform
    197         if params is None:
--> 198             params = self.__forward_parameters__(batch_shape, self.p, self.p_batch, self.same_on_batch)
    199         if 'batch_prob' not in params:
    200             params['batch_prob'] = torch.tensor([True] * batch_shape[0])

~/default-env/lib/python3.8/site-packages/kornia/augmentation/base.py in __forward_parameters__(self, batch_shape, p, p_batch, same_on_batch)
     92             batch_prob = batch_prob.repeat(batch_shape[0])
     93         # selectively param gen
---> 94         return self.__selective_param_gen__(batch_shape, batch_prob)
     95 
     96     def apply_func(self, input: torch.Tensor, params: Dict[str, torch.Tensor],

~/default-env/lib/python3.8/site-packages/kornia/augmentation/base.py in __selective_param_gen__(self, batch_shape, to_apply)
     63     def __selective_param_gen__(
     64             self, batch_shape: torch.Size, to_apply: torch.Tensor) -> Dict[str, torch.Tensor]:
---> 65         _params = self.generate_parameters(
     66             torch.Size((int(to_apply.sum().item()), *batch_shape[1:])))
     67         if _params is None:

~/default-env/lib/python3.8/site-packages/kornia/augmentation/augmentation.py in generate_parameters(self, batch_shape)
    483 
    484     def generate_parameters(self, batch_shape: torch.Size) -> Dict[str, torch.Tensor]:
--> 485         return rg.random_affine_generator(
    486             batch_shape[0], batch_shape[-2], batch_shape[-1], self.degrees, self.translate, self.scale, self.shear,
    487             self.same_on_batch)

~/default-env/lib/python3.8/site-packages/kornia/augmentation/random_generator/random_generator.py in random_affine_generator(batch_size, height, width, degrees, translate, scale, shear, same_on_batch)
    173         _scale = _adapted_uniform((batch_size,), scale[0], scale[1], same_on_batch).unsqueeze(1).repeat(1, 2)
    174         if len(_scale) == 4:
--> 175             _joint_range_check(cast(torch.Tensor, scale[2:]), "scale_y")
    176             _scale[:, 1] = _adapted_uniform((batch_size,), scale[2], scale[3], same_on_batch)
    177     else:

~/default-env/lib/python3.8/site-packages/kornia/augmentation/utils/param_validation.py in _joint_range_check(ranged_factor, name, bounds)
     45             raise ValueError(f"{name}[0] should be smaller than {name}[1] got {ranged_factor}")
     46     else:
---> 47         raise TypeError(
     48             f"{name} should be a float number or a tuple with length 2 whose values between {bounds}."
     49             f"Got {ranged_factor}.")

TypeError: scale_y should be a float number or a tuple with length 2 whose values between (-inf, inf).Got tensor([]).

Environment

  • Ubuntu 20.04
  • Tried PyTorch 1.6.0 and 1.7.0 (via pip)
  • Kornia 0.4.1
  • Python 3.8

Issue Analytics

  • State:closed
  • Created 3 years ago
  • Comments:6

Top GitHub Comments

2 reactions
shijianjian commented on Nov 17, 2020

I see the problem. It happens only when batchsize == 4 due to a typo. Fixed in #786.
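The typo is presumably the `len(_scale) == 4` guard quoted in the traceback above. Under that assumption, this is a hedged stdlib-only sketch of the corrected sampling logic, not the actual code in #786:

```python
import random

def sample_scale(batch_size, scale):
    # Sketch of the fixed logic: inspect the user-supplied range tuple,
    # not the per-sample values drawn from it.
    sx = [random.uniform(scale[0], scale[1]) for _ in range(batch_size)]
    per_sample = [[s, s] for s in sx]    # isotropic scale by default
    if len(scale) == 4:                  # fixed: len(scale), not len(_scale)
        for row in per_sample:
            row[1] = random.uniform(scale[2], scale[3])
    return per_sample

# A batch of 4 with a 2-tuple scale no longer enters the scale_y branch
print(len(sample_scale(4, (0.8, 1.4))))  # 4
```

With this change, the branch that validates `scale[2:]` only runs when the user actually passed separate x/y ranges, regardless of how many batch entries were selected.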

0 reactions
busycalibrating commented on Nov 17, 2020

Awesome, thanks for the quick fix!
