RandomAffine with 2 values for scale specified
Bug: RandomAffine and scale
I'm getting the following error when using RandomAffine initialized with a tuple of two values for scale:
TypeError: scale_y should be a float number or a tuple with length 2 whose values between (-inf, inf).Got tensor([]).
I haven't quite pinned down the cause of the bug, but I think it happens when exactly 4 entries of a batch are selected to have the scaling operation applied to them. We then hit this line and enter the _joint_range_check function, which fails because I only specified a tuple of two values for scale. I can confirm I don't run into this issue when I specify 4 values for scale.
This looks related to #714, but I'm running Kornia 0.4.1, which already includes that fix, so this may still be relevant.
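Reading the traceback, the branch in random_affine_generator tests len(_scale), i.e. the length of the batch-sized tensor of sampled scale factors, rather than the length of the user-supplied scale tuple, which is presumably why a batch of exactly 4 trips it. Below is a minimal, self-contained paraphrase of that logic (not the actual Kornia source; names and shapes are inferred from the traceback) just to show why batch size 4 is special:

import torch

def sample_scale(batch_size, scale):
    # Paraphrase of kornia 0.4.1's random_affine_generator scale handling
    # (inferred from the traceback above, not copied from the library).
    # _scale has shape (batch_size, 2), so len(_scale) == batch_size.
    _scale = torch.empty(batch_size).uniform_(scale[0], scale[1]).unsqueeze(1).repeat(1, 2)
    if len(_scale) == 4:  # fires whenever the batch has 4 samples, not when 4 scale values were given
        scale_y = torch.tensor(scale[2:])  # empty for a 2-tuple -> "Got tensor([])"
        if len(scale_y) != 2:
            raise TypeError(f"scale_y should be a float number or a tuple with length 2. Got {scale_y}.")
        _scale[:, 1] = torch.empty(batch_size).uniform_(scale[2], scale[3])
    return _scale

sample_scale(8, (0.8, 1.4))  # fine: batch size != 4 skips the broken branch
sample_scale(4, (0.8, 1.4))  # raises, purely because the batch happens to have 4 entries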
To Reproduce
Essentially, I instantiated the RandomAffine class:
import torch
import kornia as K
from torchvision import transforms

# The only transformation I'm applying to my images prior to stuffing them into the kornia 'function'
transform = transforms.Compose([transforms.ToTensor()])

transform_fcn = torch.nn.Sequential(
    K.augmentation.RandomAffine(degrees=(-45., 45.),
                                scale=(0.8, 1.4),
                                shear=(0., 0.15),
                                return_transform=False, p=1),  # for sake of argument
    K.augmentation.RandomHorizontalFlip(),
    K.augmentation.Normalize(mean=0.5, std=0.5),
)
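For reference, the workaround noted above is to pass scale as four values; a sketch of the same augmentation with that change (same placeholder values, and assuming the four values are read as the x range followed by the y range, as the generator code in the traceback suggests):

# Workaround: give scale as four values (x_min, x_max, y_min, y_max) so the
# four-value branch gets a real scale_y range instead of an empty tensor.
aug_4 = K.augmentation.RandomAffine(degrees=(-45., 45.),
                                    scale=(0.8, 1.4, 0.8, 1.4),
                                    shear=(0., 0.15),
                                    return_transform=False, p=1)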
To surface the issue, I instantiated a DataLoader with a batch size of 4 and then tried to pass a batch through the augmentation:
from torch.utils.data import DataLoader

loader = DataLoader(train_ds, batch_size=4)
x = next(iter(loader))
out = transform_fcn(x[0])  # x[0] is the image batch
This resulted in the stack trace below.
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-30-cf0a915755df> in <module>
----> 1 out = transform_fcn(x[0])
~/default-env/lib/python3.8/site-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
725 result = self._slow_forward(*input, **kwargs)
726 else:
--> 727 result = self.forward(*input, **kwargs)
728 for hook in itertools.chain(
729 _global_forward_hooks.values(),
~/default-env/lib/python3.8/site-packages/torch/nn/modules/container.py in forward(self, input)
115 def forward(self, input):
116 for module in self:
--> 117 input = module(input)
118 return input
119
~/default-env/lib/python3.8/site-packages/torch/nn/modules/module.py in _call_impl(self, *input, **kwargs)
725 result = self._slow_forward(*input, **kwargs)
726 else:
--> 727 result = self.forward(*input, **kwargs)
728 for hook in itertools.chain(
729 _global_forward_hooks.values(),
~/default-env/lib/python3.8/site-packages/kornia/augmentation/base.py in forward(self, input, params, return_transform)
196 return_transform = self.return_transform
197 if params is None:
--> 198 params = self.__forward_parameters__(batch_shape, self.p, self.p_batch, self.same_on_batch)
199 if 'batch_prob' not in params:
200 params['batch_prob'] = torch.tensor([True] * batch_shape[0])
~/default-env/lib/python3.8/site-packages/kornia/augmentation/base.py in __forward_parameters__(self, batch_shape, p, p_batch, same_on_batch)
92 batch_prob = batch_prob.repeat(batch_shape[0])
93 # selectively param gen
---> 94 return self.__selective_param_gen__(batch_shape, batch_prob)
95
96 def apply_func(self, input: torch.Tensor, params: Dict[str, torch.Tensor],
~/default-env/lib/python3.8/site-packages/kornia/augmentation/base.py in __selective_param_gen__(self, batch_shape, to_apply)
63 def __selective_param_gen__(
64 self, batch_shape: torch.Size, to_apply: torch.Tensor) -> Dict[str, torch.Tensor]:
---> 65 _params = self.generate_parameters(
66 torch.Size((int(to_apply.sum().item()), *batch_shape[1:])))
67 if _params is None:
~/default-env/lib/python3.8/site-packages/kornia/augmentation/augmentation.py in generate_parameters(self, batch_shape)
483
484 def generate_parameters(self, batch_shape: torch.Size) -> Dict[str, torch.Tensor]:
--> 485 return rg.random_affine_generator(
486 batch_shape[0], batch_shape[-2], batch_shape[-1], self.degrees, self.translate, self.scale, self.shear,
487 self.same_on_batch)
~/default-env/lib/python3.8/site-packages/kornia/augmentation/random_generator/random_generator.py in random_affine_generator(batch_size, height, width, degrees, translate, scale, shear, same_on_batch)
173 _scale = _adapted_uniform((batch_size,), scale[0], scale[1], same_on_batch).unsqueeze(1).repeat(1, 2)
174 if len(_scale) == 4:
--> 175 _joint_range_check(cast(torch.Tensor, scale[2:]), "scale_y")
176 _scale[:, 1] = _adapted_uniform((batch_size,), scale[2], scale[3], same_on_batch)
177 else:
~/default-env/lib/python3.8/site-packages/kornia/augmentation/utils/param_validation.py in _joint_range_check(ranged_factor, name, bounds)
45 raise ValueError(f"{name}[0] should be smaller than {name}[1] got {ranged_factor}")
46 else:
---> 47 raise TypeError(
48 f"{name} should be a float number or a tuple with length 2 whose values between {bounds}."
49 f"Got {ranged_factor}.")
TypeError: scale_y should be a float number or a tuple with length 2 whose values between (-inf, inf).Got tensor([]).
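For what it's worth, the dataset itself shouldn't matter here; I'd expect any float image batch with exactly 4 entries to take the same code path. A sketch under that assumption (image size made up):

import torch
import kornia as K

aug = K.augmentation.RandomAffine(degrees=(-45., 45.), scale=(0.8, 1.4),
                                  shear=(0., 0.15), return_transform=False, p=1)
x = torch.rand(4, 3, 64, 64)  # batch of exactly 4 random images
out = aug(x)                  # expected to raise the same scale_y TypeError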
Environment
- Ubuntu 20.04
- Tried PyTorch 1.6.0 and 1.7.0 (via pip)
- Kornia 0.4.1
- Python 3.8
Top GitHub Comments
I see the problem. It happens only when batch_size == 4, due to a typo. Fixed in #786.
Awesome, thanks for the quick fix!