[BUG] Inconsistent batch size output
See original GitHub issue.

Thanks for the great package. The output of kornia.augmentation.RandomRotation has an inconsistent batch size when the input batch size is 1:
import torch
from kornia.augmentation import RandomRotation  # version 0.2.0

rotation = RandomRotation((-90., 90.))

x = torch.rand(1, 3, 10, 10)
print(rotation(x).shape)  # torch.Size([3, 10, 10]), expected torch.Size([1, 3, 10, 10])

x = torch.rand(2, 3, 10, 10)
print(rotation(x).shape)  # torch.Size([2, 3, 10, 10]), fine

x = torch.rand(6, 3, 10, 10)
print(rotation(x).shape)  # torch.Size([6, 3, 10, 10]), fine
Is this desired behavior?
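Until the library preserves the batch dimension itself, a caller-side workaround is to restore the leading dimension whenever the transform drops it. The sketch below uses a hypothetical `buggy_rotate` stand-in (kornia is not imported here) that simulates the reported squeeze for B=1; the `rotate_keep_batch` wrapper logic applies equally to the real `RandomRotation`.

```python
import torch

def buggy_rotate(x):
    # Hypothetical stand-in simulating the reported kornia 0.2.0 behavior:
    # the batch dimension is dropped when B == 1.
    return x.squeeze(0) if x.shape[0] == 1 else x

def rotate_keep_batch(x):
    # Workaround: if the transform returned a tensor with one fewer
    # dimension than its input, re-add the batch dimension.
    out = buggy_rotate(x)
    if out.dim() == x.dim() - 1:
        out = out.unsqueeze(0)
    return out

print(rotate_keep_batch(torch.rand(1, 3, 10, 10)).shape)  # torch.Size([1, 3, 10, 10])
print(rotate_keep_batch(torch.rand(2, 3, 10, 10)).shape)  # torch.Size([2, 3, 10, 10])
```

The check on `out.dim()` keeps the wrapper a no-op for batches that are already handled correctly.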
Issue Analytics
- Created 4 years ago
- Comments: 7 (4 by maintainers)
Top GitHub Comments
I think that keeping (B, C, H, W) even for B=1 and/or C=1 is a good convention to enforce.
I agree. I think input and output shape should always be (B, C, H, W), even when B=1 and/or C=1.
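The convention the maintainers agree on can be enforced generically with a small wrapper module that preserves the input rank of any augmentation. This is a sketch, not kornia's actual fix; `KeepBatchDim` is a hypothetical name.

```python
import torch

class KeepBatchDim(torch.nn.Module):
    """Hypothetical wrapper: guarantees the output keeps the input's
    (B, C, H, W) rank even if the wrapped transform squeezes B=1."""

    def __init__(self, transform):
        super().__init__()
        self.transform = transform

    def forward(self, x):
        out = self.transform(x)
        # Restore the batch dimension if the transform dropped it.
        if out.dim() == x.dim() - 1:
            out = out.unsqueeze(0)
        return out

# Usage with a transform that drops the B=1 dimension (simulated here):
aug = KeepBatchDim(lambda x: x.squeeze(0))
print(aug(torch.rand(1, 3, 10, 10)).shape)  # torch.Size([1, 3, 10, 10])
```

Wrapping at the module level keeps the invariant in one place instead of scattering `unsqueeze` calls through the training code.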