SSIM update does not work for the last batch (if batch size is different)
See original GitHub issue

🐛 Bug description
Environment
- PyTorch Version: 1.10.1
- Ignite Version: 0.4.8
- OS: Ubuntu
- How you installed Ignite (conda, pip, source):
- Python version: 3.9
- Any other relevant information:
If the current batch has a different size than the previous batches (as typically happens with the last batch of an epoch), the SSIM update throws an error. A screenshot of the error is attached, along with a screenshot of the minimal code that triggers it.
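To make the trigger concrete: with `drop_last=False` (the PyTorch `DataLoader` default), the final batch is smaller whenever the dataset size is not a multiple of the batch size, so the metric's `update` suddenly receives tensors with a different leading dimension. The sketch below is a hypothetical helper (not part of Ignite or PyTorch) that just computes the sequence of batch sizes a loader would produce, mirroring the `drop_last` behavior; passing `drop_last=True` to the real `DataLoader` is a common workaround because it keeps every batch the same size.

```python
def batch_sizes(n_samples, batch_size, drop_last=False):
    """Return the batch sizes a DataLoader-like iterator would yield.

    Mirrors torch.utils.data.DataLoader semantics: full batches first,
    then one smaller remainder batch unless drop_last is True.
    """
    sizes = [batch_size] * (n_samples // batch_size)
    remainder = n_samples % batch_size
    if remainder and not drop_last:
        sizes.append(remainder)  # this uneven final batch triggers the bug
    return sizes

print(batch_sizes(10, 4))                  # [4, 4, 2] -- last batch differs
print(batch_sizes(10, 4, drop_last=True))  # [4, 4]    -- uniform batches
```

With `drop_last=True` you lose the remainder samples from each epoch, so it is a workaround rather than a fix; the metric itself should accept a final batch of a different size.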
Issue Analytics
- State:
- Created: a year ago
- Comments: 7 (1 by maintainers)
Hey @vfdev-5, I am on holiday for the next 15 days (it was a sudden plan). Please feel free to unassign me if its priority has increased.
@vinayak015 yes, thanks for offering to help. Let me outline here how I would tackle the issue.
Please take a look at our contributing guide for more info: https://github.com/pytorch/ignite/blob/master/CONTRIBUTING.md