Image priming gives "The size of tensor a (224) must match the size of tensor b (512) at non-singleton dimension 3"
Hi, unless I'm missing something obvious, it looks like image priming no longer works? E.g.
imagine "A pizza on fire" --open_folder=False --start_image_path=./samples/prime-orig.jpg
siren_pytorch/siren_pytorch.py:101: UserWarning: Using a target size (torch.Size([1, 3, 512, 512])) that is different to the input size (torch.Size([1, 3, 224, 224])) …
RuntimeError: The size of tensor a (224) must match the size of tensor b (512) at non-singleton dimension 3
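For context, the error itself is just PyTorch refusing to broadcast a 224-pixel tensor against a 512-pixel one. A minimal standalone sketch (not the repo's code; the tensors are placeholders with the shapes taken from the log above) that reproduces both the warning and the RuntimeError:

import torch
import torch.nn.functional as F

# Placeholder tensors with the shapes reported in the log above.
out = torch.rand(1, 3, 224, 224)     # matches the "input size" in the warning
target = torch.rand(1, 3, 512, 512)  # matches the "target size" in the warning
F.mse_loss(out, target)              # UserWarning, then RuntimeError at non-singleton dimension 3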
Top GitHub Comments
In the PR above (#103) I fixed the error that stopped it from running at all, but the generated images come out completely white for large start_image_steps. The repo already had that same issue when I first checked it out, so we are at least back to the old non-working version.
@nerdyrodent what you can do instead is to pass your image as img to the Imagine class, along with your text. Then Deepdaze will optimize for your image's features too. That's quite different from image priming, but still interesting.

Same error here every time. It can be worked around by using --image_width=224. I'm not that great at coding, so I'm still looking through the rest of the code to see how it handles different image sizes.
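A rough sketch of that img suggestion in Python, assuming the Imagine class accepts the same options as the CLI flags (text, img, open_folder) and reusing the sample path from the report above as a placeholder:

from deep_daze import Imagine

# Sketch only: parameter names assume the Python API mirrors the CLI flags,
# and the image path is the placeholder from the original report.
imagine = Imagine(
    text="A pizza on fire",
    img="./samples/prime-orig.jpg",  # image whose features are optimized alongside the text
    open_folder=False,
)
imagine()

For the CLI, the workaround mentioned above amounts to rerunning the original imagine command with --image_width=224 added, presumably so both tensor sizes end up at 224.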