
Image priming gives "The size of tensor a (224) must match the size of tensor b (512) at non-singleton dimension 3"

See original GitHub issue

Hi, unless I’m missing something obvious, it looks like image priming no longer works. For example:

imagine "A pizza on fire" --open_folder=False --start_image_path=./samples/prime-orig.jpg

siren_pytorch/siren_pytorch.py:101: UserWarning: Using a target size (torch.Size([1, 3, 512, 512])) that is different to the input size (torch.Size([1, 3, 224, 224])) …
RuntimeError: The size of tensor a (224) must match the size of tensor b (512) at non-singleton dimension 3
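Reading the traceback, one tensor in siren_pytorch’s MSE loss is 224 px on a side (CLIP’s input resolution) and the other is 512 px (the default image_width), which is also why forcing --image_width=224, as mentioned below, makes them agree. Here is a minimal, self-contained PyTorch sketch of the same failure and of one generic fix; this is illustrative only, not deep-daze’s actual code, and the tensor names and the interpolate-based resize are assumptions:

import torch
import torch.nn.functional as F

out = torch.rand(1, 3, 224, 224)    # e.g. a render at CLIP's 224 px resolution
prime = torch.rand(1, 3, 512, 512)  # e.g. a start image loaded at image_width=512

# F.mse_loss first warns that the target size differs from the input size,
# then broadcasting fails with:
#   RuntimeError: The size of tensor a (224) must match the size of tensor b (512)
#   at non-singleton dimension 3
# loss = F.mse_loss(out, prime)

# Resizing one side to match the other makes the loss well defined again:
prime_small = F.interpolate(prime, size=out.shape[-2:], mode='bilinear', align_corners=False)
loss = F.mse_loss(out, prime_small)
print(loss.item())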

Issue Analytics

  • State: closed
  • Created 3 years ago
  • Comments: 5 (1 by maintainers)

Top GitHub Comments

1 reaction
NotNANtoN commented, Mar 22, 2021

In the PR above (#103) I fixed the error that stopped it from running at all, but the generated images are completely white for large start_image_steps. The repo already had that problem when I first checked it out, so for now we are at least back to the old, non-working version.

@nerdyrodent what you can do instead is to pass your image as img to the Imagine class, along with your text. Then Deepdaze will optimize for your image features too. That’s quite different from image priming, but still interesting.
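If it helps, the suggestion above would look roughly like this. This is a sketch only: text and img are the parameters named in the comment, and anything else depends on your installed deep-daze version.

from deep_daze import Imagine

imagine = Imagine(
    text = "A pizza on fire",
    img = "./samples/prime-orig.jpg",  # also optimize toward this image's features
)
imagine()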

0 reactions
nerdyrodent commented, Mar 20, 2021

Same error every time. It can be worked around by using "--image_width=224". I’m not that great at coding, so I’m still looking through the rest of the code to see how it handles different image sizes.
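For reference, applying that workaround to the command from the original report just means keeping the render resolution at CLIP’s 224 px; only the last flag is new:

imagine "A pizza on fire" --open_folder=False --start_image_path=./samples/prime-orig.jpg --image_width=224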

Read more comments on GitHub >

Top Results From Across the Web

  • PyTorch: RuntimeError: The size of tensor a (224) must match ...
    Right now I am getting errors while calculating the loss. RuntimeError: The size of tensor a (224) must match the size of tensor...
  • RuntimeError: The size of tensor a (224) must match the size ...
    Hello, I'm training a vision transformer on the custom dataset for regression purpose. The predicted size resulted from the network torch.
  • Trainer RuntimeError: The size of tensor a (462) must match ...
    The issue is with your target label sequences. Some of the label sequences have a length that exceeds the model's maximum generation length...
  • How to perform an expand operation in PyTorch?
    If you set a particular dimension as -1, the tensor will not be expanded along this dimension. For example, if we have a...
  • COMSOL - Script - extras
    and the sizes match; all elements with the same i-th index must have the same size along dimension i. Example cell2mat({[1 2...
