
Discriminative loss error: IndexError: Dimension out of range (expected to be in range of [-1, 0], but got 1)


Hi,

When running python3 lanenet/train.py --dataset ./data/training_data_example, I’m seeing the following exception:

Traceback (most recent call last):
  File "lanenet/train.py", line 156, in <module>
    main()
  File "lanenet/train.py", line 144, in main
    train_iou = train(train_loader, model, optimizer, epoch)
  File "lanenet/train.py", line 68, in train
    total_loss, binary_loss, instance_loss, out, train_iou = compute_loss(net_output, binary_label, instance_label)
  File "/usr/local/lib/python3.6/dist-packages/lanenet-0.1.0-py3.6.egg/lanenet/model/model.py", line 75, in compute_loss
  File "/home/lashar/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 722, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/lanenet-0.1.0-py3.6.egg/lanenet/model/loss.py", line 33, in forward
  File "/usr/local/lib/python3.6/dist-packages/lanenet-0.1.0-py3.6.egg/lanenet/model/loss.py", line 71, in _discriminative_loss
  File "/home/lashar/.local/lib/python3.6/site-packages/torch/functional.py", line 1100, in norm
    return _VF.frobenius_norm(input, _dim, keepdim=keepdim)
IndexError: Dimension out of range (expected to be in range of [-1, 0], but got 1)

Any clues about how I can fix this? Not sure if I’m doing something incorrectly.

Thanks!

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Reactions: 1
  • Comments: 11 (1 by maintainers)

Top GitHub Comments

4 reactions
mummy2358 commented, May 6, 2021

Managed to fix it. It should be dim=1: the norm should be calculated along the “embedding” axis. The problem comes from

embedding_i = embedding_b[seg_mask_i]

which breaks the dims and leaves a one-dimensional vector.

Changing it to embedding_i = embedding_b * seg_mask_i

works for me.

2 reactions
sharmalakshay93 commented, Jan 5, 2021

PS: I just saw issues/12 and its related commit. Changing it back to dim=0 makes it work. However, since the aforementioned issue says the correct value is dim=1, I’m not sure whether this works as intended. I’d appreciate it if someone could clarify!

Thanks!


