
L1 loss or SmoothL1Loss?

See original GitHub issue

Hi, I’ve been reading through the code and found that L1 loss is used instead of Smooth L1 loss for the localization loss. This differs from the paper’s procedure: as far as I know, SSD uses Smooth L1 loss.

https://github.com/sgrvinod/a-PyTorch-Tutorial-to-Object-Detection/blob/master/model.py#L549

self.smooth_l1 = nn.L1Loss()

https://github.com/sgrvinod/a-PyTorch-Tutorial-to-Object-Detection/blob/master/model.py#L612

loc_loss = self.smooth_l1(predicted_locs[positive_priors], true_locs[positive_priors]) # (), scalar
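For context, the two losses are easy to compare in isolation. This is a minimal sketch, not code from the tutorial; the tensor values are made up to stand in for `predicted_locs[positive_priors]` and `true_locs[positive_priors]`:

```python
import torch
import torch.nn as nn

# Hypothetical encoded box offsets; target is zero for illustration only.
predicted = torch.tensor([[0.5, 0.2, -1.3, 0.1]])
target = torch.zeros_like(predicted)

l1 = nn.L1Loss()(predicted, target)               # mean of |x|
smooth_l1 = nn.SmoothL1Loss()(predicted, target)  # quadratic for |x| < 1, linear beyond

print(l1.item())         # 0.525
print(smooth_l1.item())  # 0.2375
```

Note how the large residual (-1.3) contributes the same to both, while the small residuals are damped by the quadratic region of Smooth L1, which is the usual argument for it in box regression.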


My questions are:

  1. Has anyone tried swapping the loss function for SmoothL1Loss as implemented in PyTorch?
  2. If so, are the results comparable to what SSD achieves?

Thank you in advance.

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 7

Top GitHub Comments

1 reaction
jonathan016 commented on May 23, 2021

With the experiment results provided by @adityag6994, I believe my questions have been answered. Closing this issue for now. Thanks a lot @adityag6994!

1 reaction
adityag6994 commented on May 23, 2021

That makes sense now. Thank you @jonathan016


Top Results From Across the Web

How to interpret smooth l1 loss? - Cross Validated
Smooth L1-loss can be interpreted as a combination of L1-loss and L2-loss. It behaves as L1-loss when the absolute value of the argument...
Read more >
Smooth L1 Loss - WordPress.com
The Smooth L1 loss is used for doing box regression on some object detection systems (SSD, Fast/Faster RCNN); according to those papers this...
Read more >
Self-Adjusting Smooth L1 Loss Explained | Papers With Code
Self-Adjusting Smooth L1 Loss is a loss function used in object detection that was introduced with RetinaMask. This is an improved version of...
Read more >
Trying to understand PyTorch SmoothL1Loss Implementation
... an element-wise selection instead (if you think about the implementation of the vanilla L1 loss, and the motivation for smooth L1 loss)....
Read more >
Plots of the L1, L2 and smooth L1 loss functions. - ResearchGate
More specifically, smooth L1 uses L2(x) for x ∈ (−1, 1) and shifted L1(x) elsewhere. Fig. 3 depicts the plots of these loss...
Read more >
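The piecewise form described in the snippets above (L2-like inside a small interval, shifted L1 elsewhere) can be checked directly against PyTorch's built-in. This is an illustrative sketch assuming PyTorch's default `beta=1.0`; `smooth_l1_manual` is a hypothetical helper, not part of any library:

```python
import torch
import torch.nn.functional as F

def smooth_l1_manual(x, beta=1.0):
    # Quadratic (L2-like) for |x| < beta, shifted L1 elsewhere,
    # mirroring the piecewise definition quoted above.
    a = x.abs()
    return torch.where(a < beta, 0.5 * a ** 2 / beta, a - 0.5 * beta)

x = torch.linspace(-3, 3, 13)
manual = smooth_l1_manual(x)
builtin = F.smooth_l1_loss(x, torch.zeros_like(x), reduction='none')

print(torch.allclose(manual, builtin))  # the two formulations agree element-wise
```

The `0.5 * beta` shift makes the two branches meet continuously at `|x| = beta`, which is why the function is smooth where plain L1 has a kink.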
