
questions about the loss function

See original GitHub issue

Hi Clément Pinard, it's an impressive job you have done in this project, and the code is written beautifully. May I ask you some questions about the loss function? For the photometric_reconstruction_loss, the predicted depth and pose are used to build the sampling grid from ref_imgs to tgt_img, so F.grid_sample can generate tgt_img from ref_imgs. But in theory, we could also generate ref_imgs from tgt_img. Do you have any idea how to realize this operation?
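For context, the inverse-warp direction the question describes can be sketched roughly as follows. This is an illustrative re-implementation, not the repository's exact API: the function name, argument shapes, and the `[R|t]` pose convention are assumptions.

```python
import torch
import torch.nn.functional as F

def inverse_warp(ref_img, tgt_depth, pose_mat, K):
    """Synthesize the target view by sampling the reference image at
    coordinates derived from the target depth and the relative pose.

    ref_img:   [B, 3, H, W]  reference image to sample from
    tgt_depth: [B, H, W]     depth map of the *target* view
    pose_mat:  [B, 3, 4]     relative pose target -> reference, as [R|t]
    K:         [B, 3, 3]     pinhole camera intrinsics
    """
    b, _, h, w = ref_img.shape
    # Pixel grid of the target view, in homogeneous coordinates [3, H*W]
    ys, xs = torch.meshgrid(torch.arange(h, dtype=torch.float32),
                            torch.arange(w, dtype=torch.float32),
                            indexing='ij')
    pix = torch.stack([xs, ys, torch.ones_like(xs)], dim=0).reshape(3, -1)
    pix = pix.unsqueeze(0).expand(b, -1, -1)                    # [B, 3, H*W]

    # Back-project target pixels to 3D camera points: D * K^-1 * p
    cam = K.inverse() @ pix * tgt_depth.reshape(b, 1, -1)
    # Transform to the reference camera frame and project: K * (R X + t)
    cam = pose_mat[:, :, :3] @ cam + pose_mat[:, :, 3:]
    proj = K @ cam
    px = proj[:, 0] / proj[:, 2].clamp(min=1e-3)
    py = proj[:, 1] / proj[:, 2].clamp(min=1e-3)

    # Normalize pixel coordinates to [-1, 1] as grid_sample expects
    gx = 2 * px / (w - 1) - 1
    gy = 2 * py / (h - 1) - 1
    grid = torch.stack([gx, gy], dim=2).reshape(b, h, w, 2)
    return F.grid_sample(ref_img, grid, align_corners=True,
                         padding_mode='zeros')
```

Note that the depth map passed in belongs to the view being *synthesized*, not the view being sampled, which is the crux of the discussion below.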

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 8 (4 by maintainers)

Top GitHub Comments

1 reaction
huagl commented, Mar 30, 2019

I see. I ran into the problems you described when I tried to implement my naive idea. Thanks for your patience~

0 reactions
ClementPinard commented, May 7, 2021

If you do that you won't have any parallax effects: for instance, under a lateral movement, close points move faster on the screen than far points. If you use a constant ref_depth, they all move across the image at the same speed, as if you were looking at a textured plane instead of the real world.
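The parallax point above follows from the pinhole projection: under a pure lateral translation t_x, a point at depth Z shifts on screen by f·t_x/Z pixels. A quick numeric check (the focal length and motion values are illustrative):

```python
# Pixel displacement under a pure lateral camera translation t_x:
# delta_u = f * t_x / Z, i.e. inversely proportional to depth Z.
f, t_x = 500.0, 0.1  # illustrative focal length (px) and lateral motion (m)
shifts = {Z: f * t_x / Z for Z in (2.0, 10.0, 50.0)}  # near, mid, far points
for Z, du in shifts.items():
    print(f"Z = {Z:5.1f} m -> image shift {du:4.1f} px")
# With a constant depth, every point would shift by the same amount,
# exactly the "textured plane" behaviour described above.
```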

I agree that the need for two depth maps is counter-intuitive: it feels like one depth map and one pose vector should give you all the information needed to compute the new depth map, but that is only true for direct warping. The problem is that direct warping is much more difficult to implement as a differentiable operation than inverse warping.

Inverse warping needs the depth map corresponding to the new pose viewpoint, and the thing you are trying to warp (be it image or depth) corresponding to the original pose viewpoint. That way, contrary to direct warping, each point of the warped image/depth has a coordinate in the source image, and the gradient of pixel color with respect to coordinates is very easy to compute. I wrote a piece on that in my thesis (https://pastel.archives-ouvertes.fr/tel-02285215/document, chapter 4).
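The "easy gradient" property can be checked directly: F.grid_sample is differentiable with respect to the sampling coordinates, which is what makes inverse warping trainable end to end. A minimal sketch (all shapes here are illustrative):

```python
import torch
import torch.nn.functional as F

# grid_sample propagates gradients back to the sampling grid itself.
img = torch.rand(1, 1, 8, 8)                          # source image
grid = torch.zeros(1, 4, 4, 2, requires_grad=True)    # all samples at center
warped = F.grid_sample(img, grid, align_corners=True)
warped.sum().backward()
# grid.grad now holds d(pixel color)/d(coordinate) for every sample,
# obtained from the bilinear interpolation weights.
```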

If you are looking for direct-warp based optimization, you can have a look at PyTorch3D. It implements 2D image optimization based on differentiable rendering of 3D primitives. You just need to convert your depth map to a 3D mesh (which should be easy) and plug in their solution.
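The depth-map-to-mesh conversion mentioned above can be sketched as back-projecting each pixel to a vertex and connecting neighbouring pixels with two triangles per quad. The function name and conventions below are illustrative, not PyTorch3D's API:

```python
import numpy as np

def depth_to_mesh(depth, K):
    """Back-project a depth map into mesh vertices and triangle faces.

    depth: [H, W] depth map
    K:     [3, 3] pinhole intrinsics
    Returns vertices [H*W, 3] and faces [2*(H-1)*(W-1), 3] (vertex indices).
    """
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    pix = np.stack([xs, ys, np.ones_like(xs)], axis=0).reshape(3, -1)
    # X = D * K^-1 * p, one 3D vertex per pixel
    verts = (np.linalg.inv(K) @ pix * depth.reshape(1, -1)).T
    # Two triangles per pixel quad, indexing into the flattened grid
    idx = np.arange(h * w).reshape(h, w)
    tl, tr = idx[:-1, :-1].ravel(), idx[:-1, 1:].ravel()
    bl, br = idx[1:, :-1].ravel(), idx[1:, 1:].ravel()
    faces = np.concatenate([np.stack([tl, bl, tr], axis=1),
                            np.stack([tr, bl, br], axis=1)])
    return verts, faces
```

The resulting vertex/face arrays are the kind of input a mesh-based differentiable renderer expects; depth discontinuities would in practice need extra handling (e.g. dropping triangles that span large depth gaps).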
