find_homography_dlt - possible improvement and usage
❓ Usage
Hi, Kornia community. Has anyone used find_homography_dlt
as part of an end-to-end network? I am working on an end-to-end image registration project and have tried to incorporate it into the network architecture, without any success so far. I've set a small learning rate (i.e. 1e-6/8/10/12), and after a couple of iterations the weights of my network explode - what's interesting is that the gradients do not explode… I am having a hard time understanding what's wrong here - maybe the whole idea of inserting find_homography_dlt
in an end-to-end fashion is wrong?
Any tips?
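For context, the core of the DLT is differentiable, so the idea itself is sound. Below is a minimal pure-PyTorch sketch of a DLT-style homography solve (an illustration of the technique, not kornia's actual implementation; the function name and shapes are my own) that one can backpropagate through:

```python
import torch

def dlt_homography(pts1: torch.Tensor, pts2: torch.Tensor) -> torch.Tensor:
    """Direct Linear Transform: pts1/pts2 are (B, N, 2) correspondences
    with N >= 4; returns (B, 3, 3) homographies mapping pts1 -> pts2."""
    B, N, _ = pts1.shape
    ones = torch.ones(B, N, 1, dtype=pts1.dtype)
    zeros = torch.zeros(B, N, 3, dtype=pts1.dtype)
    p1 = torch.cat([pts1, ones], dim=-1)           # homogeneous points, (B, N, 3)
    x2 = pts2[..., 0:1]
    y2 = pts2[..., 1:2]
    # Each correspondence contributes two rows of the 2N x 9 system A h = 0.
    ax = torch.cat([zeros, -p1, y2 * p1], dim=-1)  # (B, N, 9)
    ay = torch.cat([p1, zeros, -x2 * p1], dim=-1)  # (B, N, 9)
    A = torch.cat([ax, ay], dim=-1).reshape(B, 2 * N, 9)
    # The solution is the right singular vector of the smallest singular value.
    _, _, Vh = torch.linalg.svd(A)
    return Vh[:, -1].reshape(B, 3, 3)
```

Note that the backward pass of the SVD contains terms proportional to 1 / (s_i² - s_j²), so near-degenerate point configurations can blow up the gradients even when the forward pass looks fine - which is consistent with the exploding-weights symptom described above.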
❓ Possible improvement
I have a couple of possible improvements over the current version; let me start with this one: https://github.com/kornia/kornia/blob/master/kornia/geometry/homography.py#L39 , where we iterate manually over all the points:
```python
w_list = []
axy_list = []
for i in range(points1.shape[1]):
    axy_list.append(ax[:, i])
    axy_list.append(ay[:, i])
    w_list.append(weights[:, i])
    w_list.append(weights[:, i])
A = torch.stack(axy_list, dim=1)
w = torch.stack(w_list, dim=1)
```
Is there any reason why we do this manually? Why don’t we use:
```python
A = torch.cat((ax, ay), dim=-1).reshape(ax.shape[0], -1, ax.shape[-1])
w = weights.repeat(1, 2)
```
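One subtlety worth checking before swapping this in: the loop interleaves the rows (and duplicates each weight next to its pair of rows), while `weights.repeat(1, 2)` would produce `[w_0, …, w_{N-1}, w_0, …, w_{N-1}]`. Here is a quick sketch (with made-up shapes) that verifies the `A` construction and uses `repeat_interleave` so the weight ordering matches the loop exactly:

```python
import torch

B, N = 2, 4
ax = torch.randn(B, N, 9)
ay = torch.randn(B, N, 9)
weights = torch.rand(B, N)

# Current loop-based construction.
axy_list, w_list = [], []
for i in range(N):
    axy_list.append(ax[:, i])
    axy_list.append(ay[:, i])
    w_list.append(weights[:, i])
    w_list.append(weights[:, i])
A_loop = torch.stack(axy_list, dim=1)   # (B, 2N, 9), rows interleaved ax0, ay0, ax1, ...
w_loop = torch.stack(w_list, dim=1)     # (B, 2N), each weight duplicated in place

# Vectorized construction.
A_vec = torch.cat((ax, ay), dim=-1).reshape(ax.shape[0], -1, ax.shape[-1])
w_vec = weights.repeat_interleave(2, dim=1)  # keeps the interleaved ordering
```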
Issue Analytics
- State:
- Created 3 years ago
- Comments: 11 (8 by maintainers)
Top GitHub Comments
@dkoguciuk no - just from the fact that the displacements (== optical flow) in your example were a) negative, so they do not survive ReLU, and b) big (e.g. 8), so they would be scaled down by batchnorm. I am glad that I helped 😃
@edgarriba Because in that particular case it was not that the matrix was somehow singular - it was, in fact, all zeros or all ones. That is why noise helped. Not sure about the general case…
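A short illustration of why such a degenerate system is poisonous here (shapes are made up for the demo): the backward pass of `torch.linalg.svd` divides by differences of squared singular values, so an all-zeros matrix (all singular values equal) makes the gradients of the singular vectors undefined, while a small noise perturbation restores distinct singular values and finite gradients:

```python
import torch

torch.manual_seed(0)

# Degenerate system: all-zero matrix -> all singular values are equal (zero).
# The SVD backward for singular vectors contains 1 / (s_i**2 - s_j**2)
# terms, so gradients through Vh are undefined for such an input.
A_degenerate = torch.zeros(12, 9)

# A small noise perturbation makes the singular values distinct (with
# probability one), so gradients through the SVD become finite.
A_noisy = (A_degenerate + 1e-3 * torch.randn(12, 9)).requires_grad_()
_, _, Vh = torch.linalg.svd(A_noisy)
Vh[-1].sum().backward()          # backprop through the null-space vector, as DLT does
grad_finite = torch.isfinite(A_noisy.grad).all()
```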
Merged in #693