
Dilated layer takes more than `k` neighbours

See original GitHub issue

The `Dilated` layer doesn't take `k` into account when applying the dilation, so it can select more neighbours than intended.

import torch
# Assumed import path for Dilated (from the deep_gcns_torch repo); adjust to your checkout.
from gcn_lib.sparse.torch_edge import Dilated

# Edge index: row 0 holds centre nodes, row 1 their candidate neighbours
# (6 candidates for each of nodes 0 and 1).
t = torch.tensor([
    [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1],
    [0, 1, 2, 3, 4, 5, 0, 1, 2, 3, 4, 5]
])

res = Dilated(k=2, dilation=2)(t)
print(res)  # 3 neighbours per node are taken even though the constructor specified k=2
# tensor([[0, 0, 0, 1, 1, 1],
#         [0, 2, 4, 0, 2, 4]])
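
The printed result is consistent with the layer striding the flattened edge index by `dilation` without first limiting each node to `k` neighbours. A minimal sketch of that presumed behaviour (an illustration, not the library's actual code):

import torch

t = torch.tensor([
    [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1],
    [0, 1, 2, 3, 4, 5, 0, 1, 2, 3, 4, 5]
])

# Striding the whole edge list by dilation=2 keeps candidate indices
# 0, 2, 4 for each node: ceil(6 / 2) = 3 neighbours instead of k = 2.
print(t[:, ::2])
# tensor([[0, 0, 0, 1, 1, 1],
#         [0, 2, 4, 0, 2, 4]])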

Issue Analytics

  • State: open
  • Created: a year ago
  • Comments: 5 (3 by maintainers)

Top GitHub Comments

1 reaction
lightaime commented, Jun 5, 2022

Thanks for the suggestion @zademn. That would definitely be a good idea if we were dealing with a more complex case. But in our example, we always build kNN graphs with k*d neighbours, so to keep it simple we prefer to leave it as it is.
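
In other words, when the kNN graph is built with exactly k*d candidate neighbours per node, striding by d lands on exactly k of them. A quick illustrative check of that invariant:

import torch

k, d = 2, 2
# Each node gets exactly k * d = 4 candidate neighbours.
t = torch.tensor([
    [0, 0, 0, 0, 1, 1, 1, 1],
    [0, 1, 2, 3, 0, 1, 2, 3]
])

print(t[:, ::d])  # stride-d over k*d candidates keeps exactly k per node
# tensor([[0, 0, 1, 1],
#         [0, 2, 0, 2]])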

0 reactions
zademn commented, Jun 5, 2022

A possible solution would be (using einops):

import torch
from einops import rearrange

t = torch.tensor(
    [
        [0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2],
        [0, 1, 2, 3, 4, 0, 1, 2, 3, 4, 0, 1, 2, 3, 4],
    ]
)
k = 2
d = 2

# Infer how many neighbours each node has. This assumes every node has the
# same number of candidates; it could also be passed in as a parameter.
u, counts = torch.unique(t[0], return_counts=True)
k_constructed = int(counts[0])

# Group the flat edge list into one row of candidate neighbours per node.
res1 = rearrange(t, "e (n2 k_constructed) -> e n2 k_constructed", k_constructed=k_constructed)
print(res1)
# tensor([[[0, 0, 0, 0, 0],
#          [1, 1, 1, 1, 1],
#          [2, 2, 2, 2, 2]],

#         [[0, 1, 2, 3, 4],
#          [0, 1, 2, 3, 4],
#          [0, 1, 2, 3, 4]]])

res2 = res1[:, :, ::d]  # dilate: keep every d-th candidate
print(res2)
# tensor([[[0, 0, 0],
#          [1, 1, 1],
#          [2, 2, 2]],

#         [[0, 2, 4],
#          [0, 2, 4],
#          [0, 2, 4]]])

res3 = res2[:, :, :k]  # truncate to the first k neighbours
print(res3)
# tensor([[[0, 0],
#          [1, 1],
#          [2, 2]],

#         [[0, 2],
#          [0, 2],
#          [0, 2]]])

res4 = rearrange(res3, "e d1 d2 -> e (d1 d2)")  # flatten back to an edge list
print(res4)
# tensor([[0, 0, 1, 1, 2, 2],
#         [0, 2, 0, 2, 0, 2]])
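
Note that this assumes the edge list is grouped by centre node and that every node has the same number of candidate neighbours; with ragged neighbourhoods the `rearrange` would fail, which is presumably the "more complex case" the maintainer refers to above.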

Read more comments on GitHub >
