Dilated layer takes more than `k` neighbours
The Dilated layer doesn't take `k` into account. This can lead to taking more neighbours than intended.
t = torch.tensor([
[0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1],
[0, 1, 2, 3, 4, 5, 0, 1, 2, 3, 4, 5]
])
res = Dilated(k=2, dilation=2)(t)
print(res)  # 3 neighbours per node are returned even though the constructor specified k=2
# tensor([[0, 0, 0, 1, 1, 1],
# [0, 2, 4, 0, 2, 4]])
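A minimal sketch of the suspected behaviour, assuming (as the output above suggests) that Dilated's forward simply slices every `dilation`-th edge column and never consults `k` (the helper name `dilated_forward` is hypothetical, not the library's API):

```python
import torch

def dilated_forward(edge_index, dilation):
    # Assumption: take every `dilation`-th edge globally, ignoring k entirely.
    return edge_index[:, ::dilation]

t = torch.tensor([
    [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1],
    [0, 1, 2, 3, 4, 5, 0, 1, 2, 3, 4, 5],
])
res = dilated_forward(t, dilation=2)
print(res)
# tensor([[0, 0, 0, 1, 1, 1],
#         [0, 2, 4, 0, 2, 4]])
```

With 6 neighbours per node and dilation=2 this keeps 3 neighbours per node, reproducing the reported output.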
Issue Analytics
- Created: a year ago
- Comments: 5 (3 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Thanks for the suggestion @zademn. That is definitely a good idea if we are dealing with a more complex case. But in our example, we always build knn graphs with `k*d` neighbors. To keep it simple, we prefer to leave it as it is.

A possible solution would be (using einops):
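The einops snippet itself was not captured in the page scrape. As a rough sketch of the same idea in plain torch (the helper name `dilated_select` is hypothetical; it assumes each node's neighbours are stored contiguously and that the per-node neighbour count is known):

```python
import torch

def dilated_select(edge_index, k, dilation, neighbours_per_node):
    # Reshape to (2, num_nodes, neighbours_per_node), take every
    # `dilation`-th neighbour per node, then cap the result at k.
    e = edge_index.view(2, -1, neighbours_per_node)
    e = e[:, :, ::dilation][:, :, :k]
    return e.reshape(2, -1)

t = torch.tensor([
    [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1],
    [0, 1, 2, 3, 4, 5, 0, 1, 2, 3, 4, 5],
])
out = dilated_select(t, k=2, dilation=2, neighbours_per_node=6)
print(out)
# tensor([[0, 0, 1, 1],
#         [0, 2, 0, 2]])
```

Unlike the global slice, this keeps exactly `k` neighbours per node regardless of how many neighbours the knn graph was built with.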