Why do we need to remove and then add self-loops in GAT?
❓ Questions & Help
When reading the code in nn.conv.gat_conv.py, I found that it first removes self-loops and then adds them back. Why is this necessary?
```python
# Excerpt from torch_geometric/nn/conv/gat_conv.py:
def forward(self, x, edge_index):
    # Drop any self-loops already present, then add exactly one per node.
    edge_index, _ = remove_self_loops(edge_index)
    edge_index = add_self_loops(edge_index, num_nodes=x.size(0))
```
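For illustration, here is a minimal sketch of why the order matters, assuming a recent torch_geometric in which both utilities return an (edge_index, edge_attr) tuple: if the graph already contains a self-loop, calling add_self_loops alone duplicates that edge, whereas remove-then-add guarantees exactly one self-loop per node.

```python
import torch
from torch_geometric.utils import add_self_loops, remove_self_loops

# A two-node graph that already has a self-loop on node 0.
edge_index = torch.tensor([[0, 0, 1],
                           [0, 1, 0]])

# Adding loops naively duplicates the existing (0, 0) edge: 5 edges total.
naive, _ = add_self_loops(edge_index, num_nodes=2)
print(naive.size(1))  # 5

# Removing first yields exactly one self-loop per node: 4 edges total.
cleaned, _ = remove_self_loops(edge_index)
rebuilt, _ = add_self_loops(cleaned, num_nodes=2)
print(rebuilt.size(1))  # 4
```

Duplicated self-loops matter for GAT in particular: attention coefficients are softmax-normalized over a node's incoming edges, so a node with two (i, i) edges would attend to itself twice.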
Top GitHub Comments
Thank you. Added to master.
We do this in case there are already (a few) self-loops in the graph. Normally, `remove_self_loops` is a no-op.
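A quick check of that last claim (again assuming a recent torch_geometric): on a graph with no self-loops, remove_self_loops hands the edge index back unchanged.

```python
import torch
from torch_geometric.utils import remove_self_loops

# No (i, i) edges present, so remove_self_loops changes nothing.
edge_index = torch.tensor([[0, 1],
                           [1, 0]])
cleaned, _ = remove_self_loops(edge_index)
assert torch.equal(cleaned, edge_index)
```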