
Best way to learn an adjacency matrix for a graph?


❓ Questions & Help

Hi,

Apologies if this has already been posted (though I spent a good half an hour trying to find a question like this). I am trying to figure out the best way to learn a parameterisation of a graph, i.e. have a neural net predict, from some input, the nodes, their features, and the adjacency matrix.

I see that many of the graph conv layers take a 2D tensor of edge indices for edge_index, but we would not be able to backprop through this. It seems one would have to either (a) define a fully-connected graph and infer the edge weights instead (where a weight of 0 between nodes (i, j) would effectively simulate the two nodes not being connected), or (b), if it's possible, directly pass in the adjacency matrix as one dense (n, n) matrix (though I assume that matrix can only be binary, so that may also be problematic).
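
To make option (a) concrete, here is roughly what I have in mind (a hypothetical, untested sketch; the dense weight matrix is just a placeholder for whatever a network would predict):

import torch

n = 4  # number of nodes
# fully-connected edge_index: every ordered pair (i, j)
row = torch.arange(n).repeat_interleave(n)
col = torch.arange(n).repeat(n)
edge_index = torch.stack([row, col], dim=0)        # shape [2, n * n]

# placeholder for a network output; weights near 0 would "disconnect" (i, j)
adj = torch.rand(n, n, requires_grad=True)
edge_weight = adj[edge_index[0], edge_index[1]]    # shape [n * n], differentiable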

Any thoughts? Thanks in advance.

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 6 (3 by maintainers)

Top GitHub Comments

5 reactions
rusty1s commented, Jun 24, 2020

Note that we also provide GNNs that can operate on dense input. For example, this is done in the DiffPool model. An alternative way would be to sparsify your dense adjacency matrix based on a user-defined threshold (similar to a ReLU activation):

edge_index = (adj > 0.5).nonzero().t()             # indices of entries above the threshold, shape [2, num_edges]
edge_weight = adj[edge_index[0], edge_index[1]]    # the surviving weights, shape [num_edges] (still differentiable)

If you utilize both edge_index and edge_weight in your follow-up GNN, your graph generation is fully-trainable (except for the values you remove).
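
A minimal sketch of how this might be wired up (assuming a GCNConv layer, which accepts an optional edge_weight argument; the dense adj below is just a stand-in for whatever your model predicts):

import torch
from torch_geometric.nn import GCNConv

x = torch.randn(5, 16)                            # node features, [num_nodes, in_channels]
adj = torch.rand(5, 5, requires_grad=True)        # stand-in for a predicted dense adjacency

# sparsify: keep only entries above the threshold
edge_index = (adj > 0.5).nonzero().t()            # [2, num_edges]
edge_weight = adj[edge_index[0], edge_index[1]]   # [num_edges]

conv = GCNConv(16, 32)
out = conv(x, edge_index, edge_weight)            # gradients flow back into adj via edge_weight

As noted above, the thresholding itself is not differentiable, so only the surviving edge weights receive gradients.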

3 reactions
christopher-beckham commented, Jun 23, 2020

Hi,

Thanks for your response!

In my case, I’d want to use the inferred outputs in a downstream manner (i.e., both the nodes’ features and the adjacency matrix) and have that all be backproppable, e.g.:

input -> [mlp] -> {X, E} -> [GNNs] -> output

where E is the adjacency matrix and X are the node features. I assume, however, that E needs to be sparse in order to work with the GNNs later in the network.

In the case of the autoencoder, its output (a dense adjacency matrix) just happens to also be the end of the network, which is convenient. In my case, it still seems like the most plausible option would be to fix the adjacency matrix so that the graph is fully-connected, and instead have the network infer the edge weights. Let me know if you agree with this line of thinking. A rough sketch of that pipeline follows below.
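
In code, I imagine something roughly like this (an untested sketch; GCNConv is only a stand-in for whatever downstream GNN layers are used, and the layer sizes are arbitrary):

import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv

class GraphPredictor(nn.Module):
    """input -> [mlp] -> {X, E} -> [GNN] -> output, with a fixed fully-connected graph."""

    def __init__(self, in_dim, n_nodes, node_dim, hidden_dim, out_dim):
        super().__init__()
        self.n = n_nodes
        self.node_mlp = nn.Linear(in_dim, n_nodes * node_dim)  # predicts X
        self.edge_mlp = nn.Linear(in_dim, n_nodes * n_nodes)   # predicts E as soft edge weights
        self.gnn = GCNConv(node_dim, hidden_dim)
        self.readout = nn.Linear(hidden_dim, out_dim)

        # fixed fully-connected edge_index over n_nodes
        row = torch.arange(n_nodes).repeat_interleave(n_nodes)
        col = torch.arange(n_nodes).repeat(n_nodes)
        self.register_buffer("edge_index", torch.stack([row, col], dim=0))

    def forward(self, z):
        # z: a single input vector of size in_dim (no batching, to keep the sketch short)
        x = self.node_mlp(z).view(self.n, -1)                   # [n, node_dim]
        edge_weight = torch.sigmoid(self.edge_mlp(z)).view(-1)  # [n * n], soft "adjacency"
        h = self.gnn(x, self.edge_index, edge_weight).relu()
        return self.readout(h.mean(dim=0))                      # graph-level output

model = GraphPredictor(in_dim=8, n_nodes=5, node_dim=16, hidden_dim=32, out_dim=1)
out = model(torch.randn(8))  # gradients reach both the node MLP and the edge MLP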

Thanks again!
