
MessagePassing.propagate() computes the contraction with the transpose

See original GitHub issue

In short: if (edge_index, norm) represents the matrix M, in the sense that M = to_scipy_sparse_matrix(edge_index, norm), then MessagePassing.propagate(edge_index, x=x, norm=norm) computes M.T @ x, so M is implicitly transposed in the process.

Example

import numpy as np
import scipy.sparse
import torch
import torch_geometric

# Create matrix M and vector x
row = np.array([0, 0])
col = np.array([1, 2])
norm = np.array([1, 1])
num_nodes = 3
x = np.array([[2, 3, 4], [5, 6, 7]]).T

# Compute M @ x and M.T @ x explicitly
M = scipy.sparse.coo_matrix((norm, (row, col)), shape=(num_nodes, num_nodes))
print("M:\n", M.toarray())
print("x:\n", x)
print("M @ x:\n", M @ x)
print("M.T @ x:\n", M.T @ x)

# Compute MessagePassing.propagate (default flow='source_to_target')
conv = torch_geometric.nn.MessagePassing()
edge_index = torch.tensor(np.array([row, col]))
conv_result = conv.propagate(edge_index=edge_index,
                             x=torch.tensor(x),
                             norm=norm)
print("conv:")
print(conv_result)

output:

M:
 [[0 1 1]
 [0 0 0]
 [0 0 0]]
x:
 [[2 5]
 [3 6]
 [4 7]]
M @ x:
 [[ 7 13]
 [ 0  0]
 [ 0  0]]
M.T @ x:
 [[0 0]
 [2 5]
 [2 5]]
conv:
tensor([[0, 0],
        [2, 5],
        [2, 5]])

So we see that M.T @ x is computed.

Comments: This is probably the intended behavior, but at least for me it was not very intuitive, which is why I wanted to mention it. If I understand correctly, it works this way because in PyG edge_index = (row, col) = (source, target), while for M @ x one would need row = target and col = source.

While the transposition of M can be called a convention, and would not matter for a symmetric M, it can be a source of confusion for directed graphs.
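One way to check this with the directed example above, using only scipy (no PyG required): swapping row and col in the edge list builds exactly M.T, so calling propagate on the swapped edge_index would aggregate with (M.T).T = M. The helper `to_coo` below is a hand-rolled stand-in for `torch_geometric.utils.to_scipy_sparse_matrix`, not PyG's actual function.

```python
import numpy as np
import scipy.sparse

# Hand-rolled stand-in for torch_geometric.utils.to_scipy_sparse_matrix,
# so the point can be verified without PyG installed.
def to_coo(row, col, vals, n):
    return scipy.sparse.coo_matrix((vals, (row, col)), shape=(n, n))

row = np.array([0, 0])
col = np.array([1, 2])
norm = np.array([1, 1])
x = np.array([[2, 3, 4], [5, 6, 7]]).T

M = to_coo(row, col, norm, 3)
M_swapped = to_coo(col, row, norm, 3)   # row and col exchanged

# Exchanging row and col is exactly a transpose ...
assert (M_swapped.toarray() == M.T.toarray()).all()

# ... so propagate on the swapped edge_index computes
# M_swapped.T @ x = (M.T).T @ x = M @ x
print(np.asarray(M_swapped.T @ x))
```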

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 12 (6 by maintainers)

Top GitHub Comments

1 reaction
rusty1s commented, Jul 22, 2019

That is correct. One can change this by setting flow='target_to_source'. I think this is very intuitive if you think about the graph connectivity as a set of (source, target) tuples. A lot of papers also define A_ji = 1 if there is an edge from i to j, which is IMO much more confusing. I tried my best to make this clear in the message passing documentation.
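To make the flow argument concrete, here is a minimal numpy sketch of the sum-aggregation that propagate performs under the two flow settings. The function name `propagate_like` is made up for this illustration; this is not PyG's implementation, just the aggregation pattern it describes.

```python
import numpy as np

def propagate_like(edge_index, x, norm, flow="source_to_target"):
    # Minimal sketch of sum-aggregation message passing.
    # `propagate_like` is a made-up name, not a PyG API.
    row, col = edge_index
    if flow == "source_to_target":
        src, dst = row, col          # messages travel row -> col
    else:                            # flow == "target_to_source"
        src, dst = col, row          # messages travel col -> row
    out = np.zeros_like(x, dtype=float)
    # accumulate norm-weighted source features at the destination nodes
    np.add.at(out, dst, norm[:, None] * x[src])
    return out

edge_index = np.array([[0, 0], [1, 2]])  # (row, col) = (source, target)
norm = np.array([1, 1])
x = np.array([[2, 3, 4], [5, 6, 7]]).T

print(propagate_like(edge_index, x, norm))                      # M.T @ x
print(propagate_like(edge_index, x, norm, "target_to_source"))  # M @ x
```

With the default flow, the result matches the M.T @ x output from the issue; flipping the flow recovers M @ x, which is the behavior rusty1s describes.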

0 reactions
rusty1s commented, Jan 24, 2022

Sure.
