Graph convolution in mini-batch (manually defined graphs)
❓ Questions & Help
I’d like to perform graph convolution on “manually”-defined graphs in a mini-batch manner. I’ve read the documentation (e.g., here and here), but I haven’t figured it out.
A quick example: let’s say that we need to define a batch of graphs with the same structure, i.e., with the same edge_index, but with different feature signals and different edge attributes.
For instance, let’s define a simple directed graph structure with the following edge_index:
import torch
from torch_geometric.data import Data as gData
import torch_geometric.nn as gnn
import numpy as np
num_nodes = 7
num_node_features = 16
edge_index = torch.tensor(np.concatenate([np.arange(num_nodes), np.roll(np.arange(num_nodes), shift=1)]).reshape(-1, num_nodes), dtype=torch.long)
edge_index
tensor([[0, 1, 2, 3, 4, 5, 6],
[6, 0, 1, 2, 3, 4, 5]])
Now, let’s define a simple graph convolution operator, e.g., GCNConv, that will act on such graphs:
gconv = gnn.GCNConv(in_channels=num_node_features, out_channels=32)
Then, if I define a graph signal as below:
x = torch.randn((num_nodes, num_node_features), dtype=torch.float)
print(x.size())
torch.Size([7, 16])
and pass it through gconv, I have:
y = gconv(x, edge_index)
print(y.size())
torch.Size([7, 32])
which is fine.
Now, I’d like to do the same in a mini-batch manner; i.e., to define a batch of such signals that, along with the same edge_index, will be passed through gconv.
It seems that this could somehow be done using batch, mentioned here, but I cannot find any reference on how this could be done.
The “problem” is that I need to dynamically define my graphs during training, though they will all have the same topology (in the sense of edge_index). What will change and be updated are the features and edge attributes (each graph in the batch will have a signal x with shape [num_nodes, num_node_features] and edge attributes with shape [num_edges, num_edge_features]).
Thanks for your time.
Top GitHub Comments
You have two options here: (1) replicating your edge_index by stacking the graphs diagonally, i.e., offsetting each copy of edge_index by a multiple of num_nodes, or (2) using the node_dim property of message passing operators. I will try to explain this here in more detail.
@rusty1s thank you for the clarification. Below, I have a simple example of using NNConv in batch mode, where edge_index is fixed (all the graphs have the same topology – specifically, they are complete graphs without self-loops) and both node and edge features are passed through the graph convolution layer (again, in batch mode). It’d be nice if you could confirm that’s how it should be done, though of course I’m not asking you to.
A final question would be how the output of the convolution (i.e., y) could be wrapped into a batch again (in this case apparently without edge_attr, but only with the output features and edge_index, which remains the same).
Many thanks again.