How to combine data batch with GCN?
🐛 Bug
Hi! Thanks for your amazing framework. Recently I have been trying to use GCN to extract spatial features from node features in a batched way, but I get the following error:
ValueError: `MessagePassing.propagate` only supports `torch.LongTensor` of shape `[2, num_messages]` or `torch_sparse.SparseTensor` for argument `edge_index`.
The code I am using: edge_index and edge_weight are built as follows:
graph_edges = torch.tensor(graph_edges, dtype=torch.long, device="cuda:0").t().contiguous()
# replicate the shared topology once per sample in the batch
graph_edges_b = torch.stack([graph_edges for x in range(location_t.shape[0])], dim=0)
print("graph_weight", graph_weight.shape)    # [16, 171]
print("graph_edges_b", graph_edges_b.shape)  # [16, 2, 171]
print("feature_t", feature_t.shape)          # [16, 19, 512]
Then I call:
enc_feature = self.g_conv1(feature_t, graph_edges_b, graph_weight)
which fails with the same ValueError quoted above.
For more details: the batch size is 16, and every graph is fully connected, but the edge weights differ from graph to graph.
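For anyone hitting the same error: GCNConv's propagate step only accepts a single edge_index of shape [2, num_edges], so a stacked [batch, 2, num_edges] tensor is rejected. The usual PyG fix is to merge the batch into one large disconnected graph. A minimal sketch, assuming the shapes from this issue (the random inputs and the 256 output size are placeholders, not the poster's actual data):

import torch
from torch_geometric.data import Batch, Data
from torch_geometric.nn import GCNConv

# Shapes from the question: 16 graphs, 19 nodes each, 171 weighted edges,
# 512-dim node features.
B, N, E, F = 16, 19, 171, 512
feature_t = torch.randn(B, N, F)
graph_edges = torch.randint(0, N, (2, E), dtype=torch.long)  # shared topology
graph_weight = torch.rand(B, E)                              # per-graph weights

# Wrap each graph in a Data object; Batch merges them into one big
# disconnected graph and offsets edge_index automatically.
data_list = [
    Data(x=feature_t[b], edge_index=graph_edges, edge_weight=graph_weight[b])
    for b in range(B)
]
batch = Batch.from_data_list(data_list)

conv = GCNConv(F, 256)
out = conv(batch.x, batch.edge_index, batch.edge_weight)  # [16 * 19, 256]
out = out.view(B, N, -1)                                  # back to [16, 19, 256]

Because every graph contributes exactly N = 19 nodes, the flat output can be reshaped back into the original batch layout at the end.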
Issue Analytics
- Created 2 years ago
- Comments: 9 (4 by maintainers)
Top Results From Across the Web
Graph Classification & Batchwise Training · Issue #4 · tkipf/gcn
I have modified the graphconv layer to dense matrix to work with parallel data loader. And the "ind" (size: N*batch, values are normalized...

Advanced Mini-Batching — pytorch_geometric documentation
In its most general form, the PyG DataLoader will automatically increment the edge_index tensor by the cumulated number of nodes of all graphs... (see the sketch after this list)

6.1 Training GNN for Node Classification with Neighborhood ...
To use a sampler provided by DGL, one also needs to combine it with DataLoader, which iterates over a set of indices...

GCN with Neo4j and PyTorch Using MUTAG Dataset in Ten ...
I shuffled and split the original MUTAG dataset into a train and test set. Then I created data loaders for each set with...

Simple scalable graph neural networks | by Michael Bronstein
In graph-sampling approaches, for each batch, a subgraph of the original graph is sampled, and a full GCN-like model is run on the...
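As a quick, self-contained illustration of the edge_index incrementing described in the Advanced Mini-Batching entry above (the toy graphs and feature sizes are made up for the demo):

import torch
from torch_geometric.data import Data
from torch_geometric.loader import DataLoader  # torch_geometric.data.DataLoader in older PyG

# Two tiny 3-node graphs collated into one mini-batch.
edge_index = torch.tensor([[0, 1, 2], [1, 2, 0]])
dataset = [Data(x=torch.randn(3, 8), edge_index=edge_index) for _ in range(2)]

loader = DataLoader(dataset, batch_size=2)
batch = next(iter(loader))
print(batch.x.shape)     # torch.Size([6, 8])  -- node features concatenated
print(batch.edge_index)  # second graph's indices shifted by 3
print(batch.batch)       # tensor([0, 0, 0, 1, 1, 1]) maps each node to its graph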
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
You can concatenate features as follows to avoid the for-loop:
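(The original snippet was not preserved on this page. A minimal sketch of what a loop-free concatenation could look like for the shapes in the question, with placeholder random inputs:)

import torch

# Shapes from the question: 16 graphs, 19 nodes, 171 edges, 512-dim features.
B, N, E, F = 16, 19, 171, 512
graph_edges = torch.randint(0, N, (2, E))  # shared [2, 171] topology
graph_weight = torch.rand(B, E)            # per-graph weights [16, 171]
feature_t = torch.randn(B, N, F)           # [16, 19, 512]

# Tile the topology B times and shift each copy into its own node-index
# range -- no Python loop, and edge_index stays 2-dimensional.
offset = (torch.arange(B) * N).repeat_interleave(E)  # [B * E]
edge_index = graph_edges.repeat(1, B) + offset       # [2, B * E]
edge_weight = graph_weight.reshape(-1)               # [B * E]
x = feature_t.reshape(B * N, F)                      # [B * N, 512]

# A single layer call now sees one big disconnected graph, e.g.:
# enc_feature = self.g_conv1(x, edge_index, edge_weight)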
I see! Thanks a lot! Hope you have a good day!