'RandomLinkSplit' object has no attribute 'num_features'
❓ Questions & Help
I created my heterogeneous graph following the steps from LOADING GRAPHS FROM CSV, and now I'm trying to split the data:
data.train_mask = data.val_mask = data.test_mask = None
data = T.RandomLinkSplit(data)
In the latest version of PyG, train_test_split_edges warns that it is deprecated, so I used RandomLinkSplit instead. When I execute this line:
encoder = VEncoder(data.num_features, out_channels=latent_size)
It returns:
AttributeError: 'RandomLinkSplit' object has no attribute 'num_features'
What's the equivalent of num_features? I couldn't find anything about this in the docs.
Issue Analytics
- State:
- Created 2 years ago
- Comments: 11 (11 by maintainers)
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Yes, you're right, it is just like the behaviour in deepsnap (I just double-checked). I think your interface is a bit clearer, because in deepsnap you first call
and then
which could suggest to an unsuspecting user that 80% of edges are reserved for message passing and 20% are supervision edges, and that those supervision edges are then split using the [0.8, 0.1, 0.1] ratio, yielding something like 80 message-passing edges, 16 training supervision edges, 2 validation edges, and 2 test edges. But this is not the case; the behaviour is as you implemented here.
I think the fact that you handle it all in one function avoids counter-intuitive hints like this and is much clearer.
Thanks again! 😃
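The edge-count arithmetic in the comment above can be sketched in plain Python. This is only a back-of-the-envelope illustration of the two readings being contrasted; the 100-edge total, the [0.8, 0.1, 0.1] split, and the 0.8 message-passing ratio are the numbers from the comment, not values taken from either library's defaults:

```python
total_edges = 100
split = [0.8, 0.1, 0.1]   # train / val / test ratio
message_ratio = 0.8       # share reserved for message passing

# Misreading described in the comment: first carve out message-passing
# edges, then apply the split ratio to the remaining supervision edges.
mp_edges = int(message_ratio * total_edges)       # 80
supervision = total_edges - mp_edges              # 20
wrong = [int(r * supervision) for r in split]     # [16, 2, 2]

# Behaviour as actually implemented: the split ratio applies to all
# edges up front; message-passing vs supervision edges are decided
# within the training portion afterwards.
val_edges = int(split[1] * total_edges)           # 10
test_edges = int(split[2] * total_edges)          # 10
train_edges = total_edges - val_edges - test_edges  # 80

print(wrong, train_edges, val_edges, test_edges)
```

The misreading would leave only 2 validation and 2 test edges out of 100, while the implemented behaviour keeps the full 10/10 held-out sets, which is why the one-function interface is less ambiguous.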
This is correct. I think this is similar to how DeepSNAP determines the train ratio of message passing edges and supervision edges. Let me know if this is the correct behavior in your opinion 😃