Raise error "IndexError: index out of range in self" using NeighborLoader
Describe the bug
I used NeighborLoader to sample my graph data. Even though the data looks proper, it raises this error. Usage:
from torch_geometric.loader import NeighborLoader

loader = NeighborLoader(
    processed_pyg_data,
    batch_size=self.args.batch_size,
    num_neighbors=[16, 32],  # 16 neighbors in the 1st hop, 32 in the 2nd
)
I think my dataset is clean and unproblematic, because I have used it with NeighborLoader in other projects on an M1 (Apple Silicon) machine. On Ubuntu, however, it raises the error below. What's wrong? Please help.
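In practice, this error usually means edge_index refers to node indices that do not exist in the node tensors being sliced. A minimal sanity check along those lines, assuming processed_pyg_data is a standard torch_geometric.data.Data object (validate() is available in recent PyG releases):

```python
data = processed_pyg_data  # assumed: a torch_geometric.data.Data object

if data.edge_index.numel() > 0:
    # Every edge endpoint must be a valid node index in [0, num_nodes).
    assert data.edge_index.min() >= 0, "edge_index has negative indices"
    assert data.edge_index.max() < data.num_nodes, "edge_index exceeds num_nodes"

# Recent PyG versions also ship a built-in consistency check.
data.validate(raise_on_error=True)
```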
File "/home/ubuntu/anaconda3/envs/kisa-test/lib/python3.8/site-packages/torch_geometric/loader/base.py", line 36, in __next__
return self.transform_fn(next(self.iterator))
File "/home/ubuntu/anaconda3/envs/kisa-test/lib/python3.8/site-packages/torch_geometric/loader/neighbor_loader.py", line 405, in filter_fn
data = filter_data(self.data, node, row, col, edge,
File "/home/ubuntu/anaconda3/envs/kisa-test/lib/python3.8/site-packages/torch_geometric/loader/utils.py", line 158, in filter_data
filter_node_store_(data._store, out._store, node)
File "/home/ubuntu/anaconda3/envs/kisa-test/lib/python3.8/site-packages/torch_geometric/loader/utils.py", line 111, in filter_node_store_
out_store[key] = index_select(value, index, dim=dim)
File "/home/ubuntu/anaconda3/envs/kisa-test/lib/python3.8/site-packages/torch_geometric/loader/utils.py", line 26, in index_select
return torch.index_select(value, dim, index, out=out)
IndexError: index out of range in self
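For context, the failing call at the bottom of the traceback is a plain torch.index_select, which raises exactly this message when the index tensor contains a value outside the size of the selected dimension. A tiny standalone reproduction:

```python
import torch

x = torch.randn(4, 8)          # e.g. features for 4 nodes
idx = torch.tensor([0, 2, 5])  # 5 is out of range for dim 0 (size 4)
torch.index_select(x, 0, idx)  # IndexError: index out of range in self
```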
Environment
- PyG version: 2.0.4
- PyTorch version: 1.12.0
- OS: Ubuntu
- Python version: 3.8
- CUDA/cuDNN version: None (CPU)
- How you installed PyTorch and PyG (conda, pip, source): pip
- Any other relevant information (e.g., version of torch-scatter): torch-cluster 1.6.0, torch-geometric 2.1.0.post1, torch-scatter 2.0.9, torch-sparse 0.6.15, torch-spline-conv 1.2.1
Top GitHub Comments
I'm wondering if we should add some better error messages around these kinds of common problems (I've seen a question similar to this quite a few times).
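One hypothetical shape such a check could take (a sketch, not PyG's actual code): fail fast with a descriptive message before sampling ever reaches torch.index_select. The helper name check_edge_index is made up for illustration.

```python
from torch_geometric.data import Data

def check_edge_index(data: Data) -> None:
    # Hypothetical helper, not part of PyG: surface a clear error instead of
    # a bare "index out of range in self" deep inside the sampling loop.
    if data.edge_index.numel() == 0:
        return
    lo, hi = int(data.edge_index.min()), int(data.edge_index.max())
    if lo < 0 or hi >= data.num_nodes:
        raise ValueError(
            f"'edge_index' refers to node indices in [{lo}, {hi}], but the "
            f"graph only has {data.num_nodes} nodes; the data may be corrupted.")
```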
OK, I'll give an example. The "damage" happens in the case below:
[STEP 1] Preprocess and save (environment: Mac M1 Silicon). I save the processed data using PyG and load it in step 2.
[STEP 2] Load the saved data (environment: Ubuntu 18.04).
The damage happens in STEP 2. When I inspected the loaded data, edge_index was damaged (e.g., some elements had changed to negative values). I currently cannot find the cause of this error; if I identify it, I'll share it.
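A quick way to confirm this kind of corruption, assuming the data travels between machines via torch.save/torch.load (the thread does not state the exact serialization path):

```python
import torch

# STEP 1 (Mac M1): persist the processed Data object.
torch.save(processed_pyg_data, 'processed.pt')

# STEP 2 (Ubuntu): reload and verify edge_index survived the round trip.
loaded = torch.load('processed.pt', map_location='cpu')
assert (loaded.edge_index >= 0).all(), "negative indices after load"
assert loaded.edge_index.max() < loaded.num_nodes, "dangling edge endpoints"
```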