
patch_loader cannot iterate

See original GitHub issue

Is there an existing issue for this?

  • I have searched the existing issues

Bug summary

I want to use a fixed patch size for model inference, with the data organized into subjects as described in the tutorial. After the DataLoader loads the patches, I want to iterate over them in a for loop for inference. However, iterating over the patch_loader always raises an error. The error message is: raise FileNotFoundError(f'File not found: "{path}"') → FileNotFoundError: File not found: "affine"
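For context, the intended workflow (fixed-size patches sampled over a grid, iterated with a DataLoader, then re-assembled) roughly corresponds to the minimal sketch below, using TorchIO's documented GridSampler/GridAggregator API. The synthetic 1×96×192×192 tensor is only a stand-in for the user's NIfTI files, and no model is applied.

import torch
import torchio as tio

# Build one Subject around a synthetic 4D tensor (channels, W, H, D);
# a real NIfTI path would normally go here instead.
subject = tio.Subject(source=tio.ScalarImage(tensor=torch.rand(1, 96, 192, 192)))

# Sample fixed-size patches over a regular grid and iterate over them.
grid_sampler = tio.inference.GridSampler(
    subject,
    patch_size=(80, 160, 160),
    patch_overlap=4,
)
patch_loader = torch.utils.data.DataLoader(grid_sampler, batch_size=2)
aggregator = tio.inference.GridAggregator(grid_sampler)

with torch.no_grad():
    for patches_batch in patch_loader:
        input_tensor = patches_batch['source'][tio.DATA]
        locations = patches_batch[tio.LOCATION]
        # A trained model would be applied here; the patches are passed through unchanged.
        aggregator.add_batch(input_tensor, locations)

output_tensor = aggregator.get_output_tensor()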

Code for reproduction

# Note: Trans_Unet, subfiles and opt come from the reporter's project and are not shown here.
import logging
import os
import sys

import torch
import torchio as tio
from torch.utils.data import Dataset


class trochio_test_dataset(Dataset):
    def __init__(self, data_list):
        self.test_images = data_list
        print(self.test_images)
        self.patch_size = (80, 160, 160)
        self.volume = 4, 4, 4
        self.subject = []
        # Wrap every input image in a TorchIO Subject
        for i, image_path in enumerate(list(self.test_images)):
            print("data_process" + str(i))
            print(image_path)
            subject = tio.Subject(source=tio.data.ScalarImage(image_path))
            self.subject.append(subject)
        self.subject_dataset = tio.SubjectsDataset(self.subject)


def predict(opt):
    # Determine the input type: files or folders
    input_type = 0
    if os.path.isfile(opt.data[0]):
        pre_input = opt.data
        input_type = 1
    elif os.path.isdir(opt.data[0]):
        pre_input = opt.data
        input_type = 2
    else:
        logging.info("Please check your input: only files (.nii, .dicom, .mhd) or folders are accepted")
        sys.exit(0)
    print(input_type)
    # Load model
    model = Trans_Unet(c1=1, c2=8)
    device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
    # logging.info(f"Loading model {opt.weights}")
    logging.info(f"Using device {device}")
    model.to(device=device)

    logging.info("Model loaded!")
    """
    # Load weights
    if torch.cuda.is_available():
        model = torch.load(opt.weights, map_location=torch.device("cuda"))
    else:
        model = torch.load(opt.weights, map_location=torch.device("cpu"))
    model.load_state_dict(model)
    """
    # Set eval mode
    model.eval()
    # Collect input files
    if input_type == 1:
        input_files = pre_input
    elif input_type == 2:
        input_files = []
        for f in range(len(pre_input)):
            print(f"input folders as follows: {pre_input[f]}")
            input_file = subfiles(pre_input[f], suffix='nii.gz')  # [pre_input[f]]
            input_files.extend(input_file)
    # Create subject dataset
    subject_dataset = trochio_test_dataset(input_files).subject_dataset
    print(f"subject dataset {subject_dataset}")
    # Iterate over subjects
    for i in range(len(subject_dataset)):
        grid_sampler = tio.inference.GridSampler(
            subject_dataset[i],
            patch_size=(80, 160, 160),
            patch_overlap=4,
        )
        patch_loader = torch.utils.data.DataLoader(grid_sampler, batch_size=8)
        print(grid_sampler.subject)
        aggregator = tio.inference.GridAggregator(grid_sampler)
        print(aggregator.patch_overlap)
        with torch.no_grad():
            for patches_batch in patch_loader:  # <-- the error is raised here
                print(1111111111)  # debug marker
                input_tensor = patches_batch['source'][tio.DATA]
                # Test only: the model is not applied yet
                print(input_tensor)
                locations = patches_batch[tio.LOCATION]
                logging.info("OK")
                aggregator.add_batch(input_tensor, locations)
        output_tensor = aggregator.get_output_tensor()

Actual outcome

FileNotFoundError: File not found: “affine”:

Error messages

Traceback (most recent call last):
  File "D:/xxxx/MODELS/Li_Project/test.py", line 127, in <module>
    main(opt)
  File "D:/xxxx/MODELS/Li_Project/test.py", line 117, in main
    predict(opt)
  File "D:/xxxx/MODELS/Li_Project/test.py", line 82, in predict
    for patches_batch in patch_loader:
  File "D:\Anaconda3\install\envs\conda\lib\site-packages\torch\utils\data\dataloader.py", line 530, in __next__
    data = self._next_data()
  File "D:\Anaconda3\install\envs\conda\lib\site-packages\torch\utils\data\dataloader.py", line 570, in _next_data
    data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
  File "D:\Anaconda3\install\envs\conda\lib\site-packages\torch\utils\data\_utils\fetch.py", line 52, in fetch
    return self.collate_fn(data)
  File "D:\Anaconda3\install\envs\conda\lib\site-packages\torch\utils\data\_utils\collate.py", line 157, in default_collate
    return elem_type({key: default_collate([d[key] for d in batch]) for key in elem})
  File "D:\Anaconda3\install\envs\conda\lib\site-packages\torch\utils\data\_utils\collate.py", line 157, in <dictcomp>
    return elem_type({key: default_collate([d[key] for d in batch]) for key in elem})
  File "D:\Anaconda3\install\envs\conda\lib\site-packages\torch\utils\data\_utils\collate.py", line 157, in default_collate
    return elem_type({key: default_collate([d[key] for d in batch]) for key in elem})
  File "D:\Anaconda3\install\envs\conda\lib\site-packages\torchio\data\image.py", line 746, in __init__
    super().__init__(*args, **kwargs)
  File "D:\Anaconda3\install\envs\conda\lib\site-packages\torchio\data\image.py", line 161, in __init__
    self.path = self._parse_path(path)
  File "D:\Anaconda3\install\envs\conda\lib\site-packages\torchio\data\image.py", line 428, in _parse_path
    return [self._parse_single_path(p) for p in path]
  File "D:\Anaconda3\install\envs\conda\lib\site-packages\torchio\data\image.py", line 428, in <listcomp>
    return [self._parse_single_path(p) for p in path]
  File "D:\Anaconda3\install\envs\conda\lib\site-packages\torchio\data\image.py", line 418, in _parse_single_path
    raise FileNotFoundError(f'File not found: "{path}"')
FileNotFoundError: File not found: "affine":
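What the traceback appears to show: during default_collate, torch rebuilds the batched dictionaries with their original class (here a TorchIO Image), and the Image constructor ends up receiving the string "affine" where a file path is expected, so torchio's path check fails. That failing check can be reproduced in isolation with any string that is not an existing file; the call below is only an illustration, not part of the reported code.

import torchio as tio

# Passing a non-existent path string to an Image triggers the same check
# that fails inside default_collate in the traceback above.
tio.ScalarImage('affine')  # raises FileNotFoundError: File not found: "affine"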

Expected outcome

I want to be able to run patch-based inference.

System info

No response

Issue Analytics

  • State: closed
  • Created a year ago
  • Comments: 7 (4 by maintainers)

Top GitHub Comments

1 reaction
LucaLumetti commented, Jun 17, 2022

Hello, this is difficult to reproduce, and the "File not found" error should be a path problem on your side… Can you reproduce it with a minimal example involving a torchio dataset?

Hello, please test this example:

import torch
import torch.nn as nn
import torchio as tio

patch_overlap = 4, 4, 4  # or just 4
patch_size = 88, 88, 60
subject = tio.datasets.Colin27()
grid_sampler = tio.inference.GridSampler(
    subject,
    patch_size,
    patch_overlap,
)
patch_loader = torch.utils.data.DataLoader(grid_sampler, batch_size=4)
aggregator = tio.inference.GridAggregator(grid_sampler)
model = nn.Identity().eval()
with torch.no_grad():
    for patches_batch in patch_loader:
        input_tensor = patches_batch['t1'][tio.DATA]
        locations = patches_batch[tio.LOCATION]
        logits = model(input_tensor)
        labels = logits.argmax(dim=tio.CHANNELS_DIMENSION, keepdim=True)
        outputs = labels
        aggregator.add_batch(outputs, locations)
output_tensor = aggregator.get_output_tensor()

I cannot reproduce this bug either, even with your code. Can you paste the error message that this code outputs for you?

0 reactions
fepegar commented, Jun 18, 2022

Hi, @boundaryT. It is hard for us to help you if you don’t provide a minimal reproducible example, as requested in the issue template.

I don’t know what opt or subfiles are in your code, but they might be generating wrong filenames. If you are able to provide an example we can reproduce, feel free to open a new issue.
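A quick way to confirm that suspicion is to validate the file list before any Subject is built. A minimal sketch follows, assuming input_files is the list produced by the reporter's subfiles/opt logic (those helpers are not shown in the issue); the function name is only illustrative.

import os

# Fail early with a readable message if any collected path does not exist,
# instead of letting the error surface later inside the DataLoader.
def check_input_files(input_files):
    missing = [p for p in input_files if not os.path.isfile(str(p))]
    if missing:
        raise FileNotFoundError(f"Inputs not found on disk: {missing}")
    return input_files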
