Issue with DataLoader with lr_finder.range_test
I'm trying to use:
```python
from torch_lr_finder import TrainDataLoaderIter

class CustomTrainIter(TrainDataLoaderIter):
    def inputs_labels_from_batch(self, batch_data):
        return batch_data["img"], batch_data["target"]
```
to make my `DataLoader` work with `lr_finder.range_test()`, but I still got the error `TypeError: list indices must be integers or slices, not str`:
```
TypeError                                 Traceback (most recent call last)
<ipython-input-60-b2a8b27d6c88> in <module>()
      3 optim = torch.optim.Adam(model_ft.parameters(), lr=1e-7, weight_decay=1e-2)
      4 lr_finder = LRFinder(model_ft, optim, criterion, device='cuda')
----> 5 lr_finder.range_test(custom_train_iter, end_lr=100, num_iter=100)
      6 lr_finder.plot()
      7 lr_finder.reset()

3 frames
/usr/local/lib/python3.7/dist-packages/torch_lr_finder/lr_finder.py in range_test(self, train_loader, val_loader, start_lr, end_lr, num_iter, step_mode, smooth_f, diverge_th, accumulation_steps, non_blocking_transfer)
    318                 train_iter,
    319                 accumulation_steps,
--> 320                 non_blocking_transfer=non_blocking_transfer,
    321             )
    322             if val_loader:

/usr/local/lib/python3.7/dist-packages/torch_lr_finder/lr_finder.py in _train_batch(self, train_iter, accumulation_steps, non_blocking_transfer)
    369         self.optimizer.zero_grad()
    370         for i in range(accumulation_steps):
--> 371             inputs, labels = next(train_iter)
    372             inputs, labels = self._move_to_device(
    373                 inputs, labels, non_blocking=non_blocking_transfer

/usr/local/lib/python3.7/dist-packages/torch_lr_finder/lr_finder.py in __next__(self)
     57         try:
     58             batch = next(self._iterator)
---> 59             inputs, labels = self.inputs_labels_from_batch(batch)
     60         except StopIteration:
     61             if not self.auto_reset:

<ipython-input-58-f89d28995874> in inputs_labels_from_batch(self, batch_data)
      4 
      5 
----> 6         return batch_data["img"], batch_data["target"]
      7 
      8 custom_train_iter = CustomTrainIter(train_dl)

TypeError: list indices must be integers or slices, not str
```
Any suggestions? Thanks!
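For context, the error message itself points at the cause: the collated batch is a list, not a dict, so string indexing fails. This is what PyTorch's default `collate_fn` produces when a dataset's `__getitem__` returns a tuple. A minimal sketch reproducing the failure mode (using `TensorDataset` purely for illustration):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# TensorDataset.__getitem__ returns a tuple, and the default collate_fn
# turns a batch of tuples into a list of stacked tensors.
ds = TensorDataset(torch.randn(4, 3), torch.randint(0, 2, (4,)))
batch = next(iter(DataLoader(ds, batch_size=2)))
print(type(batch))  # <class 'list'>
batch["img"]        # TypeError: list indices must be integers or slices, not str
```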
Top GitHub Comments
My dataset is like the first one you mentioned.

It works perfectly now, I appreciate your help! Is it like the abstract class concept? I have only been using Python for a few months, for a deep learning project.
My pleasure 😃
It's simply class inheritance here. The reason why we have to wrap a `torch.Dataset` or `torch.DataLoader` with a `DataLoaderIter` is that we need to make it do two things.

First, it has to be flexible for customization under the restriction of a fixed input format. As you can see in the following code, most PyTorch models follow this convention to build a forward pass: https://github.com/davidtvs/pytorch-lr-finder/blob/acc5e7ee7711a460bf3e1cc5c5f05575ba1e1b4b/torch_lr_finder/lr_finder.py#L376-L378 But since it's just a convention rather than a syntax-level design, we want to minimize the effort of rewriting code once the inputs of `model.forward()` or the outputs of `dataset.__getitem__()` become complicated. In that case, users can implement their own `inputs_labels_from_batch()` to make things work with the current implementation of `LRFinder` without modifying it, as in the sketch below.
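To make that concrete, here is a minimal sketch pairing a dict-returning dataset with the custom iterator from the question. `TrainDataLoaderIter` and `inputs_labels_from_batch()` are from torch-lr-finder itself; the `DictDataset` below is hypothetical, invented for illustration:

```python
import torch
from torch.utils.data import Dataset, DataLoader
from torch_lr_finder import TrainDataLoaderIter

class DictDataset(Dataset):
    """Hypothetical dataset whose __getitem__ returns a dict."""
    def __init__(self, n=64):
        self.imgs = torch.randn(n, 3, 32, 32)
        self.targets = torch.randint(0, 10, (n,))

    def __len__(self):
        return len(self.imgs)

    def __getitem__(self, idx):
        # Returning a dict here is what makes batch_data["img"] valid
        # downstream; the default collate_fn batches each key separately.
        return {"img": self.imgs[idx], "target": self.targets[idx]}

class CustomTrainIter(TrainDataLoaderIter):
    def inputs_labels_from_batch(self, batch_data):
        # Unpack the collated dict into the (inputs, labels) pair
        # that LRFinder expects.
        return batch_data["img"], batch_data["target"]

custom_train_iter = CustomTrainIter(DataLoader(DictDataset(), batch_size=8))
# lr_finder.range_test(custom_train_iter, end_lr=100, num_iter=100)
```

If the dataset instead returns plain `(inputs, labels)` tuples, the default iterator already handles it and no subclass is needed; feeding such tuple batches to the dict-indexing override above is exactly what reproduces the `TypeError` in the traceback.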
Second, it has to be able to iterate over the training dataset indefinitely. Since we cannot guarantee that the length of the dataset is sufficient for running a learning rate range test with the given settings (related to `num_iter` and `batch_size`), we have to keep the dataset accessible until the range test is finished. That's why we made `TrainDataLoaderIter` like this: https://github.com/davidtvs/pytorch-lr-finder/blob/acc5e7ee7711a460bf3e1cc5c5f05575ba1e1b4b/torch_lr_finder/lr_finder.py#L51-L67 (a simplified sketch follows below).

Hope this information helps, and good luck on your learning journey. 😉
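The auto-reset behaviour boils down to something like the following sketch. It is a simplified stand-in for the linked `DataLoaderIter` code, not the library's actual implementation (which also unpacks batches via `inputs_labels_from_batch()` and exposes the `auto_reset` flag visible in the traceback):

```python
from torch.utils.data import DataLoader

class InfiniteDataLoaderIter:
    """Simplified sketch: restart the DataLoader whenever it is exhausted,
    so a range test can draw more batches than one epoch contains."""
    def __init__(self, data_loader: DataLoader):
        self.data_loader = data_loader
        self._iterator = iter(data_loader)

    def __iter__(self):
        return self

    def __next__(self):
        try:
            return next(self._iterator)
        except StopIteration:
            # One pass over the dataset has ended; start a fresh one.
            self._iterator = iter(self.data_loader)
            return next(self._iterator)
```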