Check that model is configured correctly
🚀 Feature
Check that the model is correctly set up before training.
Motivation
As of right now, there is no check that essential methods like `training_step`, `configure_optimizers`, and `train_dataloader` are properly defined on the model. For example, if no `configure_optimizers` is defined, the model will "train" but nothing will happen. If no `training_step` is defined, this error occurs:
`AttributeError: 'NoneType' object has no attribute 'items'`,
which is not a very useful error message.
Additionally, it would be useful to check that if a `val_dataloader` is defined, then `validation_step` and `validation_step_end` are also defined.
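The core primitive such a check needs is a way to tell whether the user's subclass actually overrides a base-class hook. A minimal sketch of that idea (the names here are illustrative; Lightning's own helper at the time was `Trainer.is_overriden`) compares the function object found on the subclass with the one on the base class:

```python
class Base:
    """Stand-in for a LightningModule-style base class with default hooks."""
    def training_step(self):
        pass

    def configure_optimizers(self):
        pass


class UserModel(Base):
    def training_step(self):  # overridden by the user
        return 42
    # configure_optimizers intentionally NOT overridden


def is_overridden(method_name, model, base=Base):
    # If the subclass defines its own method, the attribute looked up on
    # type(model) is a different function object than the base's.
    return getattr(type(model), method_name) is not getattr(base, method_name)


model = UserModel()
print(is_overridden('training_step', model))         # True
print(is_overridden('configure_optimizers', model))  # False
```

This is exactly the kind of test the proposal below builds on: cheap, runs before any training starts, and can produce a targeted error message instead of a downstream `AttributeError`.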
Pitch
Some checking is already done here:
https://github.com/PyTorchLightning/pytorch-lightning/blob/c32e3f3ea57dd4439255b809ed5519608a585d73/pytorch_lightning/trainer/trainer.py#L649-L668
but that check only runs when a user passes dataloaders to the `.fit()` method directly. I propose to expand on this in a separate method, something like:
```python
import warnings

def model_correctly_configured(self, model):
    if not self.is_overriden('train_dataloader', model):
        raise MisconfigurationException('No train_dataloader defined')
    if not self.is_overriden('configure_optimizers', model):
        raise MisconfigurationException('No configure_optimizers method defined')
    if not self.is_overriden('training_step', model):
        raise MisconfigurationException('No training_step method defined')

    if self.is_overriden('val_dataloader', model):
        if not self.is_overriden('validation_step', model):
            raise MisconfigurationException('Defined val_dataloader but no validation_step')
        elif not self.is_overriden('validation_epoch_end', model):
            warnings.warn('You have a val_dataloader and validation_step; '
                          'you may also want to define validation_epoch_end')

    if self.is_overriden('test_dataloader', model):
        if not self.is_overriden('test_step', model):
            raise MisconfigurationException('Defined test_dataloader but no test_step')
        elif not self.is_overriden('test_epoch_end', model):
            warnings.warn('You have a test_dataloader and test_step; '
                          'you may also want to define test_epoch_end')
```
possibly with even more checks.
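To show what this buys the user, here is a hypothetical, self-contained version of the check (the function name `check_model_configured` and the `BadModel` class are made up for illustration; `MisconfigurationException` stands in for Lightning's exception of the same name). Forgetting `configure_optimizers` now fails fast with a clear message instead of silently not training:

```python
class MisconfigurationException(Exception):
    """Stand-in for pytorch_lightning's MisconfigurationException."""


def check_model_configured(model,
                           required=('train_dataloader',
                                     'configure_optimizers',
                                     'training_step')):
    # Fail before training starts if an essential method is missing.
    for name in required:
        if not callable(getattr(model, name, None)):
            raise MisconfigurationException(f'No {name} method defined')


class BadModel:
    def train_dataloader(self): ...
    def training_step(self, batch, batch_idx): ...
    # configure_optimizers forgotten


try:
    check_model_configured(BadModel())
except MisconfigurationException as e:
    print(e)  # No configure_optimizers method defined
```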
Issue Analytics
- Created: 4 years ago
- Reactions: 2
- Comments: 8 (8 by maintainers)
I think what I propose and what `fast_dev_run` is trying to achieve are a bit different. `fast_dev_run` can be used to check that the user has no bug in `training_step` and `validation_step`. This feature is an even more basic check: are `training_step`, `validation_step`, `configure_optimizers`, etc. defined at all?

For example, right now if I run the cpu template in basic_examples with `fast_dev_run=True` and remove the `configure_optimizers()` method, no error is thrown, meaning that from a user standpoint I have no idea that something is wrong, even though the model is not training at all.

I want to note that I consider this a feature for newcomers to the pytorch-lightning framework, since they are probably the users most likely to forget to implement specific methods.