Seq2seq finetune example: "Please save or load state of the optimizer"
When running the example scripts in examples/seq2seq (finetune_bart and finetune_t5), I get the warning messages shown under "To reproduce" below.
Environment info
- transformers version: 3.3.1
- Platform: Linux-4.15.0-66-generic-x86_64-with-glibc2.10
- Python version: 3.8.5
- PyTorch version (GPU?): 1.6.0 (True)
- Tensorflow version (GPU?): not installed (NA)
- Using GPU in script?: ran both with and without GPUs; same result
- Using distributed or parallel set-up in script?: no
Who can help
@sshleifer for examples/seq2seq and Bart; possibly also @patrickvonplaten, since this also happens with T5.
Information
Model I am using (Bert, XLNet …): the issue occurs when running BART and also when running T5 via the examples/seq2seq finetune scripts.
The problem arises when using:
- the official example scripts: (give details below)
- my own modified scripts: (give details below)
The tasks I am working on are:
- [x] an official GLUE/SQuAD task: (give the name)
- my own task or dataset: (give details below)
To reproduce
Steps to reproduce the behavior:
- clone transformers into new directory
- Set up environment (new): cd transformers && pip install -e .; cd examples && pip install -r requirements.txt
- cd seq2seq && ./finetune_t5_bart_tiny.sh
Observe that warnings are printed:
…/python3.8/site-packages/pytorch_lightning/utilities/distributed.py:37: UserWarning: Could not log computational graph since the model.example_input_array attribute is not set or input_array was not given
warnings.warn(*args, **kwargs)
…/python3.8/site-packages/torch/optim/lr_scheduler.py:200: UserWarning: Please also save or load the state of the optimizer when saving or loading the scheduler.
warnings.warn(SAVE_STATE_WARNING, UserWarning)
(Both the optimizer-state warning and the computational-graph logging warning appear.)
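For context, what the lr_scheduler warning asks for is that the optimizer state be checkpointed and restored together with the scheduler state. Below is a minimal plain-torch sketch of that pattern; it is not the actual Trainer/Lightning checkpoint code, and the model, file name, and hyperparameters are purely illustrative.

```python
# Minimal sketch: save and restore the optimizer state together with the
# scheduler state, which is what the UserWarning above is asking for.
import torch

model = torch.nn.Linear(4, 2)                                   # stand-in model
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1)

# Save all three state dicts in one checkpoint file.
torch.save(
    {
        "model": model.state_dict(),
        "optimizer": optimizer.state_dict(),
        "scheduler": scheduler.state_dict(),
    },
    "checkpoint.pt",
)

# Restore all three before resuming training.
ckpt = torch.load("checkpoint.pt")
model.load_state_dict(ckpt["model"])
optimizer.load_state_dict(ckpt["optimizer"])
scheduler.load_state_dict(ckpt["scheduler"])
```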
Expected behavior
Should not see warnings for the given example.
Other notes:
There was a related issue where supplementary files / checkpoints were not being saved, but that seems to be fixed now.
I believe this warning has been hidden on the master branch, and will be hidden in the next release. See this.

Cool! Looks like the fix originally landed in #7401. I pulled master and confirmed that this is fixed.
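For anyone who needs to stay on v3.3.1 until the next release, one local workaround is to filter just this message around the call that saves or loads the scheduler. The snippet below is only a sketch of the standard warnings machinery, not the actual change from #7401, and the scheduler shown is a stand-in for the warmup schedule the examples use.

```python
# Generic sketch: silence only this specific, known-benign UserWarning around
# scheduler checkpointing; all other warnings still surface normally.
import warnings

import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
# LambdaLR stands in for the warmup schedule used by the examples; per the
# traceback path above, lr_scheduler.py is where this warning is raised.
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lambda step: 1.0)

with warnings.catch_warnings():
    warnings.filterwarnings(
        "ignore",
        message="Please also save or load the state of the optimizer",
        category=UserWarning,
    )
    torch.save(scheduler.state_dict(), "scheduler.pt")
```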
Any notes on the computational graph warning that also pops up?
…/python3.8/site-packages/pytorch_lightning/utilities/distributed.py:37: UserWarning: Could not log computational graph since the model.example_input_array attribute is not set or input_array was not given
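For reference, that Lightning warning only means the logger was given nothing to trace, so it is harmless; if graph logging is actually wanted, the LightningModule can expose a dummy batch via its example_input_array attribute. Below is a minimal sketch with an illustrative module, not the actual finetune.py module used by the examples.

```python
# Sketch: set example_input_array so loggers that support graph logging
# (e.g. TensorBoardLogger(log_graph=True) in recent Lightning versions)
# can trace the forward pass instead of emitting the warning above.
import torch
import pytorch_lightning as pl


class TinyModule(pl.LightningModule):  # illustrative stand-in, not BART/T5
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(8, 2)
        # A dummy input with the right shape is all Lightning needs.
        self.example_input_array = torch.zeros(1, 8)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self(x), y)

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=1e-3)
```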