run_summarization_no_trainer
@sgugger Hello! I just tried to run the code from this example: https://github.com/huggingface/transformers/blob/main/examples/pytorch/summarization/run_summarization_no_trainer.py
This is my .yml file to build the environment:

```yaml
name: sum
channels:
  - pytorch
  - conda-forge
  - defaults
dependencies:
  - jupyterlab
  - pip
  - python=3.9
  - pytorch
  - tensorboard
  - torchaudio
  - torchvision
  - tqdm
  - tokenizers
  - prettytable
  - einops
  - matplotlib
  - accelerate
  - datasets
  - sentencepiece !=0.1.92
  - protobuf
  - nltk
  - py7zr
  - transformers
```
Then I ran `pip install rouge-score`, and after that simply ran the command:

```shell
accelerate launch run_summarization_no_trainer.py \
  --model_name_or_path t5-small \
  --dataset_name cnn_dailymail \
  --dataset_config '3.0.0' \
  --source_prefix 'summarize: ' \
  --output_dir output/tst-summarization
```
and got the error:

```
Traceback (most recent call last):
  File "/home/arij/anaconda3/envs/sum/bin/accelerate", line 10, in <module>
    sys.exit(main())
  File "/home/arij/anaconda3/envs/sum/lib/python3.9/site-packages/accelerate/commands/accelerate_cli.py", line 43, in main
    args.func(args)
  File "/home/arij/anaconda3/envs/sum/lib/python3.9/site-packages/accelerate/commands/launch.py", line 568, in launch_command
    simple_launcher(args)
  File "/home/arij/anaconda3/envs/sum/lib/python3.9/site-packages/accelerate/commands/launch.py", line 235, in simple_launcher
    mixed_precision = PrecisionType(args.mixed_precision.lower())
AttributeError: 'NoneType' object has no attribute 'lower'
```
How to fix it?
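For context on the traceback: `simple_launcher` calls `.lower()` on `args.mixed_precision`, which is `None` when no accelerate config supplies a value. A minimal sketch of the failure mode (a simplified stand-in, not accelerate's actual code):

```python
# Simplified stand-in for the crashing line in accelerate's launcher:
# with no config file, args.mixed_precision is None, and None.lower()
# raises the AttributeError seen in the traceback above.
mixed_precision = None  # value the launcher sees when nothing is configured

try:
    mixed_precision.lower()
except AttributeError as exc:
    print(exc)  # 'NoneType' object has no attribute 'lower'

# Supplying a value (e.g. via `accelerate config`) gives .lower()
# something to work on:
mixed_precision = "no"
print(mixed_precision.lower())  # no
```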
Issue Analytics
- State: closed
- Created: a year ago
- Comments: 23 (6 by maintainers)
Top GitHub Comments
Sure! I just followed the steps in this link. My config file is as follows (it can change as per your requirements; I just wanted to run a job on 8 GPUs in a single node, without DeepSpeed or mixed precision):
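The comment's actual config file isn't reproduced here. As an illustration only, an `accelerate` config for 8 GPUs on a single machine with no DeepSpeed and no mixed precision would look roughly like the sketch below (field names follow the `default_config.yaml` that `accelerate config` generates; treat this as an assumption, not the author's exact file):

```yaml
compute_environment: LOCAL_MACHINE
distributed_type: MULTI_GPU
mixed_precision: 'no'
num_machines: 1
num_processes: 8   # one process per GPU
machine_rank: 0
main_training_function: main
use_cpu: false
```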
I was previously running

```shell
accelerate launch script.py
```

without specifying a config file, which is when I hit the issue you reported here. Also FYI, note that the docs say the integration of Accelerate with DeepSpeed is experimental.
Yes, thanks. I am closing it.