How to use model.prune_heads when using transformers.T5ForConditionalGeneration
System Info
- `transformers` version: 4.20.1
- Platform: macOS-12.4-arm64-arm-64bit
- Python version: 3.9.10
- Huggingface_hub version: 0.8.1
- PyTorch version (GPU?): 1.13.0.dev20220709 (False)
- Tensorflow version (GPU?): 2.8.0 (True)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using GPU in script?: <fill in>
- Using distributed or parallel set-up in script?: <fill in>
Who can help?
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the `examples` folder (such as GLUE/SQuAD, …)
- My own task or dataset (give details below)
Reproduction
I used this code to prune heads of a T5ForConditionalGeneration model, but it raised an error. Many thanks for your time! 😃
```python
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained('t5-base')
prune_heads = {}
prune_heads[0] = [0, 1]
model.prune_heads(prune_heads)
```
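For reference, models that do implement `_prune_heads` (BERT, for example) accept exactly this `{layer_index: [head_indices]}` mapping. A minimal sketch, using a tiny randomly initialised config (illustrative sizes, not a real checkpoint) so it runs without downloading weights:

```python
from transformers import BertConfig, BertModel

# Tiny, randomly initialised config (illustrative sizes only) so the
# sketch runs without downloading a checkpoint.
config = BertConfig(
    hidden_size=64,
    num_attention_heads=8,
    num_hidden_layers=2,
    intermediate_size=128,
    vocab_size=100,
)
model = BertModel(config)

# BertModel implements _prune_heads, so the {layer: [heads]} mapping works:
model.prune_heads({0: [0, 1]})

# Layer 0 now keeps 6 of its original 8 heads; layer 1 is untouched.
print(model.encoder.layer[0].attention.self.num_attention_heads)
```

The same call on `T5ForConditionalGeneration` fails because the T5 base model does not define `_prune_heads`, which is what the traceback below shows.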
Expected behavior
```
Traceback (most recent call last):
  File "/Users/caffrey/Documents/research/FiD/prunetest.py", line 8, in <module>
    model.prune_heads(prune_heads)
  File "/Users/caffrey/miniforge3/envs/tongji/lib/python3.9/site-packages/transformers/modeling_utils.py", line 1507, in prune_heads
    self.base_model._prune_heads(heads_to_prune)
  File "/Users/caffrey/miniforge3/envs/tongji/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1261, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'T5ForConditionalGeneration' object has no attribute '_prune_heads'
```
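Until `_prune_heads` lands on the T5 base model, one possible workaround is to call `prune_heads` directly on each `T5Attention` module, which does implement it. A sketch, again with a tiny randomly initialised config (illustrative sizes) so it runs without downloading `t5-base`; the same calls apply to a checkpoint loaded with `from_pretrained('t5-base')`:

```python
from transformers import T5Config, T5ForConditionalGeneration

# Tiny, randomly initialised config (illustrative sizes, not t5-base)
# so the sketch runs without a download.
config = T5Config(
    d_model=64,
    d_kv=8,
    d_ff=128,
    num_layers=2,
    num_heads=8,
    vocab_size=100,
)
model = T5ForConditionalGeneration(config)

# T5Attention exposes its own prune_heads, so prune per layer:
attn = model.encoder.block[0].layer[0].SelfAttention
attn.prune_heads([0, 1])

# The module now keeps 6 of its original 8 heads.
print(attn.n_heads)
```

Note this only resizes the attention projections; the relative position bias in the first block is still sized by the original `num_heads`, which may be related to the forward-pass problem the commenters mention below.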
Issue Analytics
- State:
- Created a year ago
- Comments: 14 (4 by maintainers)
Top GitHub Comments
Hi @ArthurZucker, I opened a PR here: https://github.com/huggingface/transformers/pull/19975
We can see the test on a colab https://colab.research.google.com/drive/1b9mHjtn2UxuHU_Sb_RXts12rDzbebBX0#scrollTo=hUSe4a1oOp6D
I use opendelta to visualize the pruning process. But there seems to be a forward problem.
I will have a look 🤗