How to change config parameters when loading the model with `from_pretrained`
I have created a model by extending PreTrainedBertModel:
    import torch.nn as nn
    from pytorch_pretrained_bert.modeling import BertModel, PreTrainedBertModel

    class BertForMultiLabelClassification(PreTrainedBertModel):
        def __init__(self, config, num_labels=2):
            super(BertForMultiLabelClassification, self).__init__(config)
            self.num_labels = num_labels
            self.bert = BertModel(config)
            # dropout before the classifier (note: reads the attention dropout prob)
            self.dropout = nn.Dropout(config.attention_probs_dropout_prob)
            self.classifier = nn.Linear(config.hidden_size, num_labels)
            self.apply(self.init_bert_weights)

        # some code here ...
I am creating an instance of this model:
    model = BertForMultiLabelClassification.from_pretrained(
        args.bert_model,
        cache_dir=PYTORCH_PRETRAINED_BERT_CACHE / 'distributed_{}'.format(args.local_rank),
        num_labels=num_labels)
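For context on why num_labels can be passed here at all: in pytorch-pretrained-bert, extra keyword arguments to from_pretrained are forwarded to the model constructor rather than merged into the config. Roughly (a paraphrased sketch, not the verbatim library source; resolved_config_file and cls stand in for the library's internal variables):

    # Paraphrased internals of from_pretrained: the config always comes
    # from the archive's JSON file, and **kwargs bypass it entirely,
    # going straight to the model class (which is how num_labels
    # reaches __init__ above).
    config = BertConfig.from_json_file(resolved_config_file)
    model = cls(config, *inputs, **kwargs)
    # ... the pretrained weights are then copied into model ...

So config fields such as hidden_dropout_prob cannot be overridden through these keyword arguments alone.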
What is an effective way to modify parameters of the default config when creating an instance of BertForMultiLabelClassification (say, setting a different value for config.hidden_dropout_prob)?
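One option, staying on pytorch-pretrained-bert as in the snippet above: mutating model.config after loading does not rebuild already-constructed modules, so override the nn.Dropout modules directly. A minimal sketch (the 0.2 value is an arbitrary example):

    import torch.nn as nn

    model = BertForMultiLabelClassification.from_pretrained(
        args.bert_model, num_labels=num_labels)

    model.config.hidden_dropout_prob = 0.2  # keep the saved config consistent
    for module in model.modules():
        if isinstance(module, nn.Dropout):  # note: this also hits the attention dropouts
            module.p = 0.2
    print(model)  # the Dropout modules now report p=0.2

If you only want to touch the hidden-state dropouts and not the attention dropouts, filter by module name via named_modules() instead of by type.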
Just use the update method. For example, if you want to change the number of hidden layers, simply use config.update({'num_hidden_layers': 1}).

Oh, I find this code works; print(model) shows that the dropout changed.
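For reference, a minimal end-to-end sketch of the update approach (this assumes the newer transformers package, where PretrainedConfig has an update method; the model name and dropout value are illustrative):

    from transformers import BertConfig, BertForSequenceClassification

    # Load the default config, override the fields you care about, then
    # pass the modified config into from_pretrained.
    config = BertConfig.from_pretrained('bert-base-uncased')
    config.update({'hidden_dropout_prob': 0.2})

    model = BertForSequenceClassification.from_pretrained('bert-base-uncased', config=config)
    print(model)  # the Dropout modules now print with p=0.2

Recent transformers versions also accept config overrides directly as keyword arguments, e.g. from_pretrained('bert-base-uncased', hidden_dropout_prob=0.2), though an explicit config object makes the override easier to audit.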