scibert-nli out of date
Environment info
- `transformers` version: 3.1.0
- Platform: macOS-10.15.5-x86_64-i386-64bit
- Python version: 3.8.5
- PyTorch version (GPU?): 1.6.0 (False)
- Tensorflow version (GPU?): not installed (NA)
- Using GPU in script?: yes
- Using distributed or parallel set-up in script?: no
Who can help
- Model owner: @gsarti
- BERT: @LysandreJik
Information
Importing the model raises two warnings.
To reproduce
Steps to reproduce the behavior:
- Just load the model from the pretrained checkpoint as in the linked example; a minimal reproduction sketch follows.
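A minimal sketch of the loading step (the model identifier and class names come straight from the warnings below):

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

# Loading the checkpoint this way triggers both warnings shown below
tokenizer = AutoTokenizer.from_pretrained("gsarti/scibert-nli")
model = AutoModelWithLMHead.from_pretrained("gsarti/scibert-nli")
```

Running this prints: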
FutureWarning: The class AutoModelWithLMHead is deprecated and will be removed in a future version. Please use AutoModelForCausalLM for causal language models, AutoModelForMaskedLM for masked language models and AutoModelForSeq2SeqLM for encoder-decoder models.
Some weights of BertForMaskedLM were not initialized from the model checkpoint at gsarti/scibert-nli and are newly initialized: ['cls.predictions.bias', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.decoder.weight', 'cls.predictions.decoder.bias', 'cls.predictions.transform.dense.weight']
Expected behavior
The model should load without any errors or warnings. I suppose the first warning is due to the deprecation and should be solved by importing `AutoModelForMaskedLM` instead (just looking for confirmation and giving the heads-up); see the sketch below. As for the second, it seems that some layers of the checkpoint are out of date and left untrained (newly, i.e. randomly, initialized), so it would be good if the owner could update the model if possible.
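If the deprecation is the whole story, the replacement suggested by the FutureWarning itself would look like this (a sketch only; it should silence the first warning but not the uninitialized-weights one):

```python
from transformers import AutoModelForMaskedLM

# Suggested replacement for the deprecated AutoModelWithLMHead
model = AutoModelForMaskedLM.from_pretrained("gsarti/scibert-nli")
```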
Hey @gsarti,
Thanks for the help. If I go back to version 2.11.0, some other dependencies may break the model itself, since some functions are already deprecated. Nevertheless, and I don't know if this makes sense, I downloaded your model with the older version (2.11.0) and imported it as a pre-trained local model in the newer version (3.1.0). The warning no longer appears and the model's output seems to be stable, always giving the same results, which didn't happen when downloading the model with the most recent transformers version (3.1.0).
@julien-c yes, I saved the model locally using `save_pretrained` from version 2.11.0, then updated the package to the latest version, 3.1.0, and used `from_pretrained` to load it up again from the local path. By doing this, the warning no longer shows (which could be a bug) but, and this is why it's interesting, the model starts outputting stable results, which didn't happen before when I downloaded the model with the latest version.