
TypeError: __init__() got an unexpected keyword argument 'force_bos_token_to_be_generated'

Environment info

  • transformers version: 4.6.1
  • Platform: Linux-3.10.0-1160.15.2.el7.x86_64-x86_64-with-glibc2.10
  • Python version: 3.8.8
  • PyTorch version (GPU?): 1.7.1 (False)
  • Tensorflow version (GPU?): 2.4.1 (False)
  • Using GPU in script?: no
  • Using distributed or parallel set-up in script?: no

Who can help

@patrickvonplaten, @patil-suraj

Information

The model I am using is BART.

The problem arises when running the official mask-filling example from the documentation.

To reproduce

Steps to reproduce the behavior:

  1. Install the transformers library
  2. Run the following code snippet, as presented in the official example:
from transformers import BartForConditionalGeneration, BartTokenizer
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large", force_bos_token_to_be_generated=True)
  3. Receive the error:
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-21-216ff3421f95> in <module>
      1 from transformers import BartForConditionalGeneration, BartTokenizer
----> 2 model = BartForConditionalGeneration.from_pretrained("facebook/bart-large", force_bos_token_to_be_generated=True)
      3 tok = BartTokenizer.from_pretrained("facebook/bart-large")
      4 example_english_phrase = "UN Chief Says There Is No <mask> in Syria"
      5 batch = tok(example_english_phrase, return_tensors='pt')

~/.conda/envs/groundwork/lib/python3.8/site-packages/transformers/modeling_utils.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
   1171         else:
   1172             with no_init_weights(_enable=_fast_init):
-> 1173                 model = cls(config, *model_args, **model_kwargs)
   1174 
   1175         if from_tf:

TypeError: __init__() got an unexpected keyword argument 'force_bos_token_to_be_generated'

Expected behavior

I expect the code not to raise an exception, and the final assertion of the example to hold.

If more information is needed, please let me know.
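
For reference, the final assertion is the one that closes the documentation example; roughly (the decoding step is assumed here, and the expected string is the one quoted in the comments below):

generated_ids = model.generate(batch['input_ids'])
assert tok.batch_decode(generated_ids, skip_special_tokens=True) == ['UN Chief Says There Is No Plan to Stop Chemical Weapons in Syria']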

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments:5 (4 by maintainers)

Top GitHub Comments

5 reactions
piegu commented, Sep 21, 2021

It’s not necessary to use forced_bos_token_id with facebook/bart-large, it’s only needed for bart-cnn models

Hello @patil-suraj,

I’m not sure I understand your answer.

  1. If you run my code with facebook/bart-large and forced_bos_token_id, you get a clear output: UN Chief Says There Is No Plan to Stop Chemical Weapons in Syria
  2. If you run it without (generated_ids = model.generate(batch['input_ids'])), you get this: UNALSO SEE
  3. There is clearly a difference (see the sketch after this comment), which shows that forced_bos_token_id has an impact with facebook/bart-large, no?
  4. bart-cnn models are models fine-tuned for summarization, no? (like https://huggingface.co/ainize/bart-base-cnn). How do you use forced_bos_token_id with them?
  5. I think the HF doc is not up to date about this: https://huggingface.co/transformers/model_doc/bart.html#mask-filling

Note: just to give an overview of this discussion, I’m searching for the right code to get the BART, mBART, and mBART-50 language models to fill multi-token masks (i.e. writing zero or more tokens in the output sentence where there is a <mask> token in the input one), with the objective of getting the full output sentence.
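
To make the comparison in points 1 to 3 concrete, here is a minimal sketch; it assumes forced_bos_token_id can be passed directly to generate (it can also be set on the model config), and the commented outputs are the ones reported above:

from transformers import BartForConditionalGeneration, BartTokenizer

model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")
tok = BartTokenizer.from_pretrained("facebook/bart-large")
batch = tok("UN Chief Says There Is No <mask> in Syria", return_tensors='pt')

# Forcing BOS as the first generated token
with_bos = model.generate(batch['input_ids'], forced_bos_token_id=tok.bos_token_id)
print(tok.batch_decode(with_bos, skip_special_tokens=True))  # ['UN Chief Says There Is No Plan to Stop Chemical Weapons in Syria']

# Without forcing it
without_bos = model.generate(batch['input_ids'])
print(tok.batch_decode(without_bos, skip_special_tokens=True))  # ['UNALSO SEE']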

5 reactions
patil-suraj commented, May 31, 2021

Hi there,

force_bos_token_to_be_generated is now deprecated; you can use the forced_bos_token_id argument instead, which should be set to the id of the token that needs to be forced as the first generated token.
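
A minimal sketch of the replacement, assuming 0 (i.e. tok.bos_token_id for facebook/bart-large) is the token id to be forced; the same value can also be passed per call as model.generate(..., forced_bos_token_id=0):

from transformers import BartForConditionalGeneration

# forced_bos_token_id replaces the removed force_bos_token_to_be_generated flag
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large", forced_bos_token_id=0)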
