KeyError in Pipeline Question Answering with LongFormer
I'm trying to do QA with LongFormer in a Pipeline. First of all, I generate the pipeline:
```python
from transformers import AutoTokenizer, AutoModelForQuestionAnswering, pipeline

MODEL_STR = "mrm8488/longformer-base-4096-finetuned-squadv2"
tokenizer = AutoTokenizer.from_pretrained(MODEL_STR)
model = AutoModelForQuestionAnswering.from_pretrained(MODEL_STR)
QA = pipeline('question-answering', model=model, tokenizer=tokenizer)
```
Then, I get the paper text from which I want the answer to come, named my_article; it's a string containing the full body of the article (around 3000 words). Next, I try:
```python
with torch.no_grad():
    answer = QA(question=question, context=articles_abstract.body_text.iloc[0])
```
And it throws the following error:
```
KeyError                                  Traceback (most recent call last)
<ipython-input-53-b5f8dc0503c8> in <module>
      1 with torch.no_grad():
----> 2     answer = QA(question=question, context=articles_abstract.body_text.iloc[0])

~/miniconda/envs/transformers_env/lib/python3.7/site-packages/transformers/pipelines.py in __call__(self, *args, **kwargs)
   1225                 ),
   1226             }
-> 1227             for s, e, score in zip(starts, ends, scores)
   1228         ]
   1229

~/miniconda/envs/transformers_env/lib/python3.7/site-packages/transformers/pipelines.py in <listcomp>(.0)
   1225                 ),
   1226             }
-> 1227             for s, e, score in zip(starts, ends, scores)
   1228         ]
   1229

KeyError: 382
```
How can I solve this issue? More importantly, what do you think is causing it?
Thanks in advance! 😃
Top GitHub Comments
@alexvaca0
Please check which architecture you are using, then go to the docs and find the documentation for the corresponding QA model; it contains an example of how to use it without the pipeline. So if your architecture is BERT, there will be a BertForQuestionAnswering model, and you'll find the example in that model's doc. Basically, what you'll need to do is this:
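Something along these lines — a minimal sketch for the Longformer checkpoint from the question, assuming a recent transformers version (callable tokenizer, model outputs exposing `.start_logits`/`.end_logits`); the `question` and `context` values are placeholders:

```python
import torch
from transformers import AutoTokenizer, LongformerForQuestionAnswering

# Checkpoint taken from the question above.
MODEL_STR = "mrm8488/longformer-base-4096-finetuned-squadv2"
tokenizer = AutoTokenizer.from_pretrained(MODEL_STR)
model = LongformerForQuestionAnswering.from_pretrained(MODEL_STR)

question = "What does the paper propose?"  # placeholder question
context = "..."  # put the full article text (my_article) here

# Encode question and context together; Longformer handles sequences up to
# 4096 tokens, so a ~3000-word article should fit without heavy truncation.
inputs = tokenizer(question, context, return_tensors="pt",
                   truncation=True, max_length=4096)

with torch.no_grad():
    outputs = model(**inputs)

# Pick the most likely start/end token positions and decode that span.
start = int(torch.argmax(outputs.start_logits))
end = int(torch.argmax(outputs.end_logits)) + 1
answer = tokenizer.decode(inputs["input_ids"][0][start:end],
                          skip_special_tokens=True)
print(answer)
```

Calling the model directly like this also sidesteps the pipeline's answer post-processing, which is where the traceback above shows the KeyError being raised.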
Hope this helps you.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.