Unexpected keyword argument 'trust_remote_code' when using `table-question-answering` pipeline
System Info
- transformers version: 4.21.1
- Platform: Linux-5.4.188+-x86_64-with-Ubuntu-18.04-bionic
- Python version: 3.7.13
- Huggingface_hub version: 0.8.1
- PyTorch version (GPU?): 1.12.1+cu113 (False)
- Tensorflow version (GPU?): 2.8.2 (False)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using GPU in script?: no
- Using distributed or parallel set-up in script?: no
Who can help?
No response
Information
- The official example scripts
- My own modified scripts
Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, …)
- My own task or dataset (give details below)
Reproduction

1. Save the model and tokenizer locally:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

AutoTokenizer.from_pretrained("microsoft/tapex-base-finetuned-wtq").save_pretrained("test")
AutoModelForSeq2SeqLM.from_pretrained("microsoft/tapex-base-finetuned-wtq").save_pretrained("test")
```

2. Create the pipeline object:

```python
from transformers import pipeline

tq = pipeline("table-question-answering", model="test")
```

3. The following error is received:
```
pipeline(task, model, config, tokenizer, feature_extractor, framework, revision, use_fast, use_auth_token, device_map, torch_dtype, trust_remote_code, model_kwargs, pipeline_class, **kwargs)
    655     task=task,
    656     **hub_kwargs,
--> 657     **model_kwargs,
    658 )
    659

/usr/local/lib/python3.7/dist-packages/transformers/pipelines/base.py in infer_framework_load_model(model, config, model_classes, task, framework, **model_kwargs)
    255
    256     try:
--> 257         model = model_class.from_pretrained(model, **kwargs)
    258         if hasattr(model, "eval"):
    259             model = model.eval()

/usr/local/lib/python3.7/dist-packages/transformers/modeling_utils.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
   2104
   2105     with ContextManagers(init_contexts):
-> 2106         model = cls(config, *model_args, **model_kwargs)
   2107
   2108     if device_map == "auto":

TypeError: __init__() got an unexpected keyword argument 'trust_remote_code'
```
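For context, the failure mode itself is generic Python behavior: a keyword argument meant for `from_pretrained` (a Hub-loading option such as `trust_remote_code`) is forwarded into the model's `__init__`, which does not accept it. A minimal stand-alone sketch, using a hypothetical `DummyModel` in place of the real model class:

```python
# Hypothetical stand-in for a model class whose __init__ only accepts a config.
class DummyModel:
    def __init__(self, config):
        self.config = config

# Hub-loading options like trust_remote_code are consumed by from_pretrained;
# forwarding them into the constructor triggers the same TypeError as above.
model_kwargs = {"trust_remote_code": None}

try:
    DummyModel("my-config", **model_kwargs)
except TypeError as err:
    # e.g. "__init__() got an unexpected keyword argument 'trust_remote_code'"
    print(err)
```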
Expected behavior
The model should load normally.
Issue Analytics
- State: closed
- Created a year ago
- Comments: 12 (8 by maintainers)
Top GitHub Comments
The error was fixed in https://github.com/huggingface/transformers/pull/18428 @philschmid.
I’ll likely do a patch PR later today containing this fix (v4.21.2).
Closing as solved by https://github.com/huggingface/transformers/pull/18428
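The linked PR has the authoritative diff; as a rough illustration of the pattern such a fix follows, Hub-only keyword arguments can be filtered out before the remaining kwargs are forwarded to the model constructor. The `HUB_ONLY_KWARGS` set and `split_kwargs` helper below are illustrative, not the PR's actual code:

```python
# Keys that configure how a checkpoint is fetched from the Hub; they are
# consumed by from_pretrained and must not reach the model's __init__.
# This set is illustrative, not the exact list used by transformers.
HUB_ONLY_KWARGS = {"trust_remote_code", "revision", "use_auth_token"}

def split_kwargs(model_kwargs):
    """Separate Hub-loading options from constructor options."""
    hub_kwargs = {k: v for k, v in model_kwargs.items() if k in HUB_ONLY_KWARGS}
    init_kwargs = {k: v for k, v in model_kwargs.items() if k not in HUB_ONLY_KWARGS}
    return hub_kwargs, init_kwargs

hub, init = split_kwargs({"trust_remote_code": True, "num_labels": 2})
print(hub)   # {'trust_remote_code': True}
print(init)  # {'num_labels': 2}
```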