
Unexpected keyword argument 'trust_remote_code' when using `table-question-answering` pipeline

See original GitHub issue

System Info

  • transformers version: 4.21.1
  • Platform: Linux-5.4.188+-x86_64-with-Ubuntu-18.04-bionic
  • Python version: 3.7.13
  • Huggingface_hub version: 0.8.1
  • PyTorch version (GPU?): 1.12.1+cu113 (False)
  • Tensorflow version (GPU?): 2.8.2 (False)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: no
  • Using distributed or parallel set-up in script?: no

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, …)
  • My own task or dataset (give details below)

Reproduction

  1. Save the tokenizer and model locally:

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("microsoft/tapex-base-finetuned-wtq").save_pretrained("test")
model = AutoModelForSeq2SeqLM.from_pretrained("microsoft/tapex-base-finetuned-wtq").save_pretrained("test")

  2. Create the pipeline object:

from transformers import pipeline
tq = pipeline("table-question-answering", model="test")

  3. Receive the error:
pipeline(task, model, config, tokenizer, feature_extractor, framework, revision, use_fast, use_auth_token, device_map, torch_dtype, trust_remote_code, model_kwargs, pipeline_class, **kwargs)
    655         task=task,
    656         **hub_kwargs,
--> 657         **model_kwargs,
    658     )
    659 

/usr/local/lib/python3.7/dist-packages/transformers/pipelines/base.py in infer_framework_load_model(model, config, model_classes, task, framework, **model_kwargs)
    255 
    256             try:
--> 257                 model = model_class.from_pretrained(model, **kwargs)
    258                 if hasattr(model, "eval"):
    259                     model = model.eval()

/usr/local/lib/python3.7/dist-packages/transformers/modeling_utils.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
   2104 
   2105         with ContextManagers(init_contexts):
-> 2106             model = cls(config, *model_args, **model_kwargs)
   2107 
   2108         if device_map == "auto":

TypeError: __init__() got an unexpected keyword argument 'trust_remote_code'
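The traceback shows the failure mechanism: the pipeline factory forwards its hub-related keyword arguments (including `trust_remote_code`) down through `from_pretrained`, which in turn passes leftover kwargs into the model's `__init__`, where they are not accepted. The toy sketch below (not the actual transformers code) reproduces that mechanism with hypothetical names:

```python
# Toy illustration of the failure mode: a loader that forwards all leftover
# keyword arguments into a constructor that does not accept them.
class ToyModel:
    def __init__(self, config):
        # Accepts only `config`, like a concrete transformers model class.
        self.config = config


def load_model(model_cls, config, **model_kwargs):
    # Mirrors from_pretrained() calling cls(config, *model_args, **model_kwargs):
    # any hub kwarg that was not consumed earlier ends up here.
    return model_cls(config, **model_kwargs)


try:
    load_model(ToyModel, {"vocab_size": 10}, trust_remote_code=False)
except TypeError as err:
    # Same shape of error as in the traceback above.
    print(err)
```

The fix referenced in the comments below amounts to stripping hub-only kwargs like `trust_remote_code` before they reach the model constructor.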

Expected behavior

The model should load normally.

Issue Analytics

  • State:closed
  • Created a year ago
  • Comments:12 (8 by maintainers)

Top GitHub Comments

1 reaction
LysandreJik commented, Aug 24, 2022

The error was fixed in https://github.com/huggingface/transformers/pull/18428 @philschmid.

I’ll likely do a patch PR later today containing this fix (v4.21.2).
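Since the maintainer comment pins the fix to the v4.21.2 patch release, a script can guard against the buggy 4.21.1 before building the pipeline. The helper below is a minimal sketch under the assumption of plain numeric version strings (it is not part of transformers; a real script would more robustly use a version-parsing library):

```python
# Minimal version gate: compare the first three numeric components of a
# version string against the patch release (4, 21, 2) that contains the fix.
def is_patched(version_string):
    parts = tuple(int(p) for p in version_string.split(".")[:3])
    return parts >= (4, 21, 2)


print(is_patched("4.21.1"))  # the buggy release from the report above
print(is_patched("4.21.2"))  # the patch release containing the fix
```

Upgrading with `pip install --upgrade "transformers>=4.21.2"` should make the original reproduction succeed.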
