
AutoModel.from_pretrained with torchscript flag raises a TypeError: __init__() got an unexpected keyword argument 'torchscript'


🐛 Bug

Information

Model I am using: BertModel and AutoModel

Language I am using the model on: English

To reproduce

Steps to reproduce the behavior:

# Module paths are as of transformers 2.9.0 (flat layout, pre-4.x)
from transformers.modeling_auto import AutoModel
from transformers.modeling_bert import BertModel

# Works: the concrete model class accepts the torchscript kwarg
bert_model = BertModel.from_pretrained('bert-base-uncased', torchscript=True)
# Fails: TypeError: __init__() got an unexpected keyword argument 'torchscript'
bert_model = AutoModel.from_pretrained('bert-base-uncased', torchscript=True)

Observed behaviour

bert_model = AutoModel.from_pretrained('bert-base-uncased', torchscript=True) raises a TypeError: __init__() got an unexpected keyword argument 'torchscript'

Expected behaviour

AutoModel.from_pretrained should successfully create a BertModel object with torchscript=True, just as BertModel.from_pretrained does.

Environment info

  • transformers version: 2.9.0
  • Platform: Darwin Kernel Version 19.4.0: Wed Mar 4 22:28:40 PST 2020; root:xnu-6153.101.6~15/RELEASE_X86_64
  • Python version: 3.6.10
  • PyTorch version: 1.3.1
  • Tensorflow version: Not applicable
  • Using GPU in script?: Not applicable
  • Using distributed or parallel set-up in script?: Not applicable

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Reactions: 2
  • Comments: 5 (1 by maintainers)

Top GitHub Comments

LysandreJik commented, Jul 28, 2020 (1 reaction)

Hi! This was fixed by https://github.com/huggingface/transformers/pull/5665. Could you install from source and try again?
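Installing from source typically means pulling the latest main branch straight from GitHub; a minimal sketch (the exact pip invocation is an assumption, not quoted from the comment):

```shell
# Upgrade transformers to the current development build, which includes the fix
pip install --upgrade git+https://github.com/huggingface/transformers.git
```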

benschreiber commented, Jul 15, 2020 (1 reaction)

It appears that AutoConfig accepts a torchscript keyword parameter. The AutoConfig object can then be passed as the config keyword parameter to AutoModel. Hope this workaround helps @jonsnowseven
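The workaround above can be sketched as follows; the model name is taken from the issue's reproduction steps, and running it requires network access to download the config and weights:

```python
from transformers import AutoConfig, AutoModel

# Set torchscript on the config first: AutoConfig.from_pretrained forwards
# extra kwargs to the underlying config class, which does accept torchscript.
config = AutoConfig.from_pretrained('bert-base-uncased', torchscript=True)

# Pass the pre-built config to AutoModel instead of the raw kwarg that
# triggered the TypeError.
model = AutoModel.from_pretrained('bert-base-uncased', config=config)
```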
