
Error using inputs_embeds argument in TFXLNetModel

See original GitHub issue

While using TFXLNetModel (xlnet = TFXLNetModel.from_pretrained('xlnet-base-cased')), the docs say that either input_ids or inputs_embeds can be used. However, when I tried:

xlnet(inputs_embeds=embeddings, attention_mask=attn_masks)[0], it threw: ValueError: The first argument to `Layer.call` must always be passed.

I thought this was an issue with the inputs argument, which has to be passed positionally, so I tried xlnet(inputs=None, inputs_embeds=embeddings, attention_mask=attn_masks)[0], which gave me: RuntimeError: Attempting to capture an EagerTensor without building a function.

Finally, passing both inputs and inputs_embeds gave: ValueError: You cannot specify both input_ids and inputs_embeds at the same time.

Can someone suggest a workaround for this? P.S. The embeddings variable is the last_hidden_state from another BERT model, so it matches the shape the config expects for inputs_embeds. Note that the input_ids parameter doesn't help either: it gave the same error whenever I didn't pass the inputs argument.
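For reference, here is a minimal sketch of the setup described above, with placeholder shapes and a random tensor standing in for the BERT last_hidden_state (those values are assumptions, not from the original issue). On the transformers/TensorFlow versions around when this issue was filed, the final call is the one that raises the ValueError:

```python
import tensorflow as tf
from transformers import TFXLNetModel

xlnet = TFXLNetModel.from_pretrained('xlnet-base-cased')

# Placeholder inputs: in the issue, `embeddings` is the last_hidden_state of
# another BERT model; 768 matches the hidden size of xlnet-base-cased.
batch_size, seq_len, hidden_size = 2, 16, 768
embeddings = tf.random.uniform((batch_size, seq_len, hidden_size))
attn_masks = tf.ones((batch_size, seq_len), dtype=tf.int32)

# Reproduces the reported error on affected versions:
# ValueError: The first argument to `Layer.call` must always be passed.
hidden_states = xlnet(inputs_embeds=embeddings, attention_mask=attn_masks)[0]
```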

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 7 (1 by maintainers)

Top GitHub Comments

2 reactions
AndyTheFactory commented, Jan 17, 2021

Quoting the question above: xlnet(inputs_embeds=embeddings, attention_mask=attn_masks)[0] throws ValueError: The first argument to `Layer.call` must always be passed.

Try something like this: xlnet({'attention_mask': attention_mask, 'token_type_ids': token_type_ids}, inputs_embeds=embeddings, training=training)

This worked for me when I was getting the same error with a BERT model.
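For completeness, here is a sketch of that workaround adapted to the names used in the question (embeddings, attn_masks); token_type_ids is omitted since the question does not use it. Whether the dict-plus-keyword form is accepted can depend on the transformers version, so treat this as an illustration of the suggestion above rather than the canonical API:

```python
import tensorflow as tf
from transformers import TFXLNetModel

xlnet = TFXLNetModel.from_pretrained('xlnet-base-cased')

batch_size, seq_len = 2, 16
embeddings = tf.random.uniform((batch_size, seq_len, 768))  # stand-in for a BERT last_hidden_state
attn_masks = tf.ones((batch_size, seq_len), dtype=tf.int32)

# Put the tensor inputs in a dict in the first (positional) slot so that
# Keras' Layer.__call__ receives a non-None first argument, and pass
# inputs_embeds as a keyword, as in the comment above.
last_hidden_state = xlnet(
    {"attention_mask": attn_masks},
    inputs_embeds=embeddings,
    training=False,
)[0]

print(last_hidden_state.shape)  # (2, 16, 768)
```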

1 reaction
stale[bot] commented, Oct 22, 2020

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

Read more comments on GitHub >

