
[QuestionGeneration] RuntimeError: Integer division of tensors using div or / is no longer supported

See original GitHub issue

Environment info

  • transformers version: 4.12.0
  • Platform: Linux-5.4.0-88-generic-x86_64-with-debian-buster-sid
  • Python version: 3.7.10
  • PyTorch version (GPU?): 1.6.0 (True)
  • Tensorflow version (GPU?): 2.6.0 (False)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: No
  • Using distributed or parallel set-up in script?: No


Information

Model: I am using T5ForConditionalGeneration through Questgen.ai.

The problem arises when using:

  • my own modified scripts: (give details below)
from transformers import AutoTokenizer, AutoModelWithLMHead

def predict(sentence):
    tokenizer = AutoTokenizer.from_pretrained("flexudy/t5-base-multi-sentence-doctor")
    model = AutoModelWithLMHead.from_pretrained("flexudy/t5-base-multi-sentence-doctor")

    input_text = f"repair_sentence: {sentence}</s>"
    input_ids = tokenizer.encode(input_text, return_tensors="pt")
    outputs = model.generate(input_ids, max_length=32, num_beams=1)

    return tokenizer.decode(outputs[0], skip_special_tokens=True, clean_up_tokenization_spaces=True)

The task I am working on is:

  • my own task; it fails with the following traceback:
  File "apis/text/text/boolean-question-generations/questgen/questgen.py", line 12, in predict
    output = qe.predict_boolq(payload)
  File "/opt/conda/lib/python3.7/site-packages/Questgen/main.py", line 238, in predict_boolq
    output = beam_search_decoding (input_ids, attention_masks,self.model,self.tokenizer)
  File "/opt/conda/lib/python3.7/site-packages/Questgen/encoding/encoding.py", line 18, in beam_search_decoding
    early_stopping=True
  File "/opt/conda/lib/python3.7/site-packages/torch/autograd/grad_mode.py", line 15, in decorate_context
    return func(*args, **kwargs)
  File "/opt/conda/lib/python3.7/site-packages/transformers/generation_utils.py", line 1064, in generate
    **model_kwargs,
  File "/opt/conda/lib/python3.7/site-packages/transformers/generation_utils.py", line 1839, in beam_search
    next_indices = (next_tokens / vocab_size).long()
RuntimeError: Integer division of tensors using div or / is no longer supported, and in a future release div will perform true division as in Python 3. Use true_divide or floor_divide (// in Python) instead.
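For context, recent PyTorch releases reject `/` on integer tensors because the intended semantics are ambiguous, and the error message names the two explicit replacements. A minimal sketch with plain Python ints (the tensor ops behave analogously, elementwise):

```python
# The error offers two replacements for `/` on integer operands:
a, b = 7, 2

true_div = a / b    # true division, like torch.true_divide -> 3.5
floor_div = a // b  # floor division, like torch.floor_divide -> 3

print(true_div)   # 3.5
print(floor_div)  # 3
```

Since beam search needs an integer index on the failing line, floor division is the intended operation.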

To reproduce

Steps to reproduce the behavior:

pip install git+https://github.com/ramsrigouthamg/Questgen.ai
pip install git+https://github.com/boudinfl/pke.git

python -m nltk.downloader universal_tagset
python -m spacy download en 
Then run the predict script shown above.

Expected behavior

Generation completes without raising an error.

Fix found

The failing line is here:

https://github.com/huggingface/transformers/blob/master/src/transformers/generation_utils.py#L1839

Replace the true division operator / with floor division //, so the line reads:

    next_indices = (next_tokens // vocab_size).long()
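Why floor division is correct here: before the top-k step, beam search flattens the scores to shape (batch, num_beams * vocab_size), so each selected index encodes both a beam and a token. Floor division recovers the beam index, and modulo recovers the token id. A pure-Python sketch of that index arithmetic (the vocab size is illustrative, not T5's exact value):

```python
VOCAB_SIZE = 32128  # illustrative vocabulary size

def split_flat_index(flat_index, vocab_size=VOCAB_SIZE):
    """Undo the flattening applied before top-k in beam search."""
    beam_index = flat_index // vocab_size  # what the patched line computes
    token_id = flat_index % vocab_size     # computed on the following line upstream
    return beam_index, token_id

# Token 17 selected from beam 2:
assert split_flat_index(2 * VOCAB_SIZE + 17) == (2, 17)
```

With the old `/`, beam_index would be a float (or, on older PyTorch, a hard error on integer tensors), which is why the `.long()` cast alone does not save the original code.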

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 13 (4 by maintainers)

Top GitHub Comments

mwojnars commented on Feb 4, 2022 (2 reactions)

@patrickvonplaten Thanks. It seems to be working fine now.

JeetRoy97 commented on Jan 5, 2022 (2 reactions)

I am still getting the error in generation_utils.py; the issue is not yet resolved in transformers 4.15.0.

Read more comments on GitHub >

Top Results From Across the Web

STANZA/ RuntimeError: Integer division of tensors using div or ...
STANZA/ RuntimeError: Integer division of tensors using div or / is no longer supported · Which python version are you using? · So,...

RuntimeError: Integer division of tensors using div or / is no ...
I am trying to pass Normalize to images but since it only works on single image I am using a syntax like below:...

torch.div — PyTorch master documentation
Divides each element of the input input with the scalar other and returns a new resulting tensor. Warning. Integer division using div is...

Integer division of tensors using div or / is no longer supported
This floor_divide is equivalent to Python's //, i.e. the result is an integer (the digits after the decimal point are dropped). If you do not want this kind of division and want the exact value with the decimal part, you can:...

Integer division of tensors using div or / is no longer supported ...
[pytorch] RuntimeError: Integer division of tensors using div or / is no longer supported [solved]. Posted by 木盏 on 2020-08-26 18:22:05; 11,160 views, 29 bookmarks.
