Big Bird generate() "local variable 'next_tokens' referenced before assignment"
See original GitHub issue. I am facing this problem when doing text summarization. I am using google/bigbird-roberta-base and I get the following error when calling model.generate(input, max_length=4096, num_beams=4, early_stopping=True, length_penalty=0.8):
Input length of input_ids is 4096, but ``max_length`` is set to 4096. This can lead to unexpected behavior. You should consider increasing ``config.max_length`` or ``max_length``.
---------------------------------------------------------------------------
UnboundLocalError                         Traceback (most recent call last)
<ipython-input-13-90a633800ba7> in <module>()
----> 1 get_ipython().run_cell_magic('time', '', <cell body below>)

The %%time cell being executed:

    i = 0
    size = 1
    out = []
    end = False
    print_iters = 100
    save_iters = 5

    while True:
        if (i + size) >= n:
            last = n
            end = True
        else:
            last = i + size

        result = make_gen(model_sum, tokens[i:last, :].detach().clone())

        for j in range(result.shape[0]):
            out.append(result[j])

        if last % (print_iters * size) == 0:
            print(last)
            gc.collect()
            torch.cuda.empty_cache()
            torch.cuda.synchronize()
        if last % (print_iters * size * save_iters) == 0:
            with open(path_output + name + ".pkl", 'wb') as f:
                pickle.dump(out, f)
            print("Saved to disk")

        if end:
            break
        i = last

6 frames
<decorator-gen-53> in time(self, line, cell, local_ns)
<timed exec> in <module>()
/usr/local/lib/python3.7/dist-packages/transformers/generation_utils.py in beam_search(self, input_ids, beam_scorer, logits_processor, stopping_criteria, max_length, pad_token_id, eos_token_id, output_attentions, output_hidden_states, output_scores, return_dict_in_generate, **model_kwargs)
   1808
   1809         sequence_outputs = beam_scorer.finalize(
-> 1810             input_ids, beam_scores, next_tokens, next_indices, pad_token_id=pad_token_id, eos_token_id=eos_token_id
   1811         )
   1812

UnboundLocalError: local variable 'next_tokens' referenced before assignment
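The warning above points at the likely cause: the input is already 4096 tokens long while ``max_length`` is 4096, so the beam-search decoding loop never executes and the later call to ``beam_scorer.finalize`` reads ``next_tokens`` before it was ever assigned. A minimal sketch of the failure pattern (not the actual transformers code; names are illustrative):

```python
# Sketch of the bug pattern: `next_tokens` is assigned only inside the
# decoding loop, which runs while cur_len < max_length. When the input
# is already max_length tokens long, the loop body is skipped and the
# final use of `next_tokens` raises UnboundLocalError.

def beam_search_sketch(input_len: int, max_length: int) -> str:
    cur_len = input_len
    while cur_len < max_length:   # never entered when input_len >= max_length
        next_tokens = "<token>"   # normally assigned on every decoding step
        cur_len += 1
    return next_tokens            # UnboundLocalError if the loop never ran
```

Calling ``beam_search_sketch(4096, 4096)`` reproduces the error, while ``beam_search_sketch(4094, 4096)`` completes normally, which matches the workaround reported below.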
Issue Analytics
- Created: 2 years ago
- Comments: 6 (2 by maintainers)
Top GitHub Comments
@vasudevgupta7 I was using AutoModelForSeq2SeqLM (that is what you use for summarization, right?). I have now changed to EncoderDecoderModel, but now I face a new error.
I also got the exact same error (output with shape...) when generating with a custom BigBird model. I fixed it by reducing the model_max_length value from 4096 to 4094, and afterwards I could use the pipeline for inference without any problem.
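The workaround in this comment amounts to keeping the tokenized input strictly shorter than ``max_length`` so ``generate()`` has room to produce at least one new token. A minimal sketch of that idea, assuming ``cap_input`` is a hypothetical helper (not a transformers API; with a real tokenizer the same effect comes from setting ``tokenizer.model_max_length = 4094`` with truncation enabled):

```python
# Hedged sketch: mimic tokenizer truncation so the encoded input is
# shorter than max_length. `cap_input` is a hypothetical stand-in for
# the tokenizer's own truncation behavior.

def cap_input(input_ids, model_max_length):
    # Drop any tokens past model_max_length, as a truncating tokenizer would.
    return input_ids[:model_max_length]

ids = list(range(4096))          # stand-in for 4096 input token ids
capped = cap_input(ids, 4094)    # 4094 < max_length == 4096
```

With the capped input, the beam-search loop runs at least twice before hitting ``max_length``, so ``next_tokens`` is assigned and the UnboundLocalError no longer occurs.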