[Deprecated] Size mismatch when setting knowledge_usage to separate

See original GitHub issue

During evaluation, I receive the following error:

...
  File "/home/pouramini/UnifiedSKG/utils/trainer.py", line 298, in prediction_step
    **gen_kwargs,
  File "/home/pouramini/UnifiedSKG/models/unified/prefixtuning.py", line 289, in generate
    bsz=bsz, sample_size=kwargs['num_beams'], description=description_representation, knowledge=knowledge_representation,
  File "/home/pouramini/UnifiedSKG/models/unified/prefixtuning.py", line 120, in get_prompt
    past_key_values = torch.cat([past_key_values, self.knowledge_trans(knowledge)], dim=1)
RuntimeError: Sizes of tensors must match except in dimension 0. Got 4 and 16 (The offending index is 0)
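
The traceback boils down to torch.cat refusing to concatenate along dim=1 when any other dimension differs. A toy reproduction (the batch, sequence, and hidden sizes below are hypothetical, chosen only to mirror the 4-vs-16 mismatch reported above):

    import torch

    # The prefix has been expanded to bsz * num_beams (e.g. 4 * 4 = 16),
    # while the knowledge representation still has the original bsz = 4.
    prefix = torch.randn(16, 10, 64)     # (bsz * num_beams, prefix_len, hidden)
    knowledge = torch.randn(4, 8, 64)    # (bsz, knowledge_len, hidden)

    # cat along dim=1 requires every other dimension to match, so the
    # 4-vs-16 batch dimension raises a RuntimeError like the one above.
    torch.cat([prefix, knowledge], dim=1)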

It seems that when num_beams is greater than 1, it is passed to the get_prompt method as sample_size and multiplied into the batch size, which is where the mismatch occurs:

    def get_prompt(self, bsz=None, sample_size=1, description=None, knowledge=None):
        old_bsz = bsz
        bsz = bsz * sample_size  # with num_beams > 1, bsz becomes old_bsz * num_beams
        input_tokens = self.input_tokens.unsqueeze(0).expand(bsz, -1)
        temp_control = self.wte(input_tokens)
        if description is not None:
            # description is expanded to the new bsz via repeat_interleave
            temp_control = temp_control + description.repeat_interleave(sample_size, dim=0).unsqueeze(1)
        past_key_values = self.control_trans(temp_control)  # bsz, seqlen, layer*emb
        if knowledge is not None:
            # knowledge still has the original old_bsz batch dimension, so the cat below fails
            past_key_values = torch.cat([past_key_values, self.knowledge_trans(knowledge)], dim=1)
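
One way to make the shapes line up (a sketch only, not necessarily the fix the maintainer pushed; see the comments below) is to expand knowledge by sample_size the same way description is handled, before projecting it:

        if knowledge is not None:
            # Hypothetical adjustment: repeat knowledge along the batch dimension by
            # sample_size (num_beams) so knowledge_trans(knowledge) matches past_key_values.
            knowledge = knowledge.repeat_interleave(sample_size, dim=0)
            past_key_values = torch.cat([past_key_values, self.knowledge_trans(knowledge)], dim=1)

With that change, both tensors share the batch dimension bsz = old_bsz * sample_size and the concatenation along dim=1 goes through.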

Issue Analytics

  • State: closed
  • Created a year ago
  • Comments: 6 (3 by maintainers)

Top GitHub Comments

2 reactions
ChenWu98 commented, May 3, 2022

Hi,

Yes, another bug when setting knowledge_usage to separate. I tried to fix it in the latest commit.

1 reaction
ChenWu98 commented, May 5, 2022

Yea I just ran it on my server, and it also works. Thanks for pointing out this issue!

Read more comments on GitHub >
