
The following `model_kwargs` are not used by the model: `use_gpu`


A fresh installation shows this error:

The following `model_kwargs` are not used by the model: ['use_gpu'] (note: typos in the generate arguments will also show up in this list)
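This message typically means that an unsupported keyword argument (here `use_gpu`) was forwarded to Hugging Face transformers' `generate()`, which validates its `model_kwargs` against the model's signature. A minimal sketch of how the error arises and how to select the GPU instead, assuming a transformers model (the model name and prompt below are only placeholders):

```python
# Minimal sketch, assuming the error comes from Hugging Face transformers'
# generate(), which rejects unknown keyword arguments. "gpt2" is a placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("def hello_world():", return_tensors="pt")

# This reproduces the error: `use_gpu` is not a generate()/forward() argument.
# model.generate(**inputs, use_gpu=True)
#   -> ValueError: The following `model_kwargs` are not used by the model: ['use_gpu']

# Instead, place the model and the inputs on the GPU explicitly.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)
inputs = {k: v.to(device) for k, v in inputs.items()}

outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```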

Issue Analytics

  • State: open
  • Created: a year ago
  • Reactions: 17
  • Comments: 10

Top GitHub Comments

3 reactions
reshinthadithyan commented, Nov 19, 2022

Thanks for the patience; the team was busy with other work. I just sent a PR at #28, which should fix the issue. I’ll ping once the PR is merged.

0 reactions
GitHdu commented, Nov 23, 2022

@GitHdu, can you paste the error stack when encountering this issue, or help us reproduce the bug? I tested it on a small file, so token length might not have been an issue there. Okay, let me try it on a big file with a large context length. Thanks.

Sorry, I do not know how to paste the error stack; when I input the code there is only an error tip: Error: Input is too long for this model, shorten your input or use 'parameters': {'truncation': 'only_first'} to run the model only on the first part.
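That tip looks like the response format of the Hugging Face hosted Inference API, which accepts a `parameters` object alongside the input. A minimal sketch of passing the suggested truncation option, assuming that API; the model URL, token, and file name are placeholders:

```python
# Minimal sketch, assuming the "Input is too long" tip comes from the
# Hugging Face Inference API. URL, token, and file name are placeholders.
import requests

API_URL = "https://api-inference.huggingface.co/models/<model-id>"
headers = {"Authorization": "Bearer <your-token>"}

with open("big_file.py") as f:   # the long input that triggered the error
    long_input = f.read()

payload = {
    "inputs": long_input,
    # Ask the server to truncate the input instead of rejecting it,
    # as the error message suggests; alternatively, shorten the input.
    "parameters": {"truncation": "only_first"},
}

response = requests.post(API_URL, headers=headers, json=payload)
print(response.json())
```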


Top Results From Across the Web

  • ValueError: The following `model_kwargs` are not used by the ...
    When using "return_length=True" with the tokenizer, the error is given. This is from a change in a recent version and did not happen... (see the sketch after this list)
  • ValueError: The following `model_kwargs` are not used by the ...
    When I try to run my code for the Donut for DocVQA model, I got the following error "" ...
  • NLP Pretrained model model doesn't use GPU when making ...
    I am using Marian MT Pretrained model for Inference for machine Translation task integrated with a Flask service. I am running the...
  • NumPyro Documentation - Read the Docs
    Let us infer the values of the unknown parameters in our model by running MCMC using the No-U-Turn Sampler (NUTS).
  • Uber AI Labs - NumPyro documentation
    Let us infer the values of the unknown parameters in our model by running MCMC using the No-U-Turn Sampler (NUTS).
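The first result above points at one concrete trigger for the same `model_kwargs` error: tokenizing with `return_length=True` adds a `length` entry to the encoding, and forwarding the whole encoding to `generate()` makes the stricter validation in recent transformers versions reject it. A minimal sketch, assuming Hugging Face transformers ("gpt2" is a placeholder):

```python
# Minimal sketch of the return_length=True trigger, assuming Hugging Face
# transformers. Dropping the extra "length" key avoids the model_kwargs error.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("print('hi')", return_tensors="pt", return_length=True)

# inputs now contains "length"; model.generate(**inputs) would raise
# ValueError: The following `model_kwargs` are not used by the model: ['length']
inputs.pop("length", None)

outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```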
