The following `model_kwargs` are not used by the model: `use_gpu`
See original GitHub issue. A fresh installation shows this error:
The following `model_kwargs` are not used by the model: ['use_gpu'] (note: typos in the generate arguments will also show up in this list)
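This error comes from a sanity check that compares the keyword arguments you pass in against the parameters the model actually accepts. The following is a simplified, hypothetical sketch of that kind of check (the real validation in `transformers` is more involved); the function and argument names here are illustrative, not the library's own:

```python
import inspect


def validate_model_kwargs(forward_fn, **model_kwargs):
    """Reject kwargs that the model's forward signature does not accept.

    Simplified sketch of the validation that produces the error above;
    any kwarg not in the signature (e.g. `use_gpu`, or a typo) is rejected.
    """
    accepted = set(inspect.signature(forward_fn).parameters)
    unused = [k for k in model_kwargs if k not in accepted]
    if unused:
        raise ValueError(
            f"The following `model_kwargs` are not used by the model: {unused} "
            "(note: typos in the generate arguments will also show up in this list)"
        )


# A toy "forward" that accepts only input_ids and attention_mask.
def forward(input_ids, attention_mask=None):
    return input_ids


validate_model_kwargs(forward, attention_mask=[1, 1])  # accepted kwarg: passes
try:
    validate_model_kwargs(forward, use_gpu=True)       # unsupported kwarg
except ValueError as e:
    print(e)
```

The practical fix is therefore to remove `use_gpu` (or any unrecognized kwarg) from the call, since the model's signature simply does not take it.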
Issue Analytics
- State:
- Created a year ago
- Reactions: 17
- Comments: 10
Top Results From Across the Web
ValueError: The following `model_kwargs` are not used by the ...
When using "return_length=True" with the tokenizer, the error is given. This is from a change in a recent version and did not happen...
ValueError: The following `model_kwargs` are not used by the ...
When I try to run my code for Donut for DocVQA model, I got the following error "" ...
NLP Pretrained model model doesn't use GPU when making ...
I am using Marian MT Pretrained model for Inference for machine Translation task integrated with a flask Service . I am running the...
NumPyro Documentation - Read the Docs
Let us infer the values of the unknown parameters in our model by running MCMC using the No-U-Turn Sampler (NUTS).
Uber AI Labs - NumPyro documentation
Let us infer the values of the unknown parameters in our model by running MCMC using the No-U-Turn Sampler (NUTS).
Top GitHub Comments
Thanks for the patience, the team was busy with other stuff. I just sent a PR at #28, which should ideally fix the issue. I'll ping once the PR is merged.
Sorry, I do not know how to paste the error stack; there is only an error tip when I input the code:
Error: Input is too long for this model, shorten your input or use 'parameters': {'truncation': 'only_first'} to run the model only on the first part.
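The `'only_first'` strategy referenced in that error truncates only the first sequence of an input pair (e.g. the question in a question/context pair), leaving the second intact. A self-contained sketch of that behavior, with illustrative names that are not the library's own:

```python
def truncate_only_first(first_ids, second_ids, max_length):
    """Sketch of the 'only_first' truncation strategy: when a pair of
    token sequences exceeds max_length, tokens are removed from the
    first sequence only; the second sequence is kept intact."""
    overflow = len(first_ids) + len(second_ids) - max_length
    if overflow > 0:
        if overflow >= len(first_ids):
            raise ValueError(
                "Input is too long for this model even after truncating "
                "the first sequence; shorten your input."
            )
        first_ids = first_ids[: len(first_ids) - overflow]
    return first_ids, second_ids


# Example: a 6-token first segment paired with a 4-token second segment,
# capped at 8 tokens total -> the first segment loses 2 tokens.
first, second = truncate_only_first(list(range(6)), list(range(4)), max_length=8)
```

Passing `truncation='only_first'` (as the error suggests) therefore trades away the tail of the first segment to keep the combined input within the model's limit.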