don't require model if model server is given
See original GitHub issue

If Rasa uses a model server endpoint, it should not require a local model path when running, even if --enable-api is False (or not specified).
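For context, a model server is configured in Rasa's endpoints.yml. Below is a minimal sketch following the documented format; the server URL is a placeholder:

```yaml
# endpoints.yml -- minimal sketch of a model server entry
# (http://my-server.example/models/default is a placeholder URL)
models:
  url: http://my-server.example/models/default
  wait_time_between_pulls: 10   # poll the server for a new model every 10 seconds
```

With such an endpoint configured, rasa run --endpoints endpoints.yml can pull the model from the server, which is why a local --model path should not be mandatory in this case.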
Issue Analytics
- State:
- Created: 4 years ago
- Comments: 6 (6 by maintainers)
Top Results From Across the Web

Allow loading/unloading of specific version for a given model
My aim here is basically keeping the memory usage under control. I don't want to load all the versions of a given model...
Read more >

Deploying Deep Learning Models with Model Server
Once the model is loaded successfully, you should see the same printed in the docker logs and the status for the given model...
Read more >

Model Server Parameters - OpenVINO™ Documentation
Some models don't support the reshape operation. If the model can't be reshaped, it remains in the original parameters and all requests with incompatible ...
Read more >

A Comprehensive Guide on How to Monitor Your Models in ...
Use unsupervised learning methods to categorize model inputs and predictions, allowing you to discover cohorts of anomalous examples and ...
Read more >

TensorFlow Serving: Update model_config (add additional ...
If the model being requested has not yet been served, it is downloaded from a remote URL to a folder where the server's...
Read more >
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments

Agree. Should we update the description/title to make clear it is only about the case when rasa run is executed without --enable-api? But then we should reopen this issue, since specifying --enable-api in order to run with a remote model is quite weird.
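To make the intended behavior concrete, here is a hypothetical sketch of the startup check the comment describes (not Rasa's actual source; all names below are made up for illustration): a local model path should only be required when no model server endpoint is configured, independent of --enable-api.

```python
# Hypothetical sketch, not Rasa's real validation code.
from typing import Optional


class ModelServerConfig:
    """Minimal stand-in for a parsed 'models' entry from endpoints.yml."""

    def __init__(self, url: Optional[str] = None):
        self.url = url


def require_model_source(model_path: Optional[str],
                         model_server: Optional[ModelServerConfig]) -> None:
    """Fail fast only if neither a local model nor a remote server is given."""
    has_remote_model = model_server is not None and bool(model_server.url)
    if model_path is None and not has_remote_model:
        raise SystemExit(
            "No local model path given and no model server configured "
            "in endpoints.yml. Provide one of the two."
        )
    # Otherwise the server can start: the model comes either from the
    # local path or from the remote server, regardless of --enable-api.
```

Under this logic, running without --enable-api but with a model server endpoint would no longer be rejected, which is exactly the case the issue asks to support.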