Unable to fetch model from GCS
Rasa version: 1.1.5-full (docker image)
Rasa X version (if used & relevant):
Python version: ?
Operating system (windows, osx, …): linux
Issue: Rasa is unable to fetch the model from GCS when running on Kubernetes. I've deployed a chatbot with Kubernetes on GCP, and Rasa tries to fetch models.tar.gz, which (unsurprisingly) does not exist in the bucket. GOOGLE_APPLICATION_CREDENTIALS and BUCKET_NAME (let's call it toto) are set correctly. When the bot starts, I get the following traceback.
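As a quick sanity check (not Rasa code, just a diagnostic sketch using the google-cloud-storage client with the bucket and object names above, and assuming GOOGLE_APPLICATION_CREDENTIALS points at a valid service-account key):

```python
# Diagnostic sketch: check that the credentials can see the bucket and whether
# the object Rasa requests actually exists there.
from google.cloud import storage

client = storage.Client()       # picks up GOOGLE_APPLICATION_CREDENTIALS
bucket = client.bucket("toto")
print("bucket exists:", bucket.exists())

# Rasa asks for "models.tar.gz" (see the 404 below), while the uploaded model
# is named "20190715-112628.tar.gz".
for name in ("models.tar.gz", "20190715-112628.tar.gz"):
    print(name, "->", bucket.blob(name).exists())
```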
Error (including full traceback):
2019-07-15 09:35:24 DEBUG google.auth.transport.requests - Making request: POST https://oauth2.googleapis.com/token
2019-07-15 09:35:25 ERROR rasa.core.agent - Could not load model due to 404 GET https://www.googleapis.com/download/storage/v1/b/toto/o/models.tar.gz?alt=media: ('Request failed with status code', 404, 'Expected one of', <HTTPStatus.OK: 200>, <HTTPStatus.PARTIAL_CONTENT: 206>).
[2019-07-15 09:35:25 +0000] [1] [ERROR] Experienced exception while trying to serve
Traceback (most recent call last):
File "/usr/local/bin/rasa", line 10, in <module>
sys.exit(main())
File "/usr/local/lib/python3.6/site-packages/rasa/__main__.py", line 76, in main
cmdline_arguments.func(cmdline_arguments)
File "/usr/local/lib/python3.6/site-packages/rasa/cli/run.py", line 102, in run
rasa.run(**vars(args))
File "/usr/local/lib/python3.6/site-packages/rasa/run.py", line 54, in run
**kwargs
File "/usr/local/lib/python3.6/site-packages/rasa/core/run.py", line 172, in serve_application
app.run(host="0.0.0.0", port=port)
File "/usr/local/lib/python3.6/site-packages/sanic/app.py", line 1096, in run
serve(**server_settings)
File "/usr/local/lib/python3.6/site-packages/sanic/server.py", line 742, in serve
trigger_events(before_start, loop)
File "/usr/local/lib/python3.6/site-packages/sanic/server.py", line 604, in trigger_events
loop.run_until_complete(result)
File "uvloop/loop.pyx", line 1451, in uvloop.loop.Loop.run_until_complete
File "/usr/local/lib/python3.6/site-packages/rasa/core/run.py", line 211, in load_agent_on_start
action_endpoint=endpoints.action,
File "/usr/local/lib/python3.6/site-packages/rasa/core/agent.py", line 251, in load_agent
model_server=model_server,
File "/usr/local/lib/python3.6/site-packages/rasa/core/agent.py", line 911, in load_from_remote_storage
persistor.retrieve(model_name, target_path)
File "/usr/local/lib/python3.6/site-packages/rasa/nlu/persistor.py", line 51, in retrieve
self._retrieve_tar(tar_name)
File "/usr/local/lib/python3.6/site-packages/rasa/nlu/persistor.py", line 206, in _retrieve_tar
blob.download_to_filename(target_filename)
File "/usr/local/lib/python3.6/site-packages/google/cloud/storage/blob.py", line 664, in download_to_filename
self.download_to_file(file_obj, client=client, start=start, end=end)
File "/usr/local/lib/python3.6/site-packages/google/cloud/storage/blob.py", line 638, in download_to_file
_raise_from_invalid_response(exc)
File "/usr/local/lib/python3.6/site-packages/google/cloud/storage/blob.py", line 2034, in _raise_from_invalid_response
raise exceptions.from_http_status(response.status_code, message, response=response)
google.api_core.exceptions.NotFound: 404 GET https://www.googleapis.com/download/storage/v1/b/toto/o/models.tar.gz?alt=media: ('Request failed with status code', 404, 'Expected one of', <HTTPStatus.OK: 200>, <HTTPStatus.PARTIAL_CONTENT: 206>)
Command or request that led to error:
command:
- rasa
- run
- --enable-api
- --model
- 20190715-112628.tar.gz
- --log-file
- out.log
- --remote-storage
- gcs
- --credentials
- credentials.yaml
- --debug
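As far as I can tell, this is roughly what happens under the hood (a simplified sketch of the remote-storage retrieve path, not the actual rasa.nlu.persistor code; see the maintainers' explanation further down):

```python
# Simplified sketch (not the actual Rasa implementation). Because the given
# archive does not exist locally, the CLI falls back to the default model
# location "models"; the persistor then appends ".tar.gz" unconditionally, so
# GCS is asked for "models.tar.gz" instead of "20190715-112628.tar.gz".
from google.cloud import storage


def retrieve_from_gcs(model_name: str, bucket_name: str, target_path: str) -> None:
    tar_name = f"{model_name}.tar.gz"        # suffix appended even if present
    blob = storage.Client().bucket(bucket_name).blob(tar_name)
    blob.download_to_filename(target_path)   # raises NotFound (404) here


retrieve_from_gcs("models", "toto", "/app/models.tar.gz")
```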
Content of configuration file (config.yml) (if relevant):
Content of domain file (domain.yml) (if relevant):
Issue Analytics
- Created: 4 years ago
- Comments: 7 (6 by maintainers)
Top GitHub Comments
@tormath1 Thanks for raising this issue. It is indeed a bug. Are you up for fixing this yourself and opening a PR for it?
In the case of --enable-api, we validate that the model path exists (https://github.com/RasaHQ/rasa/blob/master/rasa/cli/run.py#L86). If it does not, we override the model argument with the default location models. However, if you want to load the model from remote storage, it does not need to exist locally. This needs to be changed.
Additionally, we should check whether this line still makes sense: https://github.com/RasaHQ/rasa/blob/master/rasa/nlu/persistor.py#L49. We append the tar.gz ending to the provided model name, but it seems the ending is also added when it is already present.

Yes, with great pleasure!
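A minimal sketch of the two changes discussed above (illustrative names only, not the actual Rasa source):

```python
# Sketch of the two fixes: skip the local-path fallback when a remote storage
# is configured, and append ".tar.gz" only when it is not already there.
import os


def resolve_model_path(model_path, remote_storage=None):
    """Fall back to the default "models" location only when the model is
    expected to exist locally, i.e. when no remote storage is configured."""
    if remote_storage is not None:
        return model_path                      # fetched remotely, keep as given
    if model_path is None or not os.path.exists(model_path):
        return "models"                        # local mode: use the default
    return model_path


def tar_name(model_name):
    """Append the archive suffix only if it is missing."""
    if model_name.endswith(".tar.gz"):
        return model_name
    return f"{model_name}.tar.gz"
```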