GCP AI Platform (unified) Python export_model FailedPrecondition: 400 Exporting artifact in format `` is not supported
I am using the Google AI Platform (Unified) Python client to export a trained model to a Google Cloud Storage bucket, following the sample code from export_model_sample.
The application currently has "owner" credentials, so I can rule out a permissions issue. However, when I try to execute the sample code I get the following error:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 57, in error_remapped_callable
    return callable_(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/grpc/_channel.py", line 923, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "/usr/local/lib/python3.8/site-packages/grpc/_channel.py", line 826, in _end_unary_response_blocking
    raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
    status = StatusCode.FAILED_PRECONDITION
    details = "Exporting artifact for model `projects/101010101010/locations/us-central1/models/123123123123123` in format `` is not supported."
    debug_error_string = "{"created":"@1611864688.554145696","description":"Error received from peer ipv4:172.217.12.202:443","file":"src/core/lib/surface/call.cc","file_line":1067,"grpc_message":"Exporting artifact for model `projects/110101010101/locations/us-central1/models/123123123123123` in format `` is not supported.","grpc_status":9}"
>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/main.py", line 667, in <module>
    response = aiplatform_model_client.export_model(name=name, output_config=output_config)
  File "/usr/local/lib/python3.8/site-packages/google/cloud/aiplatform_v1beta1/services/model_service/client.py", line 937, in export_model
    response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
  File "/usr/local/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py", line 145, in __call__
    return wrapped_func(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.FailedPrecondition: 400 Exporting artifact for model `projects/111101010101/locations/us-central1/models/123123123123123123` in format `` is not supported.
(I have redacted the project id and the model id, using 10101… and 123123… as placeholders.)
I have verified my inputs and everything seems OK:
gcs_destination_output_uri_prefix = "gs://my-bucket-vcm/model-123123123123123/tflite/2021-01-28T16:00:00.000Z/"
gcs_destination = {"output_uri_prefix": gcs_destination_output_uri_prefix}
output_config = {"artifact_destination": gcs_destination,}
name = "projects/10101010101/locations/us-central1/models/123123123123123"
response = aiplatform_model_client.export_model(name=name, output_config=output_config)
print("Long running operation:", response.operation.name)
export_model_response = response.result(timeout=300)
print("export_model_response:", export_model_response)
I am also using the latest version, google-cloud-aiplatform==0.4.0. The trained model that I am trying to export is of type MOBILE_TF_LOW_LATENCY_1.
I would like to just export the model to a cloud bucket.
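(Not part of the original report, but a useful sanity check for this error: the Model resource in the v1beta1 API advertises its valid export formats in a supported_export_formats field. A minimal sketch, assuming the same ModelServiceClient setup as above and the same placeholder model name:)

```python
# Sketch: before calling export_model, inspect which export formats the model
# actually advertises.  The project/model ids below are placeholders.

def get_supported_format_ids(model):
    """Return the list of export format ids advertised on a Model resource.

    Works on any object exposing `supported_export_formats`, where each item
    has an `id` attribute (as the aiplatform Model proto does).
    """
    return [fmt.id for fmt in model.supported_export_formats]


if __name__ == "__main__":
    from google.cloud import aiplatform_v1beta1

    client = aiplatform_v1beta1.ModelServiceClient(
        client_options={"api_endpoint": "us-central1-aiplatform.googleapis.com"}
    )
    model = client.get_model(
        name="projects/10101010101/locations/us-central1/models/123123123123123"
    )
    # For a MOBILE_TF_LOW_LATENCY_1 AutoML model this list should be non-empty.
    print(get_supported_format_ids(model))
```

Whatever id this prints is the value the export request needs; an empty format in the request produces exactly the "in format `` is not supported" error above.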
My requirements.txt is:
firebase-admin==4.5.0
google-api-python-client==1.12.8
google-cloud-error-reporting==1.1.0
google-cloud-secret-manager==2.1.0
google-cloud-firestore==2.0.2
google-cloud-core==1.5.0
google-cloud-pubsub==2.2.0
google-cloud-aiplatform==0.4.0
python-dateutil
pandas
numpy
pyopenssl
requests
Issue Analytics
- Created: 3 years ago
- Comments: 6 (3 by maintainers)
@dizcology That works!
It printed "Long running operation…" and then "export_model_response: … done", and I now have a bucket containing one model.tflite file.
The export_model_sample should include the "export_format_id", and so should the example in the GCP documentation.
Thank you very much for your time and help.
I think the value of artifact_destination needs to be a Python dict or a protobuf message. Please try something like the following:
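(The maintainer's snippet was not captured in this scrape. As a hedged reconstruction based on the resolution described earlier in the thread, the missing piece is the export_format_id field in the output config; "tflite" is an assumption that matches the MOBILE_TF_LOW_LATENCY_1 model type and the resulting model.tflite file:)

```python
# Sketch of the corrected export call.  The key addition over the original
# snippet is "export_format_id"; without it the service sees an empty format
# string and raises FAILED_PRECONDITION.  Bucket path and ids are placeholders.

def build_output_config(gcs_prefix, export_format_id="tflite"):
    """Build the ExportModelRequest output_config as a plain dict."""
    return {
        "export_format_id": export_format_id,
        "artifact_destination": {"output_uri_prefix": gcs_prefix},
    }


if __name__ == "__main__":
    from google.cloud import aiplatform_v1beta1

    client = aiplatform_v1beta1.ModelServiceClient(
        client_options={"api_endpoint": "us-central1-aiplatform.googleapis.com"}
    )
    name = "projects/10101010101/locations/us-central1/models/123123123123123"
    output_config = build_output_config(
        "gs://my-bucket-vcm/model-123123123123123/tflite/"
    )
    response = client.export_model(name=name, output_config=output_config)
    print("Long running operation:", response.operation.name)
    print("export_model_response:", response.result(timeout=300))
```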