
GCP AI Platform (unified) Python export_model FailedPrecondition: 400 Exporting artifact in format `` is not supported

See original GitHub issue

I am using the Google AI Platform (Unified) Python client to export a trained model to a Google Cloud Storage bucket. I am following the sample code from export_model_sample.

The application has “owner” credentials at the moment because I want to make sure it is not a permissions issue. However, when I try to execute the sample code I am getting the following error:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 57, in error_remapped_callable
    return callable_(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/grpc/_channel.py", line 923, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "/usr/local/lib/python3.8/site-packages/grpc/_channel.py", line 826, in _end_unary_response_blocking
    raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
    status = StatusCode.FAILED_PRECONDITION
    details = "Exporting artifact for model projects/101010101010/locations/us-central1/models/123123123123123 in format `` is not supported."
    debug_error_string = "{"created":"@1611864688.554145696","description":"Error received from peer ipv4:172.217.12.202:443","file":"src/core/lib/surface/call.cc","file_line":1067,"grpc_message":"Exporting artifact for model `projects/110101010101/locations/us-central1/models/123123123123123` in format `` is not supported.","grpc_status":9}"

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/main.py", line 667, in <module>
    response = aiplatform_model_client.export_model(name=name, output_config=output_config)
  File "/usr/local/lib/python3.8/site-packages/google/cloud/aiplatform_v1beta1/services/model_service/client.py", line 937, in export_model
    response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
  File "/usr/local/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py", line 145, in __call__
    return wrapped_func(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<string>", line 3, in raise_from
google.api_core.exceptions.FailedPrecondition: 400 Exporting artifact for model projects/111101010101/locations/us-central1/models/123123123123123123 in format `` is not supported.

(I have omitted the project ID and the model ID, using 10101… and 123123… as placeholders.)

I have verified my inputs and everything seems OK:

gcs_destination_output_uri_prefix = "gs://my-bucket-vcm/model-123123123123123/tflite/2021-01-28T16:00:00.000Z/"
gcs_destination = {"output_uri_prefix": gcs_destination_output_uri_prefix}
output_config = {"artifact_destination": gcs_destination,}
name = "projects/10101010101/locations/us-central1/models/123123123123123"
response = aiplatform_model_client.export_model(name=name, output_config=output_config)
print("Long running operation:", response.operation.name)
export_model_response = response.result(timeout=300)
print("export_model_response:", export_model_response)

I am also using the latest version of google-cloud-aiplatform (0.4.0). The trained model that I am trying to export is of type MOBILE_TF_LOW_LATENCY_1.

I would like to just export the model to a cloud bucket.
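As a sanity check, the export formats a model actually supports can be read off the Model resource itself. The snippet below is only a sketch: it reuses the aiplatform_model_client and name variables from the code above and assumes the v1beta1 Model message exposes a supported_export_formats field (each entry carrying an id such as tflite); verify the field name against your client version.

# Sketch: list the export format IDs advertised by the model
# (assumes Model.supported_export_formats exists in this client version).
model = aiplatform_model_client.get_model(name=name)
for export_format in model.supported_export_formats:
    print(export_format.id)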

My requirements.txt is:

firebase-admin==4.5.0
google-api-python-client==1.12.8
google-cloud-error-reporting==1.1.0
google-cloud-secret-manager==2.1.0
google-cloud-firestore==2.0.2
google-cloud-core==1.5.0
google-cloud-pubsub==2.2.0
google-cloud-aiplatform==0.4.0
python-dateutil
pandas
numpy
pyopenssl
requests

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 6 (3 by maintainers)

Top GitHub Comments

2 reactions
zdhernandez commented, Jan 29, 2021

@dizcology That works!

It printed “Long running operation…” and then “export_model_response: done”, and I now have a bucket with one model.tflite file.

The export_model_sample example from GCP should include the “export_format_id” as well.

Thank you very much for your time and help.

0 reactions
dizcology commented, Jan 29, 2021

I think the value of artifact_destination needs to be a Python dict or a protobuf message. Please try something like the following:

output_config = {
    "artifact_destination": {"output_uri_prefix": "gs://..."},
    "export_format_id": "tflite",
}
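Putting the pieces together, a complete call might look like the sketch below. The regional endpoint, bucket path, project ID, and model ID are placeholders, the import path mirrors the v1beta1 client visible in the traceback above, and “tflite” is the export format that matches a MOBILE_TF_LOW_LATENCY_1 model (consistent with the model.tflite file produced above).

from google.cloud.aiplatform_v1beta1.services.model_service import ModelServiceClient

# Placeholder regional endpoint; it must match the model's location.
client = ModelServiceClient(
    client_options={"api_endpoint": "us-central1-aiplatform.googleapis.com"}
)

name = "projects/10101010101/locations/us-central1/models/123123123123123"
output_config = {
    "artifact_destination": {
        "output_uri_prefix": "gs://my-bucket-vcm/model-123123123123123/tflite/"
    },
    # Leaving export_format_id out is what triggers the
    # "Exporting artifact ... in format `` is not supported" error above.
    "export_format_id": "tflite",
}

# export_model returns a long-running operation; result() blocks until it completes.
response = client.export_model(name=name, output_config=output_config)
print("Long running operation:", response.operation.name)
print("export_model_response:", response.result(timeout=300))

The same pattern applies to other export formats; the export_format_id just has to be one the model actually supports.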
