Cloud functions python template error uploading functions
When using any of the examples provided to deploy functions, the python template inlines all the code (compressed and encoded) in a single command, as shown here
And also here
This works for the small three-line functions provided, but for anything larger (even a main.py of less than 100 LOC) it produces the following error:
- code: RESOURCE_ERROR
  location: /deployments/my-deployment/resources/upload-function-code
  message: '{"ResourceType":"gcp-types/cloudbuild-v1:cloudbuild.projects.builds.create","ResourceErrorCode":"400","ResourceErrorMessage":{"code":400,"message":"invalid build: invalid .steps field: build step 0 arg 2 too long (max: 4000)","status":"INVALID_ARGUMENT","statusMessage":"Bad Request","requestPath":"https://cloudbuild.googleapis.com/v1/projects/my-project/builds","httpMethod":"POST"}}'
This means that the cmd variable we defined above is longer than 4,000 characters.
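The arithmetic behind the limit is easy to check. This is a hypothetical sketch, not the template's actual command: base64 expands data by a factor of 4/3, so even a small compressed archive inlined into a single shell command overshoots Cloud Build's 4000-character per-argument limit.

```python
import base64

def make_inline_cmd(archive_bytes: bytes) -> str:
    # Hypothetical approximation of how the template inlines code:
    # base64-encode the (already compressed) archive and embed it
    # in one shell command.
    payload = base64.b64encode(archive_bytes).decode()
    return f"echo {payload} | base64 -d > function.zip"

# A zip of just 3 KB already yields a 4000-char base64 payload,
# so the full command exceeds the 4000-char arg limit:
cmd = make_inline_cmd(b"\x00" * 3000)
print(len(cmd) > 4000)  # True
```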
Deploying extra infrastructure to copy the files, as suggested here https://github.com/GoogleCloudPlatform/deploymentmanager-samples/issues/40#issuecomment-339759328, adds significant overhead in development time and makes the deployment unnecessarily complex. You would have to write:
- a python template to deploy small functions (as mentioned above, big functions won't fit)
- a cloud function to copy the files
- a function call resource to call the function with the inline text for each file (this can turn into a composite type or another template)
- a cloudbuild pipeline resource to zip the files in the bucket
- a cloud function resource, finally, to deploy the compressed code
On the other hand, I could copy the files by downloading them from the repository itself, as suggested here https://stackoverflow.com/a/49751362, but this would make the deployment GitLab-dependent. I would need a different template for each type of repository (and a Docker image providing the zip utility, yet another dependency).
resources:
- name: my-build
  action: gcp-types/cloudbuild-v1:cloudbuild.projects.builds.create
  metadata:
    runtimePolicy:
    - CREATE
  properties:
    steps:
    - name: gcr.io/cloud-builders/git
      args: ['clone', 'https://gitlab-ci-token:{{ properties["GITLAB_CI_TOKEN"] }}@gitlab.com/myuser/myrepo']
    - name: gcr.io/{{ env['project'] }}/zip
      args: ['-r', 'function.zip', 'function_folder']
    - name: gcr.io/cloud-builders/gsutil
      args: ['cp', 'function.zip', 'gs://my-bucket/']
    timeout: 120s
Could there be an alternative for moving files with no extra template/config files?
Issue Analytics
- Created: 5 years ago
- Reactions: 1
- Comments: 6
To work around this, I split content into chunks (note the Python 3 expectation in the range call on the second line). I then expanded the build steps by mapping cmds to an intermediate array. This modification allowed the deploy to succeed.
Though my organization uses GCP extensively, we don't rely on DM anymore. Terraform it is.