
Error uploading to google storage: wrong content-type

See original GitHub issue

I am using boto3 to upload data to Google Storage. This works fine against MinIO locally, but it fails with medium-sized files: uploads under 10 MB seem to work, probably because boto3 does not switch to a multipart upload below that size.

Is this a bug on Google's side? Or is boto3 missing a content-type header, as the error suggests?

>>> import boto3
>>> from botocore.client import Config
>>> s3 = boto3.resource(
...     's3', endpoint_url='https://storage.googleapis.com',
...     aws_access_key_id='GOOGxxx',
...     aws_secret_access_key='xxx',
...     config=Config(signature_version='s3v4'),
... )
>>> s3.meta.client.upload_file('80-mb-file', 'rpz-staging-experiments', 'testup1508')
    ...
DEBUG:s3transfer.futures:Submitting task UploadPartTask(transfer_id=0, {'bucket': 'rpz-staging-experiments', 'key': 'testup1508', 'part_number': 10, 'extra_args': {}}) to executor <s3transfer.futures.BoundedExecutor object at 0x7f22e09224e0> for transfer request: 0.
DEBUG:s3transfer.tasks:UploadPartTask(transfer_id=0, {'bucket': 'rpz-staging-experiments', 'key': 'testup1508', 'part_number': 9, 'extra_args': {}}) about to wait for <s3transfer.futures.ExecutorFuture object at 0x7f22e08be0b8>
DEBUG:s3transfer.utils:Acquiring 0
DEBUG:s3transfer.futures:Submitting task CompleteMultipartUploadTask(transfer_id=0, {'bucket': 'rpz-staging-experiments', 'key': 'testup1508', 'extra_args': {}}) to executor <s3transfer.futures.BoundedExecutor object at 0x7f22e09224e0> for transfer request: 0.
DEBUG:s3transfer.utils:Acquiring 0
DEBUG:s3transfer.utils:Releasing acquire 0/None
DEBUG:urllib3.connectionpool:https://storage.googleapis.com:443 "POST /rpz-staging-experiments/testup1508?uploads HTTP/1.1" 400 188
DEBUG:botocore.parsers:Response headers: {'X-GUploader-UploadID': 'AEnB2UpbuUOI0xfCna_66UOniFYBFGM9jRmLg5bXqdWQcJGgNEl9FJvYjggDSfN11lxU2WTn8D53ygUrGF7hA-quabCwHg7EAg', 'Content-Type': 'application/xml; charset=UTF-8', 'Content-Length': '188', 'Vary': 'Origin', 'Date': 'Sat, 16 Nov 2019 20:10:59 GMT', 'Server': 'UploadServer', 'Alt-Svc': 'quic=":443"; ma=2592000; v="46,43",h3-Q050=":443"; ma=2592000,h3-Q049=":443"; ma=2592000,h3-Q048=":443"; ma=2592000,h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000'}
DEBUG:botocore.parsers:Response body:
b"<?xml version='1.0' encoding='UTF-8'?><Error><Code>InvalidArgument</Code><Message>Invalid argument.</Message><Details>POST object expects Content-Type multipart/form-data</Details></Error>"
    ...

Issue Analytics

  • State: closed
  • Created 4 years ago
  • Comments: 6

Top GitHub Comments

1 reaction
diehlaws commented, Nov 20, 2019

Thanks for reaching out to us @remram44. This behavior appears to be due to a difference in how S3 and Google Storage implement their object storage APIs. S3 does not require a Content-Type header on UploadPart or CompleteMultipartUpload requests, so boto3 does not include one for those requests. Judging by the error you received, however, Google Storage does require this header to be set to multipart/form-data for one of these operations.

The fix you mentioned works because it avoids the multipart upload entirely, so I recommend checking Google Storage's limits on single-PUT uploads (S3 supports objects up to 5 GB in a single PUT). Another approach is to adjust the multipart_threshold value in a TransferConfig object and pass it into the upload_file() call as shown here.

0 reactions
diehlaws commented, Nov 26, 2019

No problem! Sorry to hear that modifying the request's content-type doesn't work as expected, but I'm glad you at least have a workaround that fits your use case. Please don't hesitate to reach out again if you have additional questions for us.
