rsync with S3 bucket: BadRequestException: 400 AuthorizationHeaderMalformed
Good morning all,
I used to rsync data from one of my GCS buckets to an S3 bucket. This morning I tried to run the same command as always, but it doesn't work anymore.
This is the error I am receiving now:
Caught non-retryable exception while listing s3://M-S3-BUCKET/: BadRequestException: 400 AuthorizationHeaderMalformed
And this one:
CommandException: Caught non-retryable exception - aborting rsync
This task is really important for us. How can I fix it?
Thanks so much for your time.
All the best, Fabio Rigato
Issue Analytics
- State:
- Created 3 years ago
- Reactions: 1
- Comments: 5
Top Results From Across the Web

- gsutil rsync Google Store with AWS S3 400 ... - Stack Overflow
  gsutil -m rsync -r gs://some-bucket ./localfolder/. It fails with the following error: BadRequestException: 400 ExcessHeaderValues <?xml ...
- [gcp] How to fix the AuthorizationHeaderMalformed error in gsutil
  gsutil ls s3://sample-bucket/ BadRequestException: 400 AuthorizationHeaderMalformed <?xml version="1.0" encoding="UTF-8"?> ...
- Rsync to AWS S3 bucket - backup - Server Fault
  For a server I am hosting a website on I want to backup the data and settings to an S3 bucket. I found...
- rsync between s3 buckets ... bucket name · Issue #346 · GoogleCloudPlatform/gsutil · GitHub
- Using BackupAssist for Rsync with Amazon S3
  Then you will need to create an S3 bucket to use for your backups. ... Create a new Rsync job and choose S3Rsync...
This is happening because we recently started using the V4 signature for S3. The new signature depends on the region information as well. Here your bucket is not present in the default region (us-east-1) but in the ap-southeast-2 region. Since the region was not specified, gsutil tries the default region when calculating the signature and fails. We will soon be adding retry logic to fix this. Meanwhile, you can try the temporary fix given below.

Region information is derived from the host. To fix this, you can add the host information in the .boto file under the s3 section, or you can pass it via the top-level options:

gsutil -o "s3:host=s3.ap-southeast-2.amazonaws.com" ...

Note that this will use ap-southeast-2 as the region for the entire execution of the command, so if you have buckets in multiple regions, you might want to run a separate command for each region.

You're welcome. I will keep this issue open to track the retry logic fix mentioned above.