
Unable to post to S3 from client

See original GitHub issue

Thank you for putting together this script. I’ve implemented it in an Angular / NodeJS app I’ve been working on to upload images to an S3 bucket with CORS enabled.

I’m having trouble uploading from the client side. I think I’ve narrowed the problem down to my request headers including an Authorization header, while your demo script’s do not. Uploading from your tool to my bucket works fine, by the way.

Response I’m getting from S3:

<?xml version="1.0" encoding="UTF-8"?>
<Error>
    <Code>InvalidArgument</Code>
    <Message>Unsupported Authorization Type</Message>
    <ArgumentName>Authorization</ArgumentName>
    <ArgumentValue>Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJfaWQiOiI1NDUyOWJhMGZkZDUyMjAwMDA2MGI2ZjciLCJpYXQiOjE0MTQ3MDAzNTY0NTAsImV4cCI6MTQxNDcxODM1NjQ1MH0.-7j3JrLYLwmdg4DDUNH_y_IE_aRFpgBivKYJ6g1yxp8</ArgumentValue>
</Error>

Here’s a comparison of the POST request from your tool (top) and my app (bottom), both running from localhost.



I’ll post some of my code below; if you have any ideas or want more information, let me know. It’d be great to get this resolved.

Client-side controller for the upload:

$scope.onFileSelect = function ($files) {
    $scope.files = $files;
    $scope.upload = [];

    for (var i = 0; i < $files.length; i++) {
        var file = $files[i];
        file.progress = parseInt(0);
        (function (file, i) {
            $http.get('/api/aws/s3Policy?mimeType=' + file.type).success(function (response) {
                var s3Params = response;
                var s3Bucket = '<MY_BUCKET_NAME>';

                $scope.upload[i] = $upload.upload({
                    url: 'https://' + s3Bucket + '.s3.amazonaws.com/',
                    method: 'POST',
                    data: {
                        'key' : Math.round(Math.random() * 10000) + '$$' + file.name,
                        'acl' : 'public-read-write',
                        'Content-Type' : file.type != '' ? file.type : 'application/octet-stream',
                        'AWSAccessKeyId' : s3Params.AWSAccessKeyId,
                        'policy' : s3Params.s3Policy,
                        'signature' : s3Params.s3Signature,
                        'filename' : file.name
                    },
                    file: file
                })
                .then(function (response) {
                    file.progress = parseInt(100);
                    if (response.status === 201) {
                        var data = xml2json.parser(response.data),
                            parsedData = {
                                location: data.postresponse.location,
                                bucket: data.postresponse.bucket,
                                key: data.postresponse.key,
                                etag: data.postresponse.etag
                            };
                    } else {
                        alert('Upload Failed');
                    }
                }, null, function (evt) {
                    file.progress = parseInt(100.0 * evt.loaded / evt.total);
                });
            });
        }(file, i));
    }
};
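
The object key sent to S3 above is a random numeric prefix, the literal separator `$$`, then the file name. A minimal sketch of that naming scheme in plain Node (`buildKey` is a hypothetical helper, not part of the app):

```javascript
// Hypothetical helper mirroring the 'key' field built in the upload data:
// a random integer prefix (0..10000), the literal '$$', then the file name.
function buildKey(fileName) {
    return Math.round(Math.random() * 10000) + '$$' + fileName;
}

var key = buildKey('photo.jpg');
console.log(key); // e.g. "4821$$photo.jpg"
```

Because the policy below only uses `["starts-with", "$key", ""]`, any key of this shape is accepted by the policy.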

Route for handling /api/aws/s3Policy requests:

router.get('/s3Policy', controller.getS3Policy);

Server-side AWS controller:

'use strict';

/**
 * Import required packages
 */
var _ = require('lodash'),
    AWS = require('aws-sdk'),
    crypto = require('crypto');

/**
 * Load the S3 information from the environment variables.
 */
var AWS_ACCESS_KEY = process.env.AWS_ACCESS_KEY,
    AWS_SECRET_KEY = process.env.AWS_SECRET_KEY,
    S3_BUCKET = process.env.S3_BUCKET;

var getExpiryTime = function () {
    var _date = new Date();
    return '' + (_date.getFullYear()) + '-' + (_date.getMonth() + 1) + '-' +
        (_date.getDate() + 1) + 'T' + (_date.getHours() + 3) + ':' + '00:00.000Z';
};

var createS3Policy = function (contentType, callback) {
    var s3Policy = {
        "expiration": getExpiryTime(),
        "conditions": [
            {"bucket": S3_BUCKET},
            ["starts-with", "$key", ""],
            {"acl": "public-read-write"},
            ["starts-with", "$Content-Type", ""],
            ["starts-with", "$filename", ""],
            ["content-length-range", 0, 524288000]
        ]
    };

    // stringify and encode the policy
    var stringPolicy = JSON.stringify(s3Policy);
    var base64Policy = new Buffer(stringPolicy, 'utf-8').toString('base64');

    // sign the base64 encoded policy
    var signature = crypto.createHmac('sha1', AWS_SECRET_KEY)
                    .update(new Buffer(base64Policy, 'utf-8')).digest('base64');

    // build the results object
    var s3Credentials = {
        s3Policy: base64Policy,
        s3Signature: signature,
        AWSAccessKeyId: AWS_ACCESS_KEY
    };

    // send it back
    callback(s3Credentials);
};

exports.getS3Policy = function (req, res) {
    createS3Policy(req.query.mimeType, function (creds, err) {
        if (!err) {
            return res.send(200, creds);
        } else {
            return res.send(500, err);
        }
    });
};

And the form code in my view:

<div class="form-group">
    <label for="photos">Input Picture</label>
    <input id="photos" type="file" class="btn" ng-file-select="onFileSelect($files)" multiple>
    <p class="help-block">Select pictures</p>
    <div ng-repeat="file in files" class="container">
        <div class="progress">
            <div class="progress-bar" role="progressbar" style="width:{{file.progress}}%;">
                {{file.name}} : {{file.progress}}
            </div>
        </div>
        <button class="btn btn-ttc" type="button" ng-click="abort($index)" ng-show="file.progress != 100">Abort</button>
    </div>
</div>

Issue Analytics

  • State: closed
  • Created: 9 years ago
  • Comments: 6 (4 by maintainers)

Top GitHub Comments

timelf123 commented, Dec 14, 2015

If you’re using angular-jwt, it adds the Bearer auth token to all requests unless this is turned off per request with skipAuthorization: true:

$http({
    url: 'http://example.local',
    skipAuthorization: true,
    method: 'GET'
});

nukulb commented, Nov 16, 2014

@mciccarelli My commit above fixes it. You can change your client-side controller slightly to fix the issue by removing the “Authorization” header specifically for the upload-to-S3 request.

To be clear, the fix has been made in this example repo -

It seems to work well in my test case, please verify and let us know how it went.

@danialfarid I think the solution is sufficient, hopefully you can close out the issue.
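
The shape of that fix can be sketched framework-free (the helper name and config values below are illustrative, not the actual commit): drop the Authorization header from the S3 request config so the interceptor-added JWT never reaches S3.

```javascript
// Hypothetical helper: return a copy of a request config with the
// Authorization header removed, so the app's JWT never reaches S3.
// Other headers and the original config are left untouched.
function stripAuthHeader(config) {
    var headers = Object.assign({}, config.headers);
    delete headers.Authorization;
    return Object.assign({}, config, { headers: headers });
}

var uploadConfig = {
    url: 'https://my-bucket.s3.amazonaws.com/',
    method: 'POST',
    headers: { Authorization: 'Bearer abc123', 'Content-Type': 'image/png' }
};

var clean = stripAuthHeader(uploadConfig);
console.log(clean.headers); // { 'Content-Type': 'image/png' }
```

S3 POST uploads authenticate via the signed policy in the form fields, so the bearer token is not just unnecessary on that request — it is what triggers the “Unsupported Authorization Type” error.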
