
Can't upload files with Lambda to S3

See original GitHub issue

Hi,

I'm trying to use Django with Zappa on AWS. I've created a basic project and some buckets to go with it.

I've tried many different permissions and policies on the bucket and the IAM role, with no success.

Any idea?

Context

Python 3.6 running in a virtual environment

Expected Behavior

The file is uploaded to S3 and I'm redirected back to the Django admin.

Actual Behavior

The action fails, no file is created, and I receive {"message": "Endpoint request timed out"}.

Your Environment

'storages' is in my INSTALLED_APPS.

This is in my settings file

AWS_ACCESS_KEY_ID = os.environ.get('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY')
AWS_STORAGE_BUCKET_NAME = os.environ.get('S3_BUCKET_NAME')
AWS_STORAGE_REGION = os.environ.get('S3_REGION')
AWS_HEADERS = {  # see http://developer.yahoo.com/performance/rules.html#expires
    'Expires': 'Thu, 31 Dec 2099 20:00:00 GMT',
    'Cache-Control': 'max-age=94608000',
}

MEDIA_URL = 'http://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME
DEFAULT_FILE_STORAGE = "storages.backends.s3boto.S3BotoStorage"

In my Zappa settings I have:

"aws_region": "eu-west-1",
"vpc_config": {
    "SubnetIds": ["*****", "*******"],
    "SecurityGroupIds": ["******"]
},
"profile_name": "default",
"project_name": "****",
"runtime": "python3.6",
"s3_bucket": "MY BUCKET",
"environment_variables": {
    "S3_REGION": "eu-west-1",
    "S3_BUCKET_NAME": "***********",
    "AWS_ACCESS_KEY_ID": "***********",
    "AWS_SECRET_ACCESS_KEY": "*************",
    "AWS_ACCESS_KEY": "***********",
    "AWS_SECRET_KEY": "**********************"
},
"certificate_arn": "arn:aws:acm:***********",
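
For context, these keys live under a stage name in `zappa_settings.json`; the enclosing structure (with a hypothetical `dev` stage and placeholder bucket name) looks like:

```json
{
    "dev": {
        "aws_region": "eu-west-1",
        "profile_name": "default",
        "runtime": "python3.6",
        "s3_bucket": "my-zappa-deploy-bucket"
    }
}
```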

This is my model

from utils.utils import PathAndRename

path_and_rename = PathAndRename('./test')


class Map(models.Model):
    name = models.CharField(max_length=128, default='')
    description = models.TextField(max_length=1023, default='', blank=True)

    pgm_file = models.FileField(upload_to=path_and_rename,
                                blank=False,
                                default='',
                                max_length=200)

import os
import time
from uuid import uuid4


class PathAndRename(object):

    def __init__(self, sub_path):
        self.path = sub_path

    def __call__(self, instance, filename):
        ext = filename.split('.')[-1]
        day = time.strftime('%y%m%d')
        # get filename
        if instance.pk:
            filename = '{}.{}'.format(instance.pk, ext)
        else:
            # set filename as random string
            timestamp = int(time.time())
            name = str(uuid4().hex[:3] + str(timestamp)[-5:])
            filename = '{}.{}'.format(name, ext)

        try:
            instance_type = type(instance).__name__
            username = instance.created_by.username
        except Exception:
            username = ''
            instance_type = ''

        return os.path.join(self.path, instance_type, username, day, filename)
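
As a sanity check, the callable can be exercised outside Django with stand-in classes. This is a minimal sketch, assuming a hypothetical saved object with `pk=7` created by a user "alice"; the `PathAndRename` class is repeated so the snippet runs on its own:

```python
import os
import time
from uuid import uuid4


class PathAndRename(object):
    """upload_to callable: builds <sub_path>/<model>/<user>/<yymmdd>/<name>."""

    def __init__(self, sub_path):
        self.path = sub_path

    def __call__(self, instance, filename):
        ext = filename.split('.')[-1]
        day = time.strftime('%y%m%d')
        if instance.pk:
            # Saved objects are named after their primary key.
            filename = '{}.{}'.format(instance.pk, ext)
        else:
            # Unsaved objects get a short random name.
            timestamp = int(time.time())
            name = uuid4().hex[:3] + str(timestamp)[-5:]
            filename = '{}.{}'.format(name, ext)
        try:
            instance_type = type(instance).__name__
            username = instance.created_by.username
        except Exception:
            username = ''
            instance_type = ''
        return os.path.join(self.path, instance_type, username, day, filename)


# Hypothetical stand-ins for the Django model and its User FK.
class User:
    def __init__(self, username):
        self.username = username


class Map:
    def __init__(self, pk, created_by):
        self.pk = pk
        self.created_by = created_by


path_and_rename = PathAndRename('./test')
result = path_and_rename(Map(pk=7, created_by=User('alice')), 'floorplan.pgm')
print(result)  # e.g. ./test/Map/alice/<yymmdd>/7.pgm
```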

Bucket policy

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::MY_BUCKET/*"
        },
        {
            "Sid": "AllowS3Access",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::***********:role/ZAPPA-ROLE"
                ]
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::MY_BUCKET",
                "arn:aws:s3:::MY_BUCKET/*"
            ]
        }
    ]
}
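
One easy mistake in S3 policies is the ARN format: S3 ARNs leave the region and account-id fields empty, so there are three consecutive colons before the bucket name (`arn:aws:s3:::bucket`). A small sketch of a checker (a hypothetical helper, not part of AWS tooling) that catches the two-colon variant:

```python
def is_valid_s3_arn(arn):
    """Check an S3 ARN has six colon-separated parts with empty
    region and account-id fields: arn:partition:s3:::resource."""
    parts = arn.split(':', 5)
    return (
        len(parts) == 6
        and parts[0] == 'arn'
        and parts[2] == 's3'
        and parts[3] == ''   # region must be empty for S3
        and parts[4] == ''   # account id must be empty for S3
        and parts[5] != ''   # bucket (or bucket/key) must be present
    )


print(is_valid_s3_arn('arn:aws:s3:::MY_BUCKET/*'))  # True
print(is_valid_s3_arn('arn:aws:s3::MY_BUCKET/*'))   # False: one colon short
```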

CORS

<CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>HEAD</AllowedMethod>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>DELETE</AllowedMethod>
    <AllowedHeader>*</AllowedHeader>
</CORSRule>

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 7

Top GitHub Comments

8 reactions
edgarroman commented, Nov 23, 2017

If your VPC subnets are private you will need to either:

  1. create a NAT gateway so your Lambda can reach the public S3 endpoint, or
  2. create an S3 VPC endpoint in your subnets so your Lambda can reach S3 directly.
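
The second option can be scripted with boto3. This sketch only builds the request parameters (so it runs without AWS credentials) and shows the call that would create a gateway endpoint; the VPC and route-table IDs are placeholders:

```python
# Parameters for ec2.create_vpc_endpoint (boto3). A gateway endpoint
# for S3 adds a route so Lambdas in private subnets can reach S3
# without a NAT gateway. IDs below are placeholders.
endpoint_params = {
    'VpcEndpointType': 'Gateway',
    'VpcId': 'vpc-0123456789abcdef0',             # placeholder
    'ServiceName': 'com.amazonaws.eu-west-1.s3',  # matches aws_region
    'RouteTableIds': ['rtb-0123456789abcdef0'],   # placeholder
}

# With credentials configured, the actual call would be:
#   import boto3
#   ec2 = boto3.client('ec2', region_name='eu-west-1')
#   ec2.create_vpc_endpoint(**endpoint_params)

print(endpoint_params['ServiceName'])
```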
1 reaction
talbenbasat commented, Nov 23, 2017

@edgarroman Thank you very very much!! The endpoint solved the problem


Top Results From Across the Web

AWS s3 upload from lambda not working and no error
I am trying to do a simple file upload from lambda to s3 using nodejs. The lambda execution works fine without any error,...

Upload to S3 from Lambda doesn't create file in bucket, no error
I use serverless lambda to upload files to an S3 bucket with nodejs. When calling s3. ... Upload to S3 from Lambda doesn't...

Upload to S3 From Lambda Tutorial | Step by Step Guide
In this video, I walk you through how to upload a file into s3 from a Lambda function. ...

Uploading to Amazon S3 directly from a web or mobile ... - AWS
The user uploads the file to the application server. · Call an Amazon API Gateway endpoint, which invokes the getSignedURL Lambda function.

Upload Files to AWS S3 Using a Serverless Framework
The API Gateway has a limitation for a payload of requests at 10MB, and AWS Lambda functions have the same limitation at 6MB....
