Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

AccessDenied When uploading image using S3 trigger

See original GitHub issue

Describe the bug

I have updated my bucket to add a trigger, like this:

andres@andres:~/Entrepreneurship/TeVi$ amplify update storage
? Please select from one of the below mentioned services: Content (Images, audio, video, etc.)
? Who should have access: Auth and guest users
? What kind of access do you want for Authenticated users? create/update, read
? What kind of access do you want for Guest users? read
? Do you want to add a Lambda Trigger for your S3 Bucket? Yes
? Select from the following options Create a new function
Successfully added resource S3Triggerde8bc6ca locally
? Do you want to edit the local S3Triggerde8bc6ca lambda function now? No
Successfully updated resource

So, I tried to upload an image while logged in as a user, but the Lambda throws this error:

2021-01-13T03:38:01.328Z	453e3118-2293-4632-be5e-aed92f7f2a6f	ERROR	{ AccessDenied: Access Denied
    at Request.extractError (/var/runtime/node_modules/aws-sdk/lib/services/s3.js:700:35)
    at Request.callListeners (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:106:20)
    at Request.emit (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:78:10)
    at Request.emit (/var/runtime/node_modules/aws-sdk/lib/request.js:688:14)
    at Request.transition (/var/runtime/node_modules/aws-sdk/lib/request.js:22:10)
    at AcceptorStateMachine.runTo (/var/runtime/node_modules/aws-sdk/lib/state_machine.js:14:12)
    at /var/runtime/node_modules/aws-sdk/lib/state_machine.js:26:10
    at Request.<anonymous> (/var/runtime/node_modules/aws-sdk/lib/request.js:38:9)
    at Request.<anonymous> (/var/runtime/node_modules/aws-sdk/lib/request.js:690:12)
    at Request.callListeners (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:116:18)
  message: 'Access Denied',
  code: 'AccessDenied',
  region: null,
  time: 2021-01-13T03:38:01.267Z,
  requestId: '223B11A29A616F64',
  extendedRequestId:
   'B+OEiQmGOdHQTCSIqERtk7Dvto+Ohqysz+9tkB6np0u+2vfLLU/pXvGW01r3+J3gvPpjVhO3iGc=',
  cfId: undefined,
  statusCode: 403,
  retryable: false,
  retryDelay: 45.67291209389794 }

So, the Lambda is not even able to do a getObject operation 😦.
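(For reference, as far as I understand it, the trigger function's execution role would need a statement roughly like the one below to be allowed to call getObject/putObject on the bucket. This is only a sketch, not the policy Amplify generated here, and the bucket name is a placeholder:)

{
  "Effect": "Allow",
  "Action": ["s3:GetObject", "s3:PutObject"],
  "Resource": "arn:aws:s3:::<my-amplify-bucket>/*"
}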

Note: I have configured the project to use Cognito as the main auth method, with IAM allowing unauthenticated (guest) access.

Amplify CLI Version 4.40.1

Expected behavior

The Lambda created by the trigger should be allowed to execute S3 operations without problems.

Desktop (please complete the following information):

  • OS: Ubuntu
  • Node Version: v14.15.1

Additional context

The user signs in to the application through Cognito, like this: await Auth.signIn(values.email, values.password);. He then uploads the image, like this: await Storage.put('my_image.png', values.picture[0]);. The image is uploaded successfully, but when the Lambda is executed, I get the access denied error… This is the code I want to execute:

index.js:

const S3 = require('aws-sdk/clients/s3');
const sharp = require('sharp');

const s3 = new S3();

// Matches keys like public/<uuid>/picture/<anything>
const regexUserPictureRoute = /public\/\b[0-9a-f]{8}\b-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-\b[0-9a-f]{12}\b\/picture\/.*/gy;
const SIZE = 130;

exports.handler = async event => {
  // Ignore object deletions
  if (event.Records[0].eventName === 'ObjectRemoved:Delete') return;

  const Bucket = event.Records[0].s3.bucket.name;
  const Key = event.Records[0].s3.object.key;

  // Only process user profile pictures
  if (!regexUserPictureRoute.test(Key)) return;

  try {
    // Download the original image and load it into sharp
    let image = await s3.getObject({ Bucket, Key }).promise();
    image = await sharp(image.Body);

    const metadata = await image.metadata();

    // Resize the object in place only if it is larger than SIZE x SIZE
    if (metadata.width > SIZE || metadata.height > SIZE) {
      const resizedImage = await image
        .resize({ width: SIZE, height: SIZE })
        .toBuffer();
      await s3
        .putObject({
          Bucket,
          Body: resizedImage,
          Key,
        })
        .promise();
    }

    return;
  } catch (err) {
    console.error(err);
  }
};
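(If it helps to reproduce, the handler can be driven locally with a minimal fake S3 event like the one below. The bucket name and key are placeholders, and the handler will still make real S3 calls with whatever AWS credentials are configured:)

// invoke-local.js: a sketch for driving the handler with a hand-built S3 event
const { handler } = require('./index');

const fakeEvent = {
  Records: [
    {
      eventName: 'ObjectCreated:Put',
      s3: {
        // Placeholder bucket and key; only the fields the handler reads are filled in
        bucket: { name: 'my-test-bucket' },
        object: { key: 'public/123e4567-e89b-12d3-a456-426614174000/picture/avatar.png' },
      },
    },
  ],
};

handler(fakeEvent).then(() => console.log('done')).catch(console.error);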

package.json:

{
  "name": "S3Triggerde8bc6ca",
  "version": "2.0.0",
  "description": "Lambda function generated by Amplify",
  "main": "index.js",
  "license": "Apache-2.0",
  "scripts": {
    "install": "npm install --arch=x64 --platform=linux sharp"
  },
  "dependencies": {
    "sharp": "^0.27.0"
  }
}

This is what I’m doing… If you need more information, let me know 🙏

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 19 (7 by maintainers)

Top GitHub Comments

1 reaction
attilah commented, Jan 14, 2021

Closing the issue as it is answered. @MontoyaAndres, if something new comes up, feel free to comment here or open a new issue.

0 reactions
github-actions[bot] commented, May 25, 2021

This issue has been automatically locked since there hasn’t been any recent activity after it was closed. Please open a new issue for related bugs.

Looking for a help forum? We recommend joining the Amplify Community Discord server's *-help channels for those types of questions.

Read more comments on GitHub >

Top Results From Across the Web

  • amazon web services - AWS S3 thumbnail "Access denied": When uploading the object, use ACL='public-read', which will make the individual objects public. To make the bucket or objects public, you will …
  • Access denied error when trying to upload content in AWS S3 …: When trying to upload content such as images or documents or moving content from a library (with storage provider set to DB or …
  • Why do I get Access Denied errors when I use a Lambda …: I get an Access Denied error when I use an AWS Lambda function to upload files to an Amazon Simple Storage Service (Amazon …
  • Access Denied using S3 upload function: I have a lambda function with the following policy …
  • Uploading and copying objects using multipart upload: You must be allowed to perform the s3:PutObject action on an object to create multipart upload. The bucket owner can allow other principals …
