Create an adapter for DynamoDB event trigger
This ticket was created from #111.
Is your feature request related to a problem? Please describe. An adapter for the DynamoDB stream trigger, similar to what Laconia already has for Kinesis: https://laconiajs.io/docs/api/adapter#kinesis-app.
Describe the solution you’d like
const laconia = require("@laconia/core");
const dynamodb = require("@laconia/adapter").dynamodb();
const app = async records => {
console.log(records); // Prints: [{ Message: 'New item!', Id: 101 }]
};
exports.handler = laconia(dynamodb(app));
// Calls handler with an example DynamoDB Streams event
exports.handler({
Records: [
{
dynamodb: {
Keys: {
Id: {
N: "101"
}
},
NewImage: {
Message: {
S: "New item!"
},
Id: {
N: "101"
}
}
}
}
]
});
Additional context
- The most common use case of the trigger is to use NewImage. We can support OldImage in another iteration
- Watch out for the conversion of DynamoDB data types. Is there anything that we can reuse from DocumentClient? DocumentClient has an automatic conversion from DynamoDB types to JavaScript types.
- See example stream here: https://docs.aws.amazon.com/lambda/latest/dg/with-ddb.html
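To make the conversion point above concrete, here is a minimal sketch of the DynamoDB-to-JavaScript type conversion the adapter would need. The `unmarshall`/`unmarshallImage` helpers below are illustrative, not Laconia code; in the AWS SDK this is the job of `AWS.DynamoDB.Converter.unmarshall`, which is what DocumentClient uses under the hood and could likely be reused.

```javascript
// Illustrative sketch: convert a DynamoDB-typed attribute into a plain
// JavaScript value. Only a few attribute types are handled here.
const unmarshall = attr => {
  const [type, value] = Object.entries(attr)[0];
  switch (type) {
    case "S":
      return value; // string
    case "N":
      return Number(value); // numbers arrive as strings
    case "BOOL":
      return value;
    case "NULL":
      return null;
    case "L":
      return value.map(unmarshall); // list of typed attributes
    case "M":
      return unmarshallImage(value); // nested map
    default:
      throw new Error(`Unsupported DynamoDB type: ${type}`);
  }
};

// Convert a whole image (e.g. NewImage) into a plain object
const unmarshallImage = image =>
  Object.fromEntries(
    Object.entries(image).map(([key, attr]) => [key, unmarshall(attr)])
  );

// Converting the NewImage from the example event above:
const record = unmarshallImage({
  Message: { S: "New item!" },
  Id: { N: "101" }
});
console.log(record); // { Message: 'New item!', Id: 101 }
```

This is roughly the record shape the proposed `app` callback would receive.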
Issue Analytics
- Created: 4 years ago
- Comments: 8 (8 by maintainers)
Top GitHub Comments
Although, to disagree with my previous comment, all the code we have to handle DDB stream events starts with
so maybe the framework abstracting that away would be nice, but I am not sure if it would just add more complexity.
@Ankcorn There is no active pull request at the moment by @sakthivel-tw, so definitely happy for you to pick it up!
That’s an interesting thought on supporting multiple entities; I presume you’re talking about the single-table design? To be honest it’s a pattern that I haven’t explored yet, as I’ve been living with Faux SQL at the moment. But based on my understanding, most users will have their tables designed so differently that it would be quite impossible for a framework to derive what type of entity is coming. So I think even for a simple table, the conversion from DDB items to entities should belong in user land (an entity factory). This means Laconia may simply need to accept an entity-factory type of function, which would then be called when we’re adapting the event.
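The entity-factory idea could look something like the sketch below. This is purely hypothetical API design, not the author's elided proposal: `dynamodbAdapter` and `orderFactory` are invented names, and the adapter shape is only one possible way to let user land own the item-to-entity conversion.

```javascript
// Hypothetical sketch: an adapter that accepts a user-supplied entity
// factory, so the framework never guesses what entity a record represents.
const dynamodbAdapter = entityFactory => app => async event => {
  const entities = event.Records
    .map(record => record.dynamodb.NewImage)
    .map(entityFactory); // user-land conversion from DDB image to entity
  return app(entities);
};

// User land: only the factory knows this table's design.
const orderFactory = image => ({
  id: Number(image.Id.N),
  message: image.Message.S
});

// Wiring it up, mirroring the example from the issue body:
const handler = dynamodbAdapter(orderFactory)(async orders => orders);
```

With single-table design, the factory is also the natural place to branch on a discriminator attribute and return different entity types.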
I don’t actually know the best practice for batching and rate-limiting DDB streams. Is this something that should be handled at the AWS level? I’m thinking there’ll be complexity around handling a large DDB stream, where the user might hit a timeout (we could recurse, I guess, which is what Laconia is already doing, see: https://laconiajs.io/docs/guides/long-running-tasks). Would a pattern like DDB stream -> lambda -> sqs/kinesis -> lambda -> legacy be a better way to throttle the requests?
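The forwarding half of that "stream -> lambda -> sqs" pattern is mostly batching logic. The sketch below shows only that batching; `forwardToSqs` and `sqsClient` are illustrative stand-ins, not real SQS client API (a real implementation would call `SendMessageBatch`, which accepts at most 10 messages per request, hence the chunk size).

```javascript
// Split a list into batches of at most `size` items
const chunk = (items, size) =>
  items.reduce(
    (batches, item) =>
      batches[batches.length - 1].length < size
        ? (batches[batches.length - 1].push(item), batches)
        : [...batches, [item]],
    [[]]
  );

// Hypothetical forwarder: the first Lambda only chunks and re-publishes
// stream records, so the slow legacy consumer is throttled by the
// SQS-triggered Lambda's concurrency instead of by the stream itself.
const forwardToSqs = sqsClient => async event => {
  for (const batch of chunk(event.Records, 10)) {
    if (batch.length > 0) {
      await sqsClient.send(batch); // stand-in for SQS SendMessageBatch
    }
  }
};
```

For example, an event with 25 records would be forwarded as three batches of 10, 10, and 5.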
Something to bear in mind: the API for the DynamoDB adapter sadly has a design flaw; it only supports newImage at the moment. If I were to redesign this, I’d make sure that Laconia users know what requests they are handling, and even explicitly force them to be aware that they might need to handle different types of DynamoDB events. Something like (not well thought out yet…):