Working With DynamoDB Triggers & Lambda - The Ultimate Guide
Written by Daniel Galati
Published on December 11th, 2022
DynamoDB Triggers - What You Need To Know
DynamoDB Triggers are an efficient way to capture record-level changes on your DynamoDB Table’s Items using AWS Lambda Functions. Using triggers, you can build powerful event-driven applications that leverage the scaling advantages of AWS Lambda.
A Trigger is a piece of code that automatically runs in response to new events occurring on your DynamoDB Stream. Before you can understand more about how to use Triggers, you first need to understand the basics of DynamoDB streams and how Triggers interact with them.
Behind the Scenes of DynamoDB Triggers
Triggers operate using DynamoDB Streams as their foundation. A stream captures a time-ordered sequence of item-level changes on your DynamoDB table and retains each record for 24 hours. This information is stored in a log which you can access.
When using Triggers, the Lambda service handles the complexity of detecting when new items are added to the log and invoking your function with the new events. With every invocation, your Lambda function receives the list of modified items in the event argument of its main handler.
Enabling Streams
Before you can start using a Trigger, you first need to enable DynamoDB Streams on your table. Enabling a stream involves selecting what type of information you would like the DynamoDB Stream to capture when a Table Item modification takes place. A list of the different options you have and a brief explanation of each is as follows:
- Key Attributes Only - Capture the partition key (and sort key if used).
- New Image - Capture the complete item as it exists after the update.
- Old Image - Capture the original item as it existed prior to the update.
- New and Old Images - Capture both the original item and the updated item independently.
As you’ll see in upcoming sections, the type you choose determines the data available in the event object when your Lambda Trigger is invoked.
After you’ve successfully enabled your Stream, you will now have the ability to create one or more Lambda Function Triggers that respond to item-level events.
To enable a Stream on a Table using CloudFormation for New and Old Images, you would use the following code:
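A minimal sketch of such a template follows; the table and attribute names here are illustrative, but the StreamSpecification block is what actually enables the stream:

```yaml
Resources:
  UsersTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: Users                  # illustrative name
      BillingMode: PAY_PER_REQUEST
      AttributeDefinitions:
        - AttributeName: Id
          AttributeType: S
      KeySchema:
        - AttributeName: Id
          KeyType: HASH
      StreamSpecification:
        StreamViewType: NEW_AND_OLD_IMAGES   # or KEYS_ONLY, NEW_IMAGE, OLD_IMAGE
```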
Note that after enabling the Stream, it will be assigned an Amazon Resource Name (ARN). The ARN of the stream is needed to link it to your Lambda Function, as we’ll see in the upcoming steps.
How To Enable Your DynamoDB Trigger
To enable a trigger, you associate your Stream’s ARN with an existing Lambda function and specify the trigger options. This is done by creating an Event Source Mapping that links your DynamoDB Stream with your function. Your function’s execution role requires the following IAM permissions in order for the Event Source Mapping to read from the stream:
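In practice, these permissions correspond to the AWS managed policy AWSLambdaDynamoDBExecutionRole, which grants the stream-read actions alongside basic CloudWatch Logs access. A sketch of an equivalent inline policy (the Resource ARN pattern is illustrative):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:DescribeStream",
        "dynamodb:GetRecords",
        "dynamodb:GetShardIterator",
        "dynamodb:ListStreams"
      ],
      "Resource": "arn:aws:dynamodb:*:*:table/*/stream/*"
    }
  ]
}
```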
There are many settings you can configure that change the behavior of your trigger. The most important settings, along with a brief explanation, are outlined below. You can read more about all available settings in the AWS Documentation on Streams.
- Event Source ARN (required) - The ARN of your DynamoDB Table’s Stream.
- Function ARN (required) - The ARN of your Lambda Function.
- Batch Size (default 100) - The maximum number of records Lambda will attempt to read from your Stream per invocation. Minimum value of 1 and maximum value of 10,000. Note that each invocation payload is also capped at 6 MB of data.
- Batch Window (default 0) - The maximum amount of time, in seconds, to spend gathering records before invoking the function.
- Starting Position (required) - Whether, upon enabling the trigger, you would like to process only new records added to the stream (LATEST) or all records that currently exist in the stream (TRIM_HORIZON).
- On Failure Destination - An SQS queue or SNS topic to which records that failed processing are sent.
- Retry Attempts - The number of times Lambda retries a failed batch.
- Enabled - Whether you would like your trigger enabled immediately upon creation or not.
Here’s an example that brings these concepts together using CloudFormation:
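A sketch of the AWS::Lambda::EventSourceMapping resource; the referenced table, function, and queue resources are assumed to be defined elsewhere in the same template:

```yaml
Resources:
  StreamTrigger:
    Type: AWS::Lambda::EventSourceMapping
    Properties:
      EventSourceArn: !GetAtt UsersTable.StreamArn   # assumes a table resource named UsersTable
      FunctionName: !Ref StreamHandlerFunction       # assumes a function resource defined elsewhere
      StartingPosition: LATEST
      BatchSize: 100
      MaximumBatchingWindowInSeconds: 5
      MaximumRetryAttempts: 3
      Enabled: true
      DestinationConfig:
        OnFailure:
          Destination: !GetAtt FailureQueue.Arn      # assumes an SQS queue resource
```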
Alternatively, you can use CDK with Python to achieve the same:
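A sketch using the CDK v2 Python bindings; the construct names and runtime are illustrative, and the snippet is assumed to live inside a Stack’s `__init__`:

```python
from aws_cdk import Duration
from aws_cdk import aws_dynamodb as dynamodb
from aws_cdk import aws_lambda as lambda_
from aws_cdk import aws_lambda_event_sources as eventsources
from aws_cdk import aws_sqs as sqs

# Table with a stream capturing both old and new images
table = dynamodb.Table(
    self, "UsersTable",
    partition_key=dynamodb.Attribute(name="Id", type=dynamodb.AttributeType.STRING),
    stream=dynamodb.StreamViewType.NEW_AND_OLD_IMAGES,
)

failure_queue = sqs.Queue(self, "FailureQueue")

fn = lambda_.Function(
    self, "StreamHandler",
    runtime=lambda_.Runtime.PYTHON_3_11,
    handler="index.handler",
    code=lambda_.Code.from_asset("lambda"),
)

# The event source mapping, mirroring the settings described above
fn.add_event_source(eventsources.DynamoEventSource(
    table,
    starting_position=lambda_.StartingPosition.LATEST,
    batch_size=100,
    max_batching_window=Duration.seconds(5),
    retry_attempts=3,
    on_failure=eventsources.SqsDlq(failure_queue),
))
```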
Below are examples of what our Lambda Function’s code might look like for different scenarios:
DynamoDB Trigger Lambda Examples
DynamoDB Trigger Lambda - On Insert
Below is an example of an incoming event that relates to an Item INSERT:
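A trimmed sample of the payload shape; IDs, ARNs, and attribute values here are illustrative:

```json
{
  "Records": [
    {
      "eventID": "1",
      "eventName": "INSERT",
      "eventSource": "aws:dynamodb",
      "awsRegion": "us-east-1",
      "dynamodb": {
        "ApproximateCreationDateTime": 1670745600,
        "Keys": {
          "Id": { "S": "101" }
        },
        "NewImage": {
          "Id": { "S": "101" },
          "FirstName": { "S": "John" },
          "LastName": { "S": "Doe" }
        },
        "SequenceNumber": "111",
        "SizeBytes": 26,
        "StreamViewType": "NEW_AND_OLD_IMAGES"
      },
      "eventSourceARN": "arn:aws:dynamodb:us-east-1:123456789012:table/Users/stream/2022-12-11T00:00:00.000"
    }
  ]
}
```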
Notice the event type is listed under the eventName field. For INSERT events, the snapshot of the inserted record is within the NewImage field.
A handling Lambda function in Python that is only concerned with INSERT events might be:
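A minimal sketch; the FirstName attribute is illustrative, and note that attribute values arrive in DynamoDB’s typed JSON format (e.g. `{"S": "John"}`):

```python
def handler(event, context):
    """Process only the INSERT records in a DynamoDB Stream batch."""
    inserted = []
    for record in event.get("Records", []):
        if record["eventName"] != "INSERT":
            continue  # skip MODIFY and REMOVE events
        new_image = record["dynamodb"]["NewImage"]
        # Values are in DynamoDB's typed JSON, e.g. {"S": "John"}
        print(f"New item inserted: {new_image['FirstName']['S']}")
        inserted.append(new_image)
    return {"processed": len(inserted)}
```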
DynamoDB Trigger Lambda - On Update
Below is an example of an incoming event that relates to an Item update or MODIFY event:
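A trimmed sample of the payload shape (values are illustrative):

```json
{
  "Records": [
    {
      "eventID": "2",
      "eventName": "MODIFY",
      "eventSource": "aws:dynamodb",
      "awsRegion": "us-east-1",
      "dynamodb": {
        "Keys": { "Id": { "S": "101" } },
        "NewImage": {
          "Id": { "S": "101" },
          "FirstName": { "S": "James" },
          "LastName": { "S": "Doe" }
        },
        "OldImage": {
          "Id": { "S": "101" },
          "FirstName": { "S": "John" },
          "LastName": { "S": "Doe" }
        },
        "SequenceNumber": "222",
        "StreamViewType": "NEW_AND_OLD_IMAGES"
      },
      "eventSourceARN": "arn:aws:dynamodb:us-east-1:123456789012:table/Users/stream/2022-12-11T00:00:00.000"
    }
  ]
}
```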
Notice that MODIFY events contain both a NewImage and OldImage field along with the changed attributes. In this specific example, we can see that the record's FirstName was updated from John to James.
Below is an example of Lambda code in Python that prints the old and new values side by side:
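A minimal sketch that diffs the two images and prints each changed attribute; it also returns the changes so they can be acted on programmatically:

```python
def handler(event, context):
    """Print old and new attribute values side by side for MODIFY records."""
    changes = []
    for record in event.get("Records", []):
        if record["eventName"] != "MODIFY":
            continue
        old_image = record["dynamodb"]["OldImage"]
        new_image = record["dynamodb"]["NewImage"]
        # Compare every attribute present in either image
        for key in sorted(set(old_image) | set(new_image)):
            old_value = old_image.get(key)
            new_value = new_image.get(key)
            if old_value != new_value:
                print(f"{key}: {old_value} -> {new_value}")
                changes.append((key, old_value, new_value))
    return changes
```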
DynamoDB Trigger Lambda - On Delete
Below is an example of an incoming event that relates to an Item deletion or REMOVE event:
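A trimmed sample of the payload shape (values are illustrative):

```json
{
  "Records": [
    {
      "eventID": "3",
      "eventName": "REMOVE",
      "eventSource": "aws:dynamodb",
      "awsRegion": "us-east-1",
      "dynamodb": {
        "Keys": { "Id": { "S": "101" } },
        "OldImage": {
          "Id": { "S": "101" },
          "FirstName": { "S": "James" },
          "LastName": { "S": "Doe" }
        },
        "SequenceNumber": "333",
        "StreamViewType": "NEW_AND_OLD_IMAGES"
      },
      "eventSourceARN": "arn:aws:dynamodb:us-east-1:123456789012:table/Users/stream/2022-12-11T00:00:00.000"
    }
  ]
}
```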
Notice that in the case of deletion or REMOVE events, only the OldImage field is present. We can pull out the records using the following code:
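A minimal sketch:

```python
def handler(event, context):
    """Collect the final snapshot of each deleted item from REMOVE records."""
    deleted = []
    for record in event.get("Records", []):
        if record["eventName"] != "REMOVE":
            continue
        # Only OldImage exists for deletions; there is no NewImage
        old_image = record["dynamodb"]["OldImage"]
        print(f"Item deleted: {old_image}")
        deleted.append(old_image)
    return deleted
```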
DynamoDB Trigger Lambda - On TTL
TTL or Time To Live is a feature of your DynamoDB Table that automatically deletes records after a specified period of time. In the case of your DynamoDB Trigger, the event you receive will be a REMOVE type, just like the example we saw previously.
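One useful detail: TTL deletions carry a userIdentity field identifying the DynamoDB service as the deleter, which lets you tell TTL expiries apart from user-initiated deletes. A small helper sketching the check:

```python
def is_ttl_delete(record):
    """Return True when a REMOVE record was caused by a TTL expiry
    rather than an explicit user delete."""
    user_identity = record.get("userIdentity", {})
    return (
        record["eventName"] == "REMOVE"
        and user_identity.get("type") == "Service"
        and user_identity.get("principalId") == "dynamodb.amazonaws.com"
    )
```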
FAQ
Does DynamoDB have triggers?
Yes, DynamoDB supports triggers which leverage AWS Lambda functions to execute code in response to item-level updates on your DynamoDB Table.
Can DynamoDB trigger Lambda?
DynamoDB Streams records change events in response to item updates; each event lives on the stream for up to 24 hours. Associating a Lambda Function Trigger with your stream allows Lambda to automatically detect new records on the stream and invoke your function with the changed records in the event body.
Can DynamoDB trigger Step Functions?
DynamoDB can only trigger a Lambda function in response to item changes. To trigger a Step Function in response to a record change event, you will need to invoke the Step Function from within your Lambda code using the AWS SDK.
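A sketch of that pattern; the helper takes the Step Functions client and state machine ARN as parameters (both illustrative), and inside a real handler you would pass `boto3.client("stepfunctions")`:

```python
import json

def start_executions(event, sfn_client, state_machine_arn):
    """Start one Step Functions execution per changed stream record."""
    started = 0
    for record in event.get("Records", []):
        # start_execution expects the execution input as a JSON string
        sfn_client.start_execution(
            stateMachineArn=state_machine_arn,
            input=json.dumps(record, default=str),
        )
        started += 1
    return started
```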
What’s the difference between DynamoDB trigger and DynamoDB stream?
DynamoDB streams capture a time-ordered sequence of record changes that occur on your DynamoDB table over the past 24 hours. DynamoDB Triggers allow Lambda to use your stream to detect change events and invoke your Lambda function in response.
Can I set multiple triggers in DynamoDB?
There is no hard limit on the number of DynamoDB Triggers you can have on a Table. However, AWS recommends that no more than two consumers read from the same stream shard at the same time to avoid throttling.
Best Practices for Using DynamoDB Triggers
When using DynamoDB Triggers, it's important to follow best practices to ensure optimal performance and reliability:
- Always handle errors gracefully within your Lambda function to avoid data loss or processing delays. Implementing retry logic and using dead-letter queues can help manage failed events.
- Monitor the performance and execution metrics of your Lambda functions to identify and resolve bottlenecks.
- Consider the security implications of your triggers and ensure that your IAM roles and policies follow the principle of least privilege.
- Test your triggers thoroughly in a development environment before deploying them to production to ensure they behave as expected under various scenarios.