
10 DynamoDB Limits You Need To Know

Written by Rafal Wilinski

Published on July 18th, 2021

    Amazon constantly proves that, in terms of scalability, DynamoDB has practically no limits. It can easily handle hundreds of millions of requests per second with single-digit millisecond latency. However, that power is not free. DynamoDB achieves its incredible ability to scale thanks to strict limitations - these limits, both on usage and on query patterns, can be perceived as guidelines on how to use a NoSQL database the proper way.

    What are these limits? Let's start with the most important one:

    Item size limit

    DynamoDB's limit on the size of each record is 400KB. You might think it's very inconvenient, but it's for your own good - this limit makes it less likely that you will make a mistake when designing your database.

    If you have a lot of data, you should consider denormalizing it, breaking it into multiple items, or storing it in a different place. For example, if you want to store an image, upload it to Amazon S3 and, instead of putting the image itself into DynamoDB, save just a link to it.
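    As a rough sketch of that pattern - the bucket name, table name, and key schema below are made-up assumptions, not anything DynamoDB prescribes - you could upload the binary to S3 and keep only a pointer in the item:

        import boto3

        s3 = boto3.client("s3")
        table = boto3.resource("dynamodb").Table("users")  # hypothetical table

        def save_avatar(user_id, image_path):
            # Store the (potentially multi-MB) image in S3, not in DynamoDB
            key = f"avatars/{user_id}.png"
            s3.upload_file(image_path, "user-avatars", key)  # hypothetical bucket

            # The DynamoDB item stays far below 400KB - just a pointer to S3
            table.put_item(Item={
                "pk": f"USER#{user_id}",
                "avatarUrl": f"s3://user-avatars/{key}",
            })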

    If you have a lot of related records, you might store them all in a single DynamoDB item as one big nested JSON document. However, as the amount of information contained in that blob grows, your application's performance will degrade too. You'll likely end up fetching a lot of unnecessary data and eventually even break the 400 KB limit. This limitation forces you to consider access patterns upfront and think about denormalization.

    Indexes

    DynamoDB indexes allow you to create additional access patterns. GSIs (Global Secondary Indexes), i.e. indexes that use a different attribute as the partition key, are limited to 20 per table. However, that limit can be raised by contacting AWS Support. LSIs (Local Secondary Indexes), on the other hand, are hard-capped at 5 per table. The cap itself isn't a big deal, but using LSIs brings yet another, often overlooked limitation - it imposes a 10 GB size limit per partition key value. For that reason, you should probably always favor GSIs over LSIs.
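    For illustration, here's a minimal sketch of adding a GSI to an existing table with boto3. The table, index, and attribute names are assumptions, and the table is assumed to use on-demand capacity (a provisioned table would also need ProvisionedThroughput in the Create block):

        import boto3

        client = boto3.client("dynamodb")

        # Add a GSI keyed on a different attribute ("email") to an existing table.
        # Up to 20 such indexes are allowed per table by default.
        client.update_table(
            TableName="users",
            AttributeDefinitions=[{"AttributeName": "email", "AttributeType": "S"}],
            GlobalSecondaryIndexUpdates=[{
                "Create": {
                    "IndexName": "email-index",
                    "KeySchema": [{"AttributeName": "email", "KeyType": "HASH"}],
                    "Projection": {"ProjectionType": "ALL"},
                }
            }],
        )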

    Scan and Query operations

    DynamoDB Scan and Query are the two main operations for fetching a collection of items. These operations don't just share similar syntax - both of them can also return at most 1 MB of data per request. If the data you're looking for is not present in the first response, you'll have to paginate through the results - call the operation again, but with ExclusiveStartKey set to the LastEvaluatedKey returned by the previous call.
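    A minimal pagination loop with boto3 could look like the sketch below (the table and key names are hypothetical):

        import boto3
        from boto3.dynamodb.conditions import Key

        table = boto3.resource("dynamodb").Table("orders")  # hypothetical table

        def query_all(customer_id):
            items = []
            kwargs = {"KeyConditionExpression": Key("pk").eq(f"CUSTOMER#{customer_id}")}
            while True:
                response = table.query(**kwargs)
                items.extend(response["Items"])
                # Each response is capped at 1 MB; keep paginating until
                # DynamoDB stops returning a LastEvaluatedKey.
                if "LastEvaluatedKey" not in response:
                    return items
                kwargs["ExclusiveStartKey"] = response["LastEvaluatedKey"]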

    Transactions and Batch Operations

    Transactional and Batch APIs allow you to read or write multiple DynamoDB items across multiple tables at once.

    For transactions:

    • TransactWriteItems is limited to 25 items per request
    • TransactGetItems is limited to 25 items per request

    For batch operations:

    • BatchWriteItem is limited to 25 items per request
    • BatchGetItem is limited to 100 items per request
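    In practice, these caps mean client code has to split larger workloads into chunks and retry any unprocessed items. A minimal sketch using boto3 (the table name is hypothetical); its batch_writer helper groups puts into batches of up to 25 items and resends unprocessed items automatically:

        import boto3

        table = boto3.resource("dynamodb").Table("events")  # hypothetical table

        def write_all(items):
            # batch_writer() buffers puts, flushes them as BatchWriteItem calls
            # of at most 25 items, and retries anything DynamoDB reports back
            # as unprocessed.
            with table.batch_writer() as batch:
                for item in items:
                    batch.put_item(Item=item)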

    Partition Throughput Limits

    DynamoDB tables are internally divided into partitions. Each partition has its own throughput limit: 3,000 RCUs (Read Capacity Units) and 1,000 WCUs (Write Capacity Units). Because of that limitation, it is extremely important to design your application to distribute reads and writes evenly across all partitions - in other words, across all logical partition keys.

    Unfortunately, it is not always possible to distribute that load evenly. In such cases, "hot" partitions (the ones that receive most of the requests) will use adaptive capacity for a limited period of time to continue operating without disruption or throttling. This mechanism works automatically and is completely transparent to the application.
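    When a single logical key is simply too hot, a common mitigation is write sharding - spreading one logical key over several physical partition key values. The sketch below is only an illustration under assumed names and shard count, not a built-in DynamoDB feature:

        import random
        import boto3

        table = boto3.resource("dynamodb").Table("metrics")  # hypothetical table
        SHARD_COUNT = 10  # arbitrary; size it to the expected write rate

        def put_metric(device_id, item):
            # Spread writes for one device over SHARD_COUNT partition key values
            # so no single partition absorbs the full write load.
            shard = random.randint(0, SHARD_COUNT - 1)
            table.put_item(Item={"pk": f"DEVICE#{device_id}#{shard}", **item})

        # Reads then have to query all shards and merge the results.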

    Others

    • Throughput Default Quotas per table - 40,000 read capacity units and 40,000 write capacity units
    • Partition Key Length - from 1 byte to 2048 bytes
    • Sort Key Length - from 1 byte to 1024 bytes
    • Table Name Length - from 3 to 255 characters
    • Item's Attribute Names - from 1 character to 64KB
    • Item's Attribute Depth - up to 32 levels deep
    • ConditionExpression, ProjectionExpression, UpdateExpression & FilterExpression length - up to 4KB
    • DescribeLimits API operation should be called no more than once a minute.

    There's also a long list of reserved words that can't be used directly as attribute names in expressions - they have to be aliased via ExpressionAttributeNames.
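    For example, "status" is on the reserved-word list, so an update expression touching it needs an alias. A minimal sketch, assuming a hypothetical "orders" table keyed only by "pk":

        import boto3

        table = boto3.resource("dynamodb").Table("orders")  # hypothetical table

        # "status" is a DynamoDB reserved word, so it can't appear directly in
        # the UpdateExpression - alias it with a #placeholder instead.
        table.update_item(
            Key={"pk": "ORDER#123"},
            UpdateExpression="SET #st = :val",
            ExpressionAttributeNames={"#st": "status"},
            ExpressionAttributeValues={":val": "SHIPPED"},
        )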

    Frequently Asked Questions

    Which DynamoDB limits can be raised by contacting AWS support?

    Users can increase the following limits by contacting AWS support:

    1. The number of provisioned throughput units per account
    2. The number of tables per account

    What is the DynamoDB document/item size limit?

    DynamoDB supports up to 400KB per item within its database. The items stored within a DynamoDB database cannot exceed this limit. However, this size is typically enough for most regular database operations and use cases.

    What is the DynamoDB object size limit?

    DynamoDB allows a maximum size of 400KB per item. The items stored within a DynamoDB database cannot exceed this limit. However, this size is typically enough for most regular database operations and use cases.

    What is the DynamoDB column limit?

    DynamoDB does not have columns in the traditional relational sense, so there is no column limit. Instead, data is stored as items, each of which can have one or more attributes.

    What is the DynamoDB read capacity limit?

    DynamoDB's default read capacity quotas are high enough for most typical use cases. In provisioned mode, the limit is 40,000 read capacity units per table and 80,000 read capacity units per account. In on-demand mode, the limit is 40,000 read capacity units per table, with no per-account limit.

    Are there any scaling limits in DynamoDB?

    DynamoDB does not have hard limits on scaling and can scale up or down depending on the capacity needed. In addition, auto scaling and the two capacity modes, provisioned and on-demand, let users pick the option that best matches the expected workload.
