6 Common DynamoDB Issues
Taavi Rehemägi
Posted on April 7, 2022
DynamoDB, the primary NoSQL database service offered by AWS, is a versatile tool. It's fast, scales without much effort, and best of all, it's billed on-demand! These things make DynamoDB the first choice when a datastore is needed for a new project.
But as with all technology, it's not all roses. You can feel a little lost if you're coming from years of working with relational databases. Your SQL and normalization know-how won't get you very far here.
Developers tend to run into the same handful of issues when starting their NoSQL journey with DynamoDB. This article should clear a few of them up.
1. Query Condition Missed Key Schema Element
This error means you're trying to query on a field that isn't part of the table's key schema. Add a secondary index that includes the field to fix it.
DynamoDB is a document store, which means each item in a table can have an arbitrary number of attributes, in contrast to relational databases, where every row shares the same fixed columns. This often leads developers to think of DynamoDB as a very flexible database, which is not 100% true.
To efficiently query a table, you have to create the proper indexes right from the beginning or add global secondary indexes later. Otherwise, you will need to use the scan command to get your data, which will check every item in your table. The query command uses the indexes you created and, in turn, is much quicker than using scan.
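As a minimal sketch with the AWS SDK for JavaScript v3 and the Document Client, a query against a hypothetical orders table that has a global secondary index on customerId (the table, index, and attribute names here are made up for illustration) might look like this:

```typescript
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, QueryCommand } from "@aws-sdk/lib-dynamodb";

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));

// Query the hypothetical GSI "customerId-index" instead of scanning the whole table.
const result = await ddb.send(
  new QueryCommand({
    TableName: "orders",                      // assumed table name
    IndexName: "customerId-index",            // the GSI created for this access pattern
    KeyConditionExpression: "customerId = :c",
    ExpressionAttributeValues: { ":c": "customer-123" },
  })
);

console.log(result.Items);
```

If you ran the same KeyConditionExpression without an index containing customerId, you would get exactly the "Query condition missed key schema element" error.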
2. Resource Not Found Exception
The AWS SDK can't find your table. The most common reasons for this are a typo in the table name, using the wrong account credentials, or operating in the wrong region.
If you're using CloudFormation or Terraform, it can also be that these tools are still deploying your table and you're trying to access it before they have finished.
Companies using AWS across many teams tend to have multiple AWS accounts, and if they operate globally, chances are they deploy tables close to their customers. With many different combinations of accounts and regions, things can get confusing.
That's why you should always double-check your AWS SDK configurations and credentials.
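One way to rule out region mix-ups is to set the region explicitly when creating the client instead of relying on whatever the environment happens to provide. A small sketch, with the region and table name as placeholders:

```typescript
import { DynamoDBClient, DescribeTableCommand } from "@aws-sdk/client-dynamodb";

// Pin the region explicitly so you know exactly which endpoint you're talking to.
const client = new DynamoDBClient({ region: "eu-west-1" });

// DescribeTable is a cheap way to verify the table exists in this account and region.
const { Table } = await client.send(
  new DescribeTableCommand({ TableName: "orders" })
);
console.log(Table?.TableStatus); // "ACTIVE" once CloudFormation/Terraform has finished
```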
3. Cannot do Operations on a Non-Existent Table
You have mismatched credentials between DynamoDB Local and the AWS SDK. Use the same account and region in both, or start DynamoDB Local with the -sharedDb flag so that all credentials and regions share a single local database.
DynamoDB Local is a tool that lets you run DynamoDB on your local computer for development and testing. It has to be configured just like the real DynamoDB so that your local setup behaves like your production deployment.
While this makes DynamoDB Local susceptible to the same errors we saw above in point two, it also means you catch them at development time rather than in production.
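A typical setup, assuming you started DynamoDB Local on its default port 8000, is to point the client at the local endpoint and keep region and credentials consistent (the dummy values below are placeholders):

```typescript
import { DynamoDBClient, ListTablesCommand } from "@aws-sdk/client-dynamodb";

// Start DynamoDB Local first, e.g.:
//   java -Djava.library.path=./DynamoDBLocal_lib -jar DynamoDBLocal.jar -sharedDb
const local = new DynamoDBClient({
  endpoint: "http://localhost:8000", // DynamoDB Local's default port
  region: "local",                   // with -sharedDb the exact value doesn't matter,
  credentials: {                     // but it still has to be set to something
    accessKeyId: "dummy",
    secretAccessKey: "dummy",
  },
});

console.log(await local.send(new ListTablesCommand({})));
```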
4. The Provided Key Element Does Not Match the Schema
The hash and sort keys you pass to GetItem don't match the key schema the table was created with. Use the same key configuration at definition and access time to fix this.
Again, this is about the difference between the perceived and actual flexibility of DynamoDB. You need to correctly set up your key schema for a table when creating it and later keep on using the same keys when loading data from the table.
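As a sketch, assume a hypothetical posts table with a composite key of pid (partition key) and createdAt (sort key); every GetItem call then has to supply exactly those two attributes:

```typescript
import { DynamoDBClient, CreateTableCommand } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, GetCommand } from "@aws-sdk/lib-dynamodb";

const client = new DynamoDBClient({});
const ddb = DynamoDBDocumentClient.from(client);

// Hypothetical table with a composite key: partition key "pid", sort key "createdAt".
await client.send(
  new CreateTableCommand({
    TableName: "posts",
    AttributeDefinitions: [
      { AttributeName: "pid", AttributeType: "S" },
      { AttributeName: "createdAt", AttributeType: "S" },
    ],
    KeySchema: [
      { AttributeName: "pid", KeyType: "HASH" },
      { AttributeName: "createdAt", KeyType: "RANGE" },
    ],
    BillingMode: "PAY_PER_REQUEST",
  })
);

// GetItem must use the same key attributes: both pid and createdAt, nothing else.
const { Item } = await ddb.send(
  new GetCommand({
    TableName: "posts",
    Key: { pid: "post-1", createdAt: "2022-04-07T00:00:00Z" },
  })
);
```

Passing only pid, or adding an attribute that isn't part of the key schema, triggers the "provided key element does not match the schema" error.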
Check out our knowledge base if you want to learn more about setting up DynamoDB key schemas correctly.
5. User is Not Authorized to Perform: DynamoDB PutItem on Resource
This is an IAM policy issue: the user doesn't have permission to call PutItem. To fix it, either attach the managed policy AmazonDynamoDBFullAccess or, if you don't want to give the user access to all commands, explicitly add dynamodb:PutItem to their IAM role.
AWS tries to keep you safe from security problems; that's why services deployed on AWS are locked down by default. The downside is that granting access to services in a production environment can become a boilerplate-heavy task.
To get around this, you can try the AWS CDK. It's an infrastructure as code tool built around CloudFormation that comes with constructs for all AWS services. These constructs help with all that permission management that is easy to get wrong when using AWS.
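With the CDK, for instance, you rarely write IAM statements by hand: a grant call on the table construct adds dynamodb:PutItem (and the other write actions) to a function's role for you. The stack and construct names below are made up for illustration:

```typescript
import * as cdk from "aws-cdk-lib";
import { Construct } from "constructs";
import * as dynamodb from "aws-cdk-lib/aws-dynamodb";
import * as lambda from "aws-cdk-lib/aws-lambda";

export class PostsStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    const table = new dynamodb.Table(this, "PostsTable", {
      partitionKey: { name: "pid", type: dynamodb.AttributeType.STRING },
      billingMode: dynamodb.BillingMode.PAY_PER_REQUEST,
    });

    const writer = new lambda.Function(this, "WriterFn", {
      runtime: lambda.Runtime.NODEJS_16_X,
      handler: "index.handler",
      code: lambda.Code.fromAsset("lambda"),
    });

    // Grants dynamodb:PutItem, UpdateItem, DeleteItem, BatchWriteItem, ... on this table only.
    table.grantWriteData(writer);
  }
}
```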
6. Expected params.Item to be a Structure
You're using the low-level DynamoDB client of the AWS SDK, which expects you to specify runtime types for your fields. Instead of using pid: '', you have to use an object that includes the type: pid: {S: ''}.
The AWS SDK for JavaScript includes two clients: the low-level client, which requires you to set all attribute types manually, and a higher-level client, the DynamoDB Document Client, which infers the types from the values you pass as arguments.
I would recommend using the DynamoDB Document Client if you don't have good reasons to use the low-level one instead.
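Side by side, the difference looks roughly like this in the AWS SDK for JavaScript v3 (table and attribute names are placeholders):

```typescript
import { DynamoDBClient, PutItemCommand } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, PutCommand } from "@aws-sdk/lib-dynamodb";

const client = new DynamoDBClient({});

// Low-level client: every attribute needs an explicit type descriptor.
await client.send(
  new PutItemCommand({
    TableName: "posts",
    Item: { pid: { S: "post-1" }, views: { N: "42" } }, // numbers are passed as strings
  })
);

// Document Client: plain JavaScript values, the types are inferred for you.
const ddb = DynamoDBDocumentClient.from(client);
await ddb.send(
  new PutCommand({
    TableName: "posts",
    Item: { pid: "post-1", views: 42 },
  })
);
```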
Conclusion
Using DynamoDB for your projects can save you from many of the issues that come with relational databases, most importantly the costs associated with running them. It scales well in terms of developer experience, from single developers to big teams, and in terms of performance, from a single deployment to replicas worldwide.
But using DynamoDB, and NoSQL databases in general, requires you to relearn much of what you know from relational databases. If you jump in just following the hype, without understanding what you're getting into, you might find yourself at a dead end later when DynamoDB doesn't deliver on the promises you read about.
Let Dashbird Help
If you're new to serverless or DynamoDB, Dashbird can help you get up to speed. After you add Dashbird to your AWS account, it automatically monitors all deployed DynamoDB tables, so no problem that might occur goes unnoticed.
And best of all, Dashbird comes with AWS Well-Architected best practices out of the box. This way, you can make sure you're using DynamoDB in a safe and performant way without learning everything about it up-front.
You can try Dashbird now; no credit card is required. The first one million Lambda invocations are on us! See our product tour here.