Question

I'm in the process of building out an architecture that will heavily leverage DynamoDB. My organization is considering storing sensitive data (config info) in DynamoDB, as well as non-sensitive data (e.g. telemetry data).

Historically, our SQL Servers have been protected by numerous layers of network and physical security. While we love the idea of DynamoDB as a seemingly infinitely scalable data store, we're unsure what the best practices are for internet-accessible databases like DynamoDB.

Are there any best practices for limiting access to data in DynamoDB besides simply authentication and authorization? Are there any network-level protections one can put in place to limit the attack surface?


Solution

Are there any best practices for limiting access to data in DynamoDB besides simply authentication and authorization?

For DynamoDB, the entire security model is the authentication and authorization (IAM) supplied by AWS; there is nothing "besides" those two. However, you should never access DynamoDB directly from the Internet; instead, go through a service of some form (e.g. API Gateway backed by a Lambda function, or custom code running on an EC2 instance). AWS then manages the authorization for you (e.g. via IAM roles for EC2), and nothing outside of AWS ever needs to know what's going on. You should definitely never embed IAM access keys in your applications.
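To make the role-based approach concrete, here is a minimal boto3 sketch. It assumes a hypothetical table named "config" with a partition key "config_key", and that the code runs on an EC2 instance or Lambda function whose IAM role grants dynamodb:GetItem on that table; those names are illustrative only. Note that no credentials appear anywhere in the code.

```python
import boto3

# Minimal sketch (assumptions: a table named "config" with partition key
# "config_key" exists, and this runs on an EC2 instance or Lambda function
# whose IAM role allows dynamodb:GetItem on that table).
#
# boto3 obtains temporary credentials from the attached IAM role
# automatically -- no access keys are embedded in the application.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("config")

# Fetch a single config item by its partition key.
response = table.get_item(Key={"config_key": "feature-flags"})
item = response.get("Item")  # None if the key doesn't exist
print(item)
```

Scoping the role's policy down to specific actions and table ARNs is what takes the place of the network perimeter you'd put around a SQL Server.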

As an aside:

storing sensitive data (config info) in DynamoDB

I wouldn't personally go down that route, not because of the nature of the data, but because of the lack of atomicity and isolation in DynamoDB. You don't need the blinding speed of DynamoDB for config data, so just keep it in good old SQL.

Licensed under: CC-BY-SA with attribution