Learning AWS Day by Day — Day 48 — Amazon DynamoDB — Part 4

rksalo88

Saloni Singh

Posted on April 27, 2024


Exploring AWS!!

Day 48:

Amazon DynamoDB — Part 4


Provisioned capacity mode: you specify read and write capacity up front, and auto scaling adjusts it to maintain performance as traffic changes.
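Here is a rough boto3 sketch of wiring auto scaling to a provisioned table; the table name (Bookmarks) and the capacity bounds are just illustrative, not anything specific from AWS.

```python
import boto3

# Application Auto Scaling is the service that manages DynamoDB provisioned throughput.
autoscaling = boto3.client("application-autoscaling")

# Register the table's read capacity as a scalable target (table name assumed).
autoscaling.register_scalable_target(
    ServiceNamespace="dynamodb",
    ResourceId="table/Bookmarks",
    ScalableDimension="dynamodb:table:ReadCapacityUnits",
    MinCapacity=5,
    MaxCapacity=500,
)

# Target-tracking policy: keep read capacity utilization around 70%.
autoscaling.put_scaling_policy(
    PolicyName="BookmarksReadScaling",
    ServiceNamespace="dynamodb",
    ResourceId="table/Bookmarks",
    ScalableDimension="dynamodb:table:ReadCapacityUnits",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 70.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "DynamoDBReadCapacityUtilization"
        },
    },
)
```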

Disney+ chose DynamoDB to help with:
Utilizing multi-region replication with single-digit millisecond latency to shift traffic between regions without data issues.
Adding another region to global tables when launching in new countries.
Scaling recommendations and bookmarks with no operational overhead.
Having the ability to switch between on-demand and provisioned capacity modes.
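Switching capacity modes is a single UpdateTable call. A minimal boto3 sketch, again assuming a table named Bookmarks:

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Switch an existing table (name assumed) to on-demand capacity.
dynamodb.update_table(
    TableName="Bookmarks",
    BillingMode="PAY_PER_REQUEST",
)

# Later, switch back to provisioned capacity with explicit throughput.
dynamodb.update_table(
    TableName="Bookmarks",
    BillingMode="PROVISIONED",
    ProvisionedThroughput={
        "ReadCapacityUnits": 100,
        "WriteCapacityUnits": 50,
    },
)
```

As I understand it, a table can only change billing mode about once every 24 hours, so this is an operational lever rather than something to flip on every traffic spike.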

Enterprise ready: ACID guarantees simplify application code, let you run transactions for large-scale workloads, and accelerate legacy migrations.
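Transactions are exposed through the TransactWriteItems and TransactGetItems APIs. A small sketch of an atomic debit/credit across two items, using a made-up Accounts table:

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Atomically debit one account and credit another; if either condition
# fails, neither write is applied.
dynamodb.transact_write_items(
    TransactItems=[
        {
            "Update": {
                "TableName": "Accounts",
                "Key": {"AccountId": {"S": "A-1"}},
                "UpdateExpression": "SET Balance = Balance - :amt",
                "ConditionExpression": "Balance >= :amt",
                "ExpressionAttributeValues": {":amt": {"N": "100"}},
            }
        },
        {
            "Update": {
                "TableName": "Accounts",
                "Key": {"AccountId": {"S": "A-2"}},
                "UpdateExpression": "SET Balance = Balance + :amt",
                "ExpressionAttributeValues": {":amt": {"N": "100"}},
            }
        },
    ]
)
```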

Security:
Access controls and encryption at rest:
All tables are encrypted in transit and at rest by default (a sketch of opting into a customer managed KMS key follows this list).
Fully integrated with IAM.
Access DynamoDB from a VPC via a secure VPC endpoint.
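Encryption at rest uses an AWS owned key unless you choose otherwise. This boto3 sketch creates a table that uses a customer managed KMS key instead; the table name and key alias are assumptions:

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Encryption at rest is always on; here we opt into a customer managed KMS key.
dynamodb.create_table(
    TableName="Bookmarks",
    AttributeDefinitions=[{"AttributeName": "UserId", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "UserId", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
    SSESpecification={
        "Enabled": True,
        "SSEType": "KMS",
        "KMSMasterKeyId": "alias/my-dynamodb-key",  # assumed key alias
    },
)
```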

Audit logging with CloudTrail:
Capture control plane and data plane operation logs for compliance, risk, and auditing.
Record table-level and item-level activity, trigger actions when important events are detected, and analyze events with Amazon Athena and CloudWatch Logs Insights.
Use cases: compliance, visibility into activity, detecting data exfiltration, automating security analysis, troubleshooting anomalies, and analyzing permissions.
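Management (control plane) events are logged by default; item-level (data plane) events have to be turned on per trail. A boto3 sketch of that, where the trail name and table ARN are made up:

```python
import boto3

cloudtrail = boto3.client("cloudtrail")

# Opt a trail into DynamoDB item-level (data plane) events.
cloudtrail.put_event_selectors(
    TrailName="my-trail",
    EventSelectors=[
        {
            "ReadWriteType": "All",
            "IncludeManagementEvents": True,
            "DataResources": [
                {
                    "Type": "AWS::DynamoDB::Table",
                    "Values": [
                        "arn:aws:dynamodb:us-east-1:123456789012:table/Bookmarks"
                    ],
                }
            ],
        }
    ],
)
```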

Backup and restore: on-demand backups for long-term data archiving and compliance.
Continuous backups for PITR (point-in-time recovery) covering the last 35 days.
Zero impact on performance.
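Both flavors are single API calls. A boto3 sketch with an assumed table name:

```python
import boto3

dynamodb = boto3.client("dynamodb")

# On-demand backup for long-term archiving/compliance (names assumed).
dynamodb.create_backup(
    TableName="Bookmarks",
    BackupName="Bookmarks-2024-04-27",
)

# Enable continuous backups so the table can be restored to any point
# in the last 35 days.
dynamodb.update_continuous_backups(
    TableName="Bookmarks",
    PointInTimeRecoverySpecification={"PointInTimeRecoveryEnabled": True},
)
```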

NoSQL Workbench: a client-side application, available for Windows, macOS, and Linux, with three parts:
Data modeler: helps you build scalable, high-performance data models.
Visualizer: helps you visualize data models and map queries against them.
Operation builder: a rich GUI that simplifies developing and testing queries and performing DynamoDB operations.

Export data to S3 for analysis and insights:
DynamoDB is built for transactional (OLTP) workloads, not analytics. For analytics, export the data out of DynamoDB into another system. You can run Athena directly against the exported S3 files, so a single export to S3 goes a long way.
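The export itself is one API call against a table with PITR enabled, and it reads from the backup data rather than the live table. A boto3 sketch with an assumed table ARN and bucket:

```python
import boto3
from datetime import datetime, timedelta, timezone

dynamodb = boto3.client("dynamodb")

# Export the table's state as of one hour ago to S3 (requires PITR).
# ARN, bucket, and prefix are assumptions.
response = dynamodb.export_table_to_point_in_time(
    TableArn="arn:aws:dynamodb:us-east-1:123456789012:table/Bookmarks",
    ExportTime=datetime.now(timezone.utc) - timedelta(hours=1),
    S3Bucket="my-analytics-bucket",
    S3Prefix="dynamodb-exports/bookmarks/",
    ExportFormat="DYNAMODB_JSON",  # or "ION"
)
print(response["ExportDescription"]["ExportStatus"])
```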

Extract Actionable Insights:
Export DynamoDB tables across regions and accounts to help comply with regulatory requirements and to build a disaster recovery and backup plan.

No impact on performance:
Exports read from the PITR backup data rather than the live table, so they consume no table capacity and have zero impact on performance.

Integrate with backups:
Select a DynamoDB table with PITR enabled, specify any point in time within the last 35 days, choose a target S3 bucket, and get output in DynamoDB JSON or Amazon Ion format.
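Before picking an export point, it can help to check the restorable window. A short boto3 sketch (table name assumed):

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Inspect the PITR status and restorable window before choosing an export time.
desc = dynamodb.describe_continuous_backups(TableName="Bookmarks")
pitr = desc["ContinuousBackupsDescription"]["PointInTimeRecoveryDescription"]
print(pitr["PointInTimeRecoveryStatus"])
print(pitr.get("EarliestRestorableDateTime"))
print(pitr.get("LatestRestorableDateTime"))
```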

Capital One was the first US bank to go all in on the cloud:
Migrated from mainframes to DynamoDB.
Previously, every application depended on a mainframe sitting in the middle of the physical servers they ran.
Producer teams busy building new mobile products for customers were often blocked by the mainframe.
DynamoDB-backed microservices give application developers unbounded scale, nimbleness, and the ability to roll out new services.
