AWS Database Blog
How to configure a Linked Server between Amazon RDS for SQL Server and Teradata database
In this post, we demonstrate how to configure a linked server between Amazon RDS for SQL Server and a Teradata database instance. We guide you through the step-by-step process to establish this connection and show you how to verify its functionality.
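As a rough illustration of the pieces involved, the sketch below runs the standard T-SQL linked server procedures from Python with pyodbc. The endpoint, credentials, provider, and data source names are placeholders rather than the configuration from the post, which depends on the Teradata provider available to your RDS for SQL Server instance.

```python
# Hypothetical sketch: register a linked server to Teradata on an RDS for SQL Server
# instance by running T-SQL over pyodbc. Server names, credentials, and the provider
# and data source values are placeholders; see the post for the RDS-specific setup.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=my-rds-sqlserver.example.amazonaws.com,1433;"
    "DATABASE=master;UID=admin;PWD=example-password",
    autocommit=True,
)
cursor = conn.cursor()

# Register the linked server (the provider and data source shown are assumptions).
cursor.execute("""
EXEC master.dbo.sp_addlinkedserver
    @server     = N'TERADATA_LINK',
    @srvproduct = N'Teradata',
    @provider   = N'MSDASQL',
    @datasrc    = N'teradata-dsn';
""")

# Map local logins to Teradata credentials.
cursor.execute("""
EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname  = N'TERADATA_LINK',
    @useself     = 'FALSE',
    @locallogin  = NULL,
    @rmtuser     = N'td_user',
    @rmtpassword = N'td_password';
""")

# Verify the link with a pass-through query.
cursor.execute("SELECT * FROM OPENQUERY(TERADATA_LINK, 'SELECT CURRENT_DATE')")
print(cursor.fetchone())
```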
Achieve up to 1.7 times higher write throughput and 1.38 times better price performance with Amazon Aurora PostgreSQL on AWS Graviton4-based R8g instances
In this post, we demonstrate how upgrading to Graviton4-based R8g instances running Aurora PostgreSQL-Compatible 17.4 with the Aurora I/O-Optimized cluster configuration can deliver significant price-performance gains compared to Graviton2-based R6g instances: up to 1.7 times higher write throughput, 1.38 times better price-performance, and commit latency reduced by up to 46% on r8g.16xlarge instances and 38% on r8g.2xlarge instances.
How Amazon maintains accurate totals at scale with Amazon DynamoDB
Amazon’s Finance Technologies Tax team (FinTech Tax) manages mission-critical services for tax computation, deduction, remittance, and reporting across global jurisdictions. The application processes billions of transactions annually across multiple international marketplaces. In this post, we show how the team implemented tiered tax withholding using Amazon DynamoDB transactions and conditional writes.
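The following is a minimal sketch (not the FinTech Tax team's actual schema) of the core idea: a DynamoDB transaction writes the detail record and updates the running total atomically, and a condition expression rejects duplicate transaction IDs so the total stays accurate under retries.

```python
# Sketch with assumed table and attribute names: the detail record and the aggregate
# are written in one DynamoDB transaction, so the running total never drifts.
import boto3

dynamodb = boto3.client("dynamodb")

def record_withholding(account_id: str, txn_id: str, amount_cents: int) -> None:
    dynamodb.transact_write_items(
        TransactItems=[
            {
                # Insert the detail record; fail if this transaction was already applied.
                "Put": {
                    "TableName": "TaxWithholding",          # assumed table name
                    "Item": {
                        "pk": {"S": f"ACCOUNT#{account_id}"},
                        "sk": {"S": f"TXN#{txn_id}"},
                        "amount_cents": {"N": str(amount_cents)},
                    },
                    "ConditionExpression": "attribute_not_exists(sk)",
                },
            },
            {
                # Atomically add the amount to the account-level running total.
                "Update": {
                    "TableName": "TaxWithholding",
                    "Key": {
                        "pk": {"S": f"ACCOUNT#{account_id}"},
                        "sk": {"S": "TOTAL"},
                    },
                    "UpdateExpression": "ADD total_cents :amt",
                    "ExpressionAttributeValues": {":amt": {"N": str(amount_cents)}},
                },
            },
        ]
    )
```

If either write fails its condition, the whole transaction is canceled, which is what keeps the detail records and the aggregate consistent.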
Build an AI-powered text-to-SQL chatbot using Amazon Bedrock, Amazon MemoryDB, and Amazon RDS
Text-to-SQL can automatically transform analytical questions into executable SQL code for enhanced data accessibility and streamlined data exploration, from analyzing sales data and monitoring performance metrics to assessing customer feedback. In this post, we explore how to use Amazon Relational Database Service (Amazon RDS) for PostgreSQL and Amazon Bedrock to build a generative AI text-to-SQL chatbot application using Retrieval Augmented Generation (RAG). We also show how Amazon MemoryDB with vector search can provide semantic caching to further accelerate this solution.
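As a simplified sketch of the generation step, the snippet below asks a model in Amazon Bedrock to produce SQL for a natural-language question. The model ID and schema hint are assumptions, and the RAG retrieval and MemoryDB semantic cache from the full solution are reduced to a comment.

```python
# Simplified text-to-SQL sketch: the schema prompt, model choice, and caching layer
# are assumptions, not the exact configuration from the post.
import boto3

bedrock = boto3.client("bedrock-runtime")

SCHEMA_HINT = "Table sales(order_id int, region text, amount numeric, order_date date)"

def question_to_sql(question: str) -> str:
    # In the full solution, a MemoryDB vector index is checked first: if a semantically
    # similar question is already cached, its stored SQL is returned without a model call.
    response = bedrock.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",   # assumed model choice
        system=[{"text": "Return only a single PostgreSQL query, no explanation."}],
        messages=[{
            "role": "user",
            "content": [{"text": f"Schema:\n{SCHEMA_HINT}\n\nQuestion: {question}"}],
        }],
    )
    return response["output"]["message"]["content"][0]["text"]

print(question_to_sql("What were total sales per region last month?"))
```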
Amazon DynamoDB data modeling for Multi-Tenancy – Part 3
In this series of posts, we walk through the process of creating a DynamoDB data model using an example multi-tenant application, a customer issue tracking service. The goal of this series is to explore areas that are important for decision-making and provide insight into the factors that influence how you plan your data model for a multi-tenant application. In this last part of the series, we explore how to validate the chosen data model from both a performance and a security perspective. Additionally, we cover how to extend the data model as new access patterns and requirements arise.
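One common way to enforce and test tenant isolation, assuming the tenant-prefixed partition keys used earlier in the series, is to scope temporary credentials with a session policy on the dynamodb:LeadingKeys condition key, as in this sketch:

```python
# Sketch of tenant isolation via a scoped session policy. The table ARN, role ARN,
# and TENANT# key prefix are assumptions carried over from the series' example design.
import json
import boto3

def tenant_scoped_dynamodb(tenant_id: str, table_arn: str, role_arn: str):
    session_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["dynamodb:GetItem", "dynamodb:Query", "dynamodb:PutItem"],
            "Resource": table_arn,
            "Condition": {
                # Only items whose partition key starts with this tenant's prefix.
                "ForAllValues:StringLike": {
                    "dynamodb:LeadingKeys": [f"TENANT#{tenant_id}*"]
                }
            },
        }],
    }
    creds = boto3.client("sts").assume_role(
        RoleArn=role_arn,                       # assumed pre-existing role
        RoleSessionName=f"tenant-{tenant_id}",
        Policy=json.dumps(session_policy),
    )["Credentials"]
    return boto3.client(
        "dynamodb",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
```

A client built this way can be used in tests to confirm that queries against another tenant's keys are denied.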
Amazon DynamoDB data modeling for Multi-Tenancy – Part 2
In this series of posts, we walk through the process of creating a DynamoDB data model using an example multi-tenant application, a customer issue tracking service. The goal of this series is to explore areas that are important for decision-making and provide insight into the factors that influence how you plan your data model for a multi-tenant application. In this post, we continue the design process, selecting a partition key design and creating our data schema. We also show how to implement the access patterns using the AWS Command Line Interface (AWS CLI).
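The post implements the access patterns with the AWS CLI; the boto3 calls below are an equivalent sketch with assumed table and key names (a tenant-prefixed partition key and an issue sort key):

```python
# Two example access patterns for the issue tracking service, with assumed names.
import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource("dynamodb").Table("IssueTracker")   # assumed table name

# Access pattern: create a new issue for a tenant.
table.put_item(Item={
    "pk": "TENANT#acme",
    "sk": "ISSUE#1001",
    "title": "Login page returns 500",
    "status": "OPEN",
})

# Access pattern: list all issues for a tenant.
resp = table.query(
    KeyConditionExpression=Key("pk").eq("TENANT#acme") & Key("sk").begins_with("ISSUE#")
)
print(resp["Items"])
```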
Amazon DynamoDB data modeling for Multi-Tenancy – Part 1
In this series of posts, we walk through the process of creating a DynamoDB data model using an example multi-tenant application, a customer issue tracking service. The goal of this series is to explore areas that are important for decision-making and provide insight into the factors that influence how you plan your data model for a multi-tenant application. In this post, we define the access patterns and decide on the table design.
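To make the starting point concrete, here is a generic single-table sketch; the table and key names are placeholders, since choosing the actual partition key design for the issue tracking service is the subject of this series.

```python
# Generic single-table creation sketch with placeholder key names.
import boto3

dynamodb = boto3.client("dynamodb")
dynamodb.create_table(
    TableName="IssueTracker",                        # assumed table name
    AttributeDefinitions=[
        {"AttributeName": "pk", "AttributeType": "S"},
        {"AttributeName": "sk", "AttributeType": "S"},
    ],
    KeySchema=[
        {"AttributeName": "pk", "KeyType": "HASH"},  # e.g. tenant-scoped key
        {"AttributeName": "sk", "KeyType": "RANGE"}, # e.g. entity identifier
    ],
    BillingMode="PAY_PER_REQUEST",
)
```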
Create a unit testing framework for PostgreSQL using the pgTAP extension
pgTAP (PostgreSQL Test Anything Protocol) is a unit testing framework that empowers developers to write and run tests directly within the database. In this post, we explore how to use the pgTAP extension for unit testing on Amazon RDS for PostgreSQL and Amazon Aurora PostgreSQL-Compatible Edition databases, helping you build robust and reliable database applications.
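As a minimal illustration, the sketch below runs a few pgTAP assertions from Python with psycopg2, assuming the extension has been created in the target database (CREATE EXTENSION pgtap) and that an orders table exists for the example checks:

```python
# Run pgTAP assertions over psycopg2; each function returns a TAP result line
# such as "ok 1 - orders table should exist". Endpoint and credentials are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="mydb.example.rds.amazonaws.com",   # placeholder endpoint
    dbname="appdb", user="postgres", password="example-password",
)
cur = conn.cursor()

cur.execute("SELECT plan(2);")
print(cur.fetchone()[0])

cur.execute("SELECT has_table('orders', 'orders table should exist');")
print(cur.fetchone()[0])

cur.execute("SELECT col_not_null('orders', 'order_id', 'order_id must be NOT NULL');")
print(cur.fetchone()[0])

cur.execute("SELECT * FROM finish();")
for row in cur.fetchall():
    print(row[0])

conn.rollback()   # leave no trace of the test run
conn.close()
```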
Scaling Amazon RDS for MySQL performance for Careem’s digital platform on AWS
Careem powers rides, deliveries, and payments across the Middle East, North Africa, and South Asia. As Careem grew, so did its data infrastructure challenges. Its monolithic 270 TB Amazon RDS for MySQL database, consisting of one writer and five read replicas, experienced performance issues due to increased storage utilization, slow queries, high replica lag, and rising Amazon RDS costs. In this post, we provide a step-by-step breakdown of how Careem implemented a phased data purging strategy, improving database performance while addressing key technical challenges.
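The loop below illustrates the general batched-purge technique (it is not Careem's exact implementation): delete expired rows in small chunks so that lock time and replica lag stay bounded. The table, column, retention window, and endpoint are placeholders.

```python
# Generic phased purge sketch for MySQL: small DELETE batches with a pause in between.
import time
import pymysql

conn = pymysql.connect(
    host="mydb.example.rds.amazonaws.com",   # placeholder endpoint
    user="admin", password="example-password", database="appdb", autocommit=True,
)

BATCH_SIZE = 5000
PAUSE_SECONDS = 0.5   # throttle so replicas can keep up

with conn.cursor() as cur:
    while True:
        deleted = cur.execute(
            "DELETE FROM ride_events "
            "WHERE created_at < NOW() - INTERVAL 18 MONTH "
            "LIMIT %s",
            (BATCH_SIZE,),
        )
        if deleted == 0:
            break                      # nothing left in this phase
        time.sleep(PAUSE_SECONDS)      # back off between batches

conn.close()
```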
Amazon CloudWatch Database Insights applied in real scenarios
In this post, we show how you can use Amazon CloudWatch Database Insights to troubleshoot your Amazon RDS and Amazon Aurora resources. CloudWatch Database Insights is a database observability solution that offers a tailored experience for DevOps engineers, application developers, and database administrators. It is designed to accelerate database troubleshooting and help you address issues across entire database fleets, enhancing overall operational efficiency.