AWS Storage Blog

Tag: Amazon CloudWatch


How London Stock Exchange Group migrated 30 PB of market data using AWS DataSync

London Stock Exchange Group (LSEG) has 30 PB of Tick History-PCAP data: ultra-high-quality global market data derived from raw exchange data and timestamped to the nanosecond. An additional 60 TB is generated every day. LSEG sought to migrate this data away from Wasabi cloud storage and was looking for a new solution to […]

Monitoring AWS Storage Gateway health and performance using Amazon CloudWatch

When managing a hybrid-cloud infrastructure, monitoring system health is essential for maintaining business continuity. Setting up comprehensive monitoring provides visibility into performance and availability of infrastructure components. By establishing alert thresholds and promptly responding to alarms, administrators can identify degraded performance or outages early. Quickly diagnosing and fixing the issues maximizes uptime. AWS Storage Gateway, […]

A Pilot Light disaster recovery strategy for WordPress

In today’s digital ecosystem, maintaining an uninterrupted online presence and resilience is essential for businesses. WordPress platforms, whether e-commerce sites or news portals, must not only meet but exceed stringent Service Level Agreements (SLAs) to maintain user trust, ensure continuity, and protect revenue. These SLAs, defining Recovery Time Objectives (RTO) and Recovery Point Objectives (RPO), […]


Improve application resiliency and performance with Multi-AZ Amazon FSx for OpenZFS

Business-critical applications frequently require high availability to meet Service Level Agreements (SLAs). In addition, users must satisfy compliance requirements and disaster recovery plans to ensure workload availability. In many cases, workloads needing high availability also need highly performant, low-latency file storage. Customers considering the migration of business-critical applications to AWS want to […]


Configuring the auto-expansion of Amazon FSx for OpenZFS with Amazon CloudWatch and AWS Lambda

Today’s demanding workloads, such as databases, rendering farms, analytics, and ML, have increasingly stringent I/O requirements. These workloads need a reliable storage infrastructure that provides sufficient storage capacity, IOPS, and throughput. As customers move more workloads to the cloud, they want to benefit from the agility and performance capabilities of the cloud as their […]

Conducting chaos engineering experiments on Amazon EBS using AWS Fault Injection Simulator

As distributed systems grow more complex, anticipating disruptions becomes ever more challenging. Conventional techniques for verifying known scenarios through unit or integration testing leave gaps that don’t cover component failures, which can result in expensive outages. Chaos engineering is a disciplined approach to surfacing failures before they become outages. By proactively identifying how a system […]


Encrypt and decrypt files with PGP and AWS Transfer Family

1/11/2024: Updated to reflect the CloudShell migration to Amazon Linux 2023 (AL2023). Protecting sensitive data is not a novel idea. Customers in industries like financial services and healthcare regularly exchange files containing sensitive data, including Personally Identifiable Information (PII) and financial records, with their users. Pretty Good Privacy (PGP) encryption of these files is often […]

Performance analysis for different Amazon EFS throughput modes via Amazon CloudWatch

When I talk with customers about their file storage, I frequently get asked, “How can I determine the right throughput capacity for my file storage?” The simple answer is to monitor your workload’s performance and use that data to choose the right performance configuration. Throughput modes for Amazon Elastic File System (Amazon EFS) determine […]

Maximizing price performance for big data workloads using Amazon EBS

Since the emergence of big data over a decade ago, Hadoop, an open-source framework used to efficiently store and process large datasets, has been crucial in storing, analyzing, and reducing that data to provide value for enterprises. Hadoop lets you store structured, partially structured, or unstructured data of any kind across […]

Improve application resiliency with Amazon EBS volume metrics and AWS Fault Injection Simulator

When businesses build applications, it is crucial to monitor those applications and take measures when needed to avoid downtime and, potentially, revenue loss. Choosing the correct metrics to monitor and setting up alarms as needed is table stakes for customers to achieve their application resiliency and availability goals. Amazon Elastic Block Store (Amazon […]