Why Amazon EC2 G6e Instances?
Amazon EC2 G6e instances, powered by NVIDIA L40S Tensor Core GPUs, are the most cost-efficient GPU instances for deploying generative AI models and the highest-performance GPU instances for spatial computing workloads. They offer 2x higher GPU memory (48 GB) and 2.9x faster GPU memory bandwidth than G6 instances, and deliver up to 2.5x better performance than G5 instances.
Customers can use G6e instances to deploy large language models (LLMs) with up to 13B parameters and diffusion models for generating images, video, and audio. Additionally, G6e instances enable customers to create larger, more immersive 3D simulations and digital twins for spatial computing workloads using NVIDIA Omniverse.
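As a rough sanity check on the 13B-parameter figure, a model's weight footprint can be estimated as parameter count × bytes per parameter. This is a simplification that ignores KV cache and activation memory, and the helper below is illustrative, not an AWS or NVIDIA API:

```python
def weights_gib(params_billion: float, bytes_per_param: int) -> float:
    """Approximate GPU memory (GiB) needed just for model weights."""
    return params_billion * 1e9 * bytes_per_param / 2**30

# A 13B-parameter model in FP16 (2 bytes per parameter):
print(round(weights_gib(13, 2), 1))  # ~24.2 GiB, well under one L40S's 48 GB
```

In practice you should budget extra headroom for the inference runtime, KV cache, and batching, but the arithmetic shows why a 13B FP16 model comfortably fits on a single-GPU G6e size.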
G6e instances feature up to 8 NVIDIA L40S Tensor Core GPUs with 384 GB of total GPU memory (48 GB of memory per GPU) and third generation AMD EPYC processors. They also support up to 192 vCPUs, up to 400 Gbps of network bandwidth, up to 1.536 TB of system memory, and up to 7.6 TB of local NVMe SSD storage.
Product details
| Instance Size | GPUs | GPU Memory (GiB) | vCPUs | Memory (GiB) | Storage (GB) | Network Bandwidth (Gbps) | EBS Bandwidth (Gbps) |
|---|---|---|---|---|---|---|---|
| g6e.xlarge | 1 | 48 | 4 | 32 | 250 | Up to 20 | Up to 5 |
| g6e.2xlarge | 1 | 48 | 8 | 64 | 450 | Up to 20 | Up to 5 |
| g6e.4xlarge | 1 | 48 | 16 | 128 | 600 | 20 | 8 |
| g6e.8xlarge | 1 | 48 | 32 | 256 | 900 | 25 | 16 |
| g6e.16xlarge | 1 | 48 | 64 | 512 | 1900 | 35 | 20 |
| g6e.12xlarge | 4 | 192 | 48 | 384 | 3800 | 100 | 20 |
| g6e.24xlarge | 4 | 192 | 96 | 768 | 3800 | 200 | 30 |
| g6e.48xlarge | 8 | 384 | 192 | 1536 | 7600 | 400 | 60 |
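The per-size GPU memory in the table is simply the GPU count times the 48 GiB on each L40S. A small helper (illustrative only, not an AWS API) can pick the smallest size whose aggregate GPU memory covers a given model footprint; note that spreading a model across multiple GPUs still requires tensor or pipeline parallelism in your serving stack:

```python
# GPU count per instance size, taken from the table above (table order).
GPUS = {"g6e.xlarge": 1, "g6e.2xlarge": 1, "g6e.4xlarge": 1, "g6e.8xlarge": 1,
        "g6e.16xlarge": 1, "g6e.12xlarge": 4, "g6e.24xlarge": 4, "g6e.48xlarge": 8}
GIB_PER_GPU = 48  # NVIDIA L40S

def smallest_fit(required_gib: float) -> str:
    """Return the first size (in table order) whose total GPU memory covers required_gib."""
    for size, gpus in GPUS.items():
        if gpus * GIB_PER_GPU >= required_gib:
            return size
    raise ValueError("does not fit on a single G6e instance")

print(smallest_fit(40))   # g6e.xlarge (48 GiB total)
print(smallest_fit(150))  # g6e.12xlarge (192 GiB total)
```

Because single-GPU sizes differ only in vCPU, system memory, storage, and bandwidth, the GPU-memory check is just the first filter; the other table columns drive the final choice.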
Customer and Partner testimonials
Here is an example of how customers and partners are achieving their business goals with Amazon EC2 G6e instances.
Leonardo.AI
Leonardo.AI offers a production suite for content creation that leverages generative AI technologies.