General Purpose Instances
Balanced compute, memory, and networking for diverse workloads like web servers and code repositories.
Mac1
Processor: 3.2 GHz Intel Core i7 (Apple Mac mini hardware)
Configuration Range: 6 Physical Cores / 12 Logical Threads
32GiB RAM
Notes: Bare Metal macOS Instances
T4g (ARM)
Processor: AWS Graviton2 Processor
Configuration Range: 2-8 vCPUs
0.5GB-32GB RAM
Networking: Up to 25 Gbps
Notes: Free Tier Eligible (t4g.micro)
M6i (Intel)
Processor: 3rd Gen Intel Xeon Scalable
Configuration Range: 2-128 vCPUs
8GB-512GB RAM
Networking: Up to 50 Gbps
Notes: Always-On Memory Encryption
M5a (AMD)
Processor: AMD EPYC 7000 Series
Configuration Range: 2-96 vCPUs
8GB-384GB RAM
Networking: Up to 20 Gbps
Storage: 3.6TB NVMe SSD
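The general-purpose ranges above lend themselves to a quick fit check when sizing an instance. A minimal sketch in Python (family data is transcribed from this section; the helper name is illustrative, not an AWS API — note T4g tops out at t4g.2xlarge with 8 vCPUs / 32 GB):

```python
# Minimal sketch: check which general-purpose families from this section
# can satisfy a vCPU / RAM request. Ranges are transcribed from the table;
# the helper name is illustrative, not an AWS API.
GENERAL_PURPOSE = {
    "t4g": {"vcpus": (2, 8),   "ram_gb": (0.5, 32)},  # tops out at t4g.2xlarge
    "m6i": {"vcpus": (2, 128), "ram_gb": (8, 512)},
    "m5a": {"vcpus": (2, 96),  "ram_gb": (8, 384)},
}

def families_that_fit(vcpus: int, ram_gb: float) -> list[str]:
    """Return families whose configuration range covers the request."""
    return [
        name
        for name, spec in GENERAL_PURPOSE.items()
        if spec["vcpus"][0] <= vcpus <= spec["vcpus"][1]
        and spec["ram_gb"][0] <= ram_gb <= spec["ram_gb"][1]
    ]

print(families_that_fit(64, 256))  # 64 vCPUs / 256 GB fits m6i and m5a
```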
Compute Optimized Instances
For compute-intensive workloads such as HPC, media transcoding, and ML inference.
C6g (ARM)
Processor: AWS Graviton2 Processor
Configuration Range: 1-64 vCPUs
2GB-128GB RAM
Networking: Up to 25 Gbps
Storage: 3.8TB NVMe SSD
C5n (Intel)
Processor: 3.0 GHz Intel Xeon Platinum
Configuration Range: 2-72 vCPUs
5.25GB-192GB RAM
Networking: Up to 100 Gbps
EBS Bandwidth: Up to 38 Gbps
C5 (Intel)
Processor: 2nd Gen Intel Xeon Scalable
Configuration Range: 2-96 vCPUs
4GB-192GB RAM
Networking: Up to 25 Gbps
Storage: 3.6TB NVMe SSD
C5a (AMD)
Processor: AMD EPYC 7002 Series
Configuration Range: 2-96 vCPUs
4GB-192GB RAM
Networking: Up to 20 Gbps
Storage: 3.8TB NVMe SSD
Memory Optimized Instances
For memory-intensive workloads like in-memory databases.
U-Series
Processor: Intel Xeon Platinum 8176M/Cascade Lake
Configuration Range: 224-448 vCPUs
6TB-24TB RAM
Networking: 25 Gbps
Notes: Bare Metal Option
R6g (ARM)
Processor: AWS Graviton2 Processor
Configuration Range: 1-64 vCPUs
8GB-512GB RAM
Networking: Up to 25 Gbps
Storage: 3.8TB NVMe SSD
X1e (Intel)
Processor: Intel Xeon E7-8880 v3
Configuration Range: 4-128 vCPUs
122GB-4TB RAM
Networking: Up to 25 Gbps
Storage: 3.8TB NVMe SSD
R5a (AMD)
Processor: AMD EPYC 7000 Series
Configuration Range: 2-96 vCPUs
16GB-768GB RAM
Networking: Up to 20 Gbps
Storage: 3.6TB NVMe SSD
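The three categories so far differ mainly in memory per vCPU, which is easy to see by dividing each family's largest listed RAM by its largest vCPU count. A minimal sketch using one family per category (numbers transcribed from the sections above):

```python
# Memory-to-vCPU ratio at the largest listed configuration of one family
# per category; numbers are transcribed from the sections above.
max_config = {
    "c5a (compute optimized)": (96, 192),   # (vCPUs, RAM in GB)
    "m6i (general purpose)": (128, 512),
    "r6g (memory optimized)": (64, 512),
}

for family, (vcpus, ram_gb) in max_config.items():
    print(f"{family}: {ram_gb / vcpus:g} GB per vCPU")
# Prints 2, 4, and 8 GB per vCPU: each category roughly doubles the
# memory-to-vCPU ratio of the previous one.
```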
Accelerated Computing Instances
These instances utilize hardware accelerators or co-processors to perform floating-point calculations, graphics processing, and data pattern matching significantly more efficiently than CPU-based software implementations.
P4 (GPU)
Accelerator: 8x NVIDIA A100 Tensor Core GPUs
Configuration Range: 96 vCPUs/1.1TB RAM
Networking: 400 Gbps
Storage: 8TB NVMe SSD
Inf1 (Inferentia)
Accelerator: Up to 16 AWS Inferentia Chips
Configuration Range: 4-96 vCPUs
8GB-192GB RAM
Networking: Up to 100 Gbps
Processor: 2nd Gen Intel Xeon Scalable
G4dn (GPU)
Accelerator: NVIDIA T4 Tensor Core GPU
Configuration Range: 1-8 GPUs
4-96 vCPUs/16-384GB RAM
Networking: Up to 100 Gbps
Storage: 1.8TB NVMe SSD
Processor: Intel Xeon Cascade Lake (24C)
F1 (FPGA)
Accelerator: Xilinx Virtex UltraScale+ VU9P
Configuration Range: 1-8 FPGAs
8-64 vCPUs/122-976GB RAM
Networking: 25 Gbps
FPGA Capacity: 2.5M Logic Elements per FPGA
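Each accelerated family above targets a different workload. As a rough rule of thumb (this mapping is a simplification drawn from the entries in this section, not an official AWS recommendation):

```python
# Rough workload-to-family mapping drawn from the Accelerated Computing
# entries above. An illustrative simplification, not an AWS API.
ACCELERATED_FAMILIES = {
    "ml_training": "p4",        # 8x NVIDIA A100 Tensor Core GPUs
    "ml_inference": "inf1",     # AWS Inferentia chips
    "graphics_and_ml": "g4dn",  # NVIDIA T4 Tensor Core GPUs
    "custom_hardware": "f1",    # Xilinx FPGAs for custom logic
}

def suggest_family(workload: str) -> str:
    """Return the matching family, or a fallback hint for other workloads."""
    return ACCELERATED_FAMILIES.get(workload, "no accelerated match")

print(suggest_family("ml_inference"))  # inf1
```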
Storage Optimized Instances
Designed for workloads requiring high-speed sequential read/write access to large on-instance datasets. Optimized to deliver hundreds of thousands of low-latency, random I/O operations per second (IOPS).
H1
Processor: Intel Xeon E5 2686 v4
Configuration Range: 8-64 vCPUs
32GB-256GB RAM
Networking: Up to 25 Gbps
Storage: 16TB HDD
I3
Processor: Intel Xeon E5 2686 v4
Configuration Range: 2-72 vCPUs
15.25GB-512GB RAM
Networking: Up to 25 Gbps
Storage: 16TB NVMe SSD
D3
Processor: Intel Xeon Cascade Lake
Configuration Range: 4-32 vCPUs
32GB-256GB RAM
Networking: Up to 25 Gbps
Storage: 48TB HDD
D3en
Processor: Intel Xeon Cascade Lake
Configuration Range: 4-48 vCPUs
16GB-192GB RAM
Networking: Up to 75 Gbps
Storage: 336TB HDD
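Picking among the storage-optimized families above usually comes down to media type (NVMe SSD for low-latency random IOPS, HDD for dense sequential throughput) and required on-instance capacity. A minimal sketch using the per-instance maxima listed in this section (helper name is illustrative):

```python
# Minimal sketch: pick the storage-optimized family with the smallest
# listed capacity that still fits a request. Capacities are the
# per-instance maxima (in TB) transcribed from this section.
STORAGE_FAMILIES = [
    ("i3", "nvme", 16),
    ("h1", "hdd", 16),
    ("d3", "hdd", 48),
    ("d3en", "hdd", 336),
]

def smallest_fit(media: str, needed_tb: float):
    """Return the smallest family of the given media type that fits, else None."""
    candidates = [
        (capacity, family)
        for family, family_media, capacity in STORAGE_FAMILIES
        if family_media == media and capacity >= needed_tb
    ]
    return min(candidates)[1] if candidates else None

print(smallest_fit("hdd", 100))  # d3en is the only HDD family with >= 100 TB
```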