Hardware Index

GPU & Accelerator Comparison

Compare AI accelerators by memory, bandwidth, and compute.

Max HBM Capacity: 288 GB (AMD Instinct MI355X)
Max Bandwidth: 8 TB/s (AMD Instinct MI355X)
Peak FP8 Compute: 4.61 PFLOPS (Google TPU v7)
Chips Indexed: 19 (+3 this month)

Full Hardware Index

| Hardware | Manufacturer | Type | Primary Workload | Secondary Workload | Release Date | FP-16 (PFLOPS) | FP-8 (PFLOPS) | Memory (GB) | Bandwidth (TB/s) | Power (W) | Foundry |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Google TPU v7 | Google | TPU | Training | Inference | 2025-11-06 | 2.5 | 4.61 | 192 | 7.37 | 960 | TSMC |
| AMD Instinct MI355X | AMD | GPU | Training | Inference | 2025-06-12 | 2.25 | 4.6 | 288 | 8 | 1400 | TSMC |
| AMD Instinct MI350X | AMD | GPU | Training | Inference | 2025-06-12 | 2.25 | 4.6 | 288 | 8 | 1000 | TSMC |
| NVIDIA B300 | NVIDIA | GPU | Training | Inference | 2025-08-22 | 2.31 | 4.5 | 288 | 8 | 1400 | TSMC |
| NVIDIA B200 | NVIDIA | GPU | Training | Inference | 2024-11-15 | 2.25 | 4.5 | 192 | 8 | 1000 | TSMC |
| NVIDIA B100 | NVIDIA | GPU | Training | Inference | 2024-11-15 | 1.31 | 3.5 | 192 | 8 | 700 | TSMC |
| AMD Instinct MI325X | AMD | GPU | Inference | Training | 2024-10-10 | 1.3 | 2.61 | 256 | 6 | 1000 | TSMC |
| AMD Instinct MI300X | AMD | GPU | Training | Inference | 2023-12-06 | 1.3 | 2.61 | 192 | 5.3 | 750 | TSMC |
| Amazon Trainium3 | Amazon AWS | GPU | Training | Inference | 2025-12-02 | 1.26 | 2.52 | 155 | 4.9 | 700 | TSMC |
| NVIDIA H200 | NVIDIA | GPU | Inference | Training | 2024-11-18 | 0.99 | 1.98 | 141 | 4.8 | 700 | TSMC |
| NVIDIA H100 SXM5 | NVIDIA | GPU | Training | Inference | 2022-09-20 | 0.99 | 1.98 | 80 | 3.35 | 700 | TSMC |
| Google TPU v6e | Google | TPU | Training | Inference | 2024-05-14 | 0.92 | 1.84 | 32 | 1.6 | N/A | TSMC |
| NVIDIA H100 PCIe | NVIDIA | GPU | Inference | Training | 2022-10-05 | 0.76 | 1.51 | 80 | 2 | 400 | TSMC |
| Amazon Trainium2 | Amazon AWS | GPU | Training | Inference | 2024-12-03 | 0.65 | 1.3 | 96 | 2.9 | 500 | TSMC |
| NVIDIA L40S | NVIDIA | GPU | Inference | N/A | 2023-08-08 | 0.37 | 0.73 | 48 | 0.86 | 350 | TSMC |
| Google TPU v5e | Google | TPU | Inference | N/A | 2023-08-29 | 0.2 | 0.39 | 16 | 0.8 | N/A | TSMC |
| NVIDIA L40 | NVIDIA | GPU | Inference | N/A | 2022-10-13 | 0.18 | 0.36 | 48 | 0.86 | 300 | TSMC |
| NVIDIA L4 | NVIDIA | GPU | Inference | N/A | 2023-03-21 | 0.12 | 0.24 | 24 | 0.3 | 72 | TSMC |
| NVIDIA A100 | NVIDIA | GPU | Training | Inference | 2020-05-14 | 0.31 | N/A | 80 | 2 | 400 | TSMC |

N/A indicates a figure the manufacturer has not published, or a format the chip does not support (FP8 on the A100).
Data Source: Aggregated from manufacturer specifications and verified benchmarks
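Beyond the side-by-side view, the same specs support derived metrics such as compute per watt. The sketch below is an illustrative example, not part of the index itself: it loads a few rows from the table above into a hypothetical `Chip` record and ranks them by FP8 PFLOPS per kilowatt, a metric chosen here for demonstration.

```python
# Illustrative sketch: rank a sample of chips from the index by an
# assumed efficiency metric (FP8 PFLOPS per kW of listed board power).
from dataclasses import dataclass

@dataclass
class Chip:
    name: str
    fp8_pflops: float
    memory_gb: int
    bandwidth_tbs: float
    power_w: int

# A handful of rows copied from the table above.
CHIPS = [
    Chip("AMD Instinct MI355X", 4.6, 288, 8.0, 1400),
    Chip("NVIDIA B200", 4.5, 192, 8.0, 1000),
    Chip("NVIDIA H100 SXM5", 1.98, 80, 3.35, 700),
    Chip("NVIDIA L4", 0.24, 24, 0.3, 72),
]

def fp8_per_kw(chip: Chip) -> float:
    """FP8 throughput delivered per kilowatt of board power."""
    return chip.fp8_pflops / (chip.power_w / 1000)

# Rank the sample by compute efficiency rather than raw throughput.
for chip in sorted(CHIPS, key=fp8_per_kw, reverse=True):
    print(f"{chip.name}: {fp8_per_kw(chip):.2f} PFLOPS/kW")
```

Note how the ranking reshuffles: a 72 W inference card can beat a 1400 W flagship on this metric even though its absolute throughput is an order of magnitude lower.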