Side-by-side GPU comparison: specs, memory, compute performance, and live cloud pricing.
Verdict
The A100 40GB has 8 GB more VRAM than the Gaudi HL-205 (40 GB vs 32 GB), making it better suited to large models and long context windows. It also supports a broader range of models from this catalog (612 vs 577), giving more deployment flexibility.
Specifications
| | A100 40GB | Gaudi HL-205 |
|---|---|---|
| VRAM | 40 GB | 32 GB |
| VRAM Type | HBM2 | HBM2 |
| Memory Bandwidth | 1.6 TB/s | 1.0 TB/s |
| FP16 Performance | 312 TFLOPS | — |
| Manufacturer | NVIDIA | Habana |
| FP8 Support | No | No |
| FP4 Support | No | No |
Price / Performance
Based on cheapest single-GPU on-demand pricing. Lower $/TFLOP = better compute value; lower $/GB = better memory value.
| | A100 40GB | Gaudi HL-205 |
|---|---|---|
| $/hr (cheapest) | $0.93 | — |
| $/TFLOP (compute value) | $0.0030 | — |
| $/GB VRAM (memory value) | $0.0232 | — |
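The ratios above follow directly from the table's figures: the cheapest hourly rate divided by peak FP16 throughput, and by VRAM size. A minimal sketch of that arithmetic, assuming the A100 40GB numbers shown ($0.93/hr, 312 TFLOPS, 40 GB):

```python
def price_per_tflop(hourly_usd, tflops):
    """Dollars per hour per TFLOP of peak FP16 compute (lower = better value)."""
    return hourly_usd / tflops

def price_per_gb(hourly_usd, vram_gb):
    """Dollars per hour per GB of VRAM (lower = better value)."""
    return hourly_usd / vram_gb

a100_rate = 0.93    # $/hr, cheapest single-GPU on-demand
a100_fp16 = 312.0   # peak FP16 TFLOPS
a100_vram = 40.0    # GB

print(f"$/TFLOP: {price_per_tflop(a100_rate, a100_fp16):.4f}")
print(f"$/GB:    {price_per_gb(a100_rate, a100_vram):.4f}")
```

With no FP16 figure or pricing published for the Gaudi HL-205 in this catalog, its columns stay empty.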
Cloud Pricing
Cheapest on-demand price per provider (single GPU).
Model Compatibility
Models from the catalog that fit on each GPU, grouped by required precision.
A100 40GB (612 models)
Gaudi HL-205 (577 models)
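Whether a model "fits" comes down to its weight footprint at the required precision versus available VRAM. A rough sketch of that check, assuming weights dominate memory use (it ignores activations, KV cache, and framework overhead, which a real catalog check would account for; the 1.2 overhead factor is an illustrative assumption):

```python
# Bytes needed per parameter at each precision.
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def fits(params_billions, precision, vram_gb, overhead=1.2):
    """True if the model's weights (plus a flat overhead factor) fit in VRAM."""
    weight_gb = params_billions * BYTES_PER_PARAM[precision]
    return weight_gb * overhead <= vram_gb

# A 30B model at int8 needs ~30 GB of weights (~36 GB with overhead):
print(fits(30, "int8", 40))  # A100 40GB
print(fits(30, "int8", 32))  # Gaudi HL-205
```

This is why the 40 GB card clears more of the catalog: models in the roughly 32-to-40 GB footprint band fit only on the A100.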
Pricing data refreshed hourly · Last updated April 11, 2026