A100 40GB vs M4 Max (36 GB)

Side-by-side GPU comparison: specs, memory, compute performance, and live cloud pricing.

Verdict

The A100 40GB has 13 GB more usable VRAM (40 GB vs the 27 GB of the M4 Max's 36 GB unified memory that macOS exposes to the GPU), making it better suited to large models and long context windows. For compute-bound workloads such as training, the A100 40GB delivers roughly 9.2× higher FP16 throughput (312 vs 34 TFLOPS). It also fits a broader range of models from this catalog (612 vs 569), giving more flexibility.

Specifications

|                  | A100 40GB  | M4 Max (36 GB) |
|------------------|------------|----------------|
| VRAM             | 40 GB      | 27 GB          |
| VRAM Type        | HBM2       | LPDDR5X        |
| Memory Bandwidth | 1.6 TB/s   | 0.4 TB/s       |
| FP16 Performance | 312 TFLOPS | 34 TFLOPS      |
| Manufacturer     | NVIDIA     | Apple          |
| FP8 Support      | No         | No             |
| FP4 Support      | No         | No             |
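
The verdict's multiples fall straight out of these specs. As a quick check, here is a minimal Python sketch with the table's values hard-coded (the variable names are illustrative, not from any catalog API):

```python
# Spec values copied from the table above.
a100 = {"vram_gb": 40, "bandwidth_tb_s": 1.6, "fp16_tflops": 312}
m4_max = {"vram_gb": 27, "bandwidth_tb_s": 0.4, "fp16_tflops": 34}

print(f"VRAM advantage: {a100['vram_gb'] - m4_max['vram_gb']} GB")                    # 13 GB
print(f"FP16 throughput: {a100['fp16_tflops'] / m4_max['fp16_tflops']:.1f}x")         # 9.2x
print(f"Memory bandwidth: {a100['bandwidth_tb_s'] / m4_max['bandwidth_tb_s']:.1f}x")  # 4.0x
```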

Price / Performance

Based on cheapest single-GPU on-demand pricing. Lower $/TFLOP = better compute value; lower $/GB = better memory value.

|                          | A100 40GB | M4 Max (36 GB) |
|--------------------------|-----------|----------------|
| $/hr (cheapest)          | $0.93     | n/a            |
| $/TFLOP (compute value)  | $0.0030   | n/a            |
| $/GB VRAM (memory value) | $0.0232   | n/a            |
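
Both value metrics are plain quotients of the cheapest hourly rate against the spec table. A worked example using the $0.93/hr figure (Vast.ai, per the Cloud Pricing section below):

```python
price_per_hr = 0.93   # cheapest A100 40GB on-demand rate (Vast.ai)
fp16_tflops = 312     # from the Specifications table
vram_gb = 40

print(f"$/TFLOP: {price_per_hr / fp16_tflops:.4f}")  # ~0.0030
print(f"$/GB VRAM: {price_per_hr / vram_gb:.4f}")    # ~0.0233; the table's 0.0232 looks truncated
```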

Cloud Pricing

Cheapest on-demand price per provider (single GPU).

A100 40GB

| Provider     | On-demand | Spot     |
|--------------|-----------|----------|
| Vast.ai      | $0.93/hr  | n/a      |
| Google Cloud | $1.61/hr  | $1.17/hr |
| Lambda       | $1.99/hr  | n/a      |
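
The $/hr row in Price / Performance simply takes the minimum of these on-demand rates. For illustration, with the rates hard-coded from the table above:

```python
on_demand = {"Vast.ai": 0.93, "Google Cloud": 1.61, "Lambda": 1.99}  # $/hr

# Pick the provider with the lowest hourly rate.
provider, rate = min(on_demand.items(), key=lambda kv: kv[1])
print(f"Cheapest on-demand: {provider} at ${rate}/hr")  # Vast.ai at $0.93/hr
```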

M4 Max (36 GB)

No cloud pricing available.

Model Compatibility

Models from the catalog that fit on each GPU, grouped by required precision.

A100 40GB (612 models)

M4 Max (36 GB) (569 models)
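
The page doesn't state how "fits" is computed. A common first-order heuristic is weight size (parameter count × bytes per element at the required precision) plus headroom for activations and KV cache; the sketch below uses a 20% overhead factor, which is my assumption rather than this catalog's rule:

```python
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def fits(params_billions: float, precision: str, vram_gb: float,
         overhead: float = 1.2) -> bool:
    """First-order check: model weights, plus ~20% headroom for
    activations and KV cache (an assumption), must fit in usable VRAM."""
    weights_gb = params_billions * BYTES_PER_PARAM[precision]
    return weights_gb * overhead <= vram_gb

# A hypothetical 13B-parameter model in fp16 (~26 GB of weights):
print(fits(13, "fp16", 40))  # A100 40GB -> True (~31 GB with headroom)
print(fits(13, "fp16", 27))  # M4 Max usable VRAM -> False
```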

Pricing data refreshed hourly · Last updated April 11, 2026