A16 vs P100


Side-by-side GPU comparison: specs, memory, compute performance, and live cloud pricing.

Verdict

For compute-bound workloads such as training, the P100 delivers roughly 1.2× the FP16 throughput of the A16 (21 vs 18 TFLOPS).

Specifications

| Spec | A16 | P100 |
|---|---|---|
| VRAM | 16 GB | 16 GB |
| VRAM Type | GDDR6 | HBM2 |
| Memory Bandwidth | 0.4 TB/s | 0.7 TB/s |
| FP16 Performance | 18 TFLOPS | 21 TFLOPS |
| Manufacturer | NVIDIA | NVIDIA |
| FP8 Support | No | No |
| FP4 Support | No | No |

Price / Performance

Based on cheapest single-GPU on-demand pricing. Lower $/TFLOP = better compute value; lower $/GB = better memory value.

| Metric | A16 | P100 |
|---|---|---|
| $/hr (cheapest) | — | $1.46 |
| $/TFLOP (compute value) | — | $0.0689 |
| $/GB VRAM (memory value) | — | $0.0912 |
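The two ratios are simple divisions of the cheapest hourly price by the card's FP16 rating and VRAM size. A minimal sketch using the P100 figures from this page (the A16 has no pricing listed); note the $0.0689/TFLOP figure implies an unrounded FP16 rating near 21.2 TFLOPS, which is an assumption here since the spec table rounds to 21:

```python
# Price/performance ratios for the P100, as shown on this page.
P100_PRICE_PER_HR = 1.46   # cheapest on-demand $/hr (Google Cloud)
P100_FP16_TFLOPS = 21.2    # assumed unrounded rating; table shows 21 TFLOPS
P100_VRAM_GB = 16

price_per_tflop = P100_PRICE_PER_HR / P100_FP16_TFLOPS  # compute value
price_per_gb = P100_PRICE_PER_HR / P100_VRAM_GB         # memory value

print(f"$/TFLOP: {price_per_tflop:.4f}")  # ≈ 0.0689
print(f"$/GB:    {price_per_gb:.4f}")     # ≈ 0.0912 on this page
```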

Cloud Pricing

Cheapest on-demand price per provider (single GPU).

A16

No cloud pricing available.

P100

| Provider | On-demand | Spot |
|---|---|---|
| Google Cloud | $1.46/hr | $0.14/hr |

Model Compatibility

Models from the catalog that fit on each GPU, grouped by required precision.

A16 (502 models)

P100 (502 models)
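The fit check described above, whether a model fits in a card's 16 GB at its required precision, can be sketched as a weight-size estimate. The bytes-per-parameter table and the 20% overhead headroom below are illustrative assumptions, not values from this page:

```python
# Sketch of a "does this model fit?" check for a 16 GB GPU.
# Byte sizes per parameter and the overhead factor are assumptions.
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}
OVERHEAD = 1.2  # assumed headroom for activations / KV cache

def fits(params_billions: float, precision: str, vram_gb: float = 16) -> bool:
    # 1B params at 1 byte/param ≈ 1 GB of weights (decimal GB)
    weight_gb = params_billions * BYTES_PER_PARAM[precision]
    return weight_gb * OVERHEAD <= vram_gb

print(fits(7, "fp16"))   # False: ~14 GB * 1.2 = 16.8 GB > 16 GB
print(fits(7, "int8"))   # True:  ~7 GB  * 1.2 = 8.4 GB
print(fits(13, "int4"))  # True:  ~6.5 GB * 1.2 = 7.8 GB
```

This is why the same model can appear under one precision group but not another: quantizing from fp16 to int8 halves the weight footprint.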


Pricing data refreshed hourly · Last updated April 11, 2026