L40 vs L40S

Side-by-side GPU comparison: specs, memory, compute performance, and live cloud pricing.

Verdict

For compute-bound workloads like training, the L40S delivers 2.0× the FP16 throughput of the L40 (366 vs 181 TFLOPS). At $0.60/hr versus $0.99/hr, the L40S is also the more cost-efficient choice for inference.

Specifications

| Specification | L40 | L40S |
|---|---|---|
| VRAM | 48 GB | 48 GB |
| VRAM Type | GDDR6 | GDDR6 |
| Memory Bandwidth | 0.9 TB/s | 0.9 TB/s |
| FP16 Performance | 181 TFLOPS | 366 TFLOPS |
| Manufacturer | NVIDIA | NVIDIA |
| FP8 Support | Yes | Yes |
| FP4 Support | No | No |

Price / Performance

Based on cheapest single-GPU on-demand pricing. Lower $/TFLOP = better compute value; lower $/GB = better memory value.

| Metric | L40 | L40S |
|---|---|---|
| $/hr (cheapest) | $0.99 | $0.60 ✓ best |
| $/TFLOP (compute value) | $0.0055 | $0.0016 ✓ best |
| $/GB VRAM (memory value) | $0.0206 | $0.0125 ✓ best |
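As a sanity check, the value metrics above can be recomputed directly from the spec and pricing tables. This is a minimal sketch using the hourly prices, FP16 TFLOPS, and VRAM figures quoted on this page:

```python
# Recompute $/TFLOP and $/GB from the figures in the tables above.
# price_hr: cheapest on-demand price; fp16_tflops and vram_gb from the spec table.
gpus = {
    "L40":  {"price_hr": 0.99, "fp16_tflops": 181, "vram_gb": 48},
    "L40S": {"price_hr": 0.60, "fp16_tflops": 366, "vram_gb": 48},
}

for name, g in gpus.items():
    per_tflop = g["price_hr"] / g["fp16_tflops"]  # compute value: lower is better
    per_gb = g["price_hr"] / g["vram_gb"]         # memory value: lower is better
    print(f"{name}: ${per_tflop:.4f}/TFLOP, ${per_gb:.4f}/GB")
```

Running this reproduces the table: the L40S comes out cheaper on both axes because it pairs a lower hourly price with double the FP16 throughput.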

Cloud Pricing

Cheapest on-demand price per provider (single GPU).

L40

| Provider | On-demand | Spot |
|---|---|---|
| RunPod | $0.99/hr | $0.50/hr |

L40S

| Provider | On-demand | Spot |
|---|---|---|
| Vast.ai | $0.60/hr | — |
| RunPod | $0.86/hr | $0.26/hr |
| Nebius | $1.55/hr | $0.75/hr |
| Amazon Web Services | $1.86/hr | $0.36/hr |

Model Compatibility

Models from the catalog that fit on each GPU, grouped by required precision.

L40 (624 models)

L40S (624 models)

Both GPUs carry 48 GB of VRAM, so the same 624 catalog models fit on each.
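A rough way to see whether a model fits in 48 GB is to compare its weight footprint (parameter count × bytes per parameter) against available VRAM. The 20% overhead factor below for activations and KV cache is an illustrative assumption, not this catalog's exact fitting rule:

```python
# Rough VRAM-fit check: weights = params (billions) × bytes/param,
# inflated by an assumed ~20% overhead for activations and KV cache.
BYTES_PER_PARAM = {"fp16": 2, "fp8": 1, "int4": 0.5}

def fits(params_b: float, precision: str, vram_gb: float = 48.0) -> bool:
    weights_gb = params_b * BYTES_PER_PARAM[precision]  # billions of params ≈ GB
    return weights_gb * 1.2 <= vram_gb

print(fits(70, "fp16"))  # False: 70 × 2 × 1.2 = 168 GB > 48 GB
print(fits(70, "int4"))  # True: 70 × 0.5 × 1.2 = 42 GB
print(fits(13, "fp16"))  # True: 13 × 2 × 1.2 = 31.2 GB
```

This is why the required-precision grouping matters: a 70B model only fits on either card at low-precision quantization, while smaller models run at full FP16.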

Pricing data refreshed hourly · Last updated April 11, 2026