CodeLlama-7b-Python-hf

6.7B parameters · LlamaForCausalLM · llama

2,943 downloads

VRAM Requirements

VRAM requirements for CodeLlama-7b-Python-hf at different quantization levels
| Quantization | VRAM Required |
|---|---|
| FP16 | 15.1 GB |
| Q8 | 7.5 GB |
| Q6 | 5.6 GB |
| Q4 | 3.8 GB |
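The figures above track a simple back-of-the-envelope rule: weight memory is roughly parameter count times bits per weight, plus some runtime overhead. A minimal sketch, assuming a ~6.7B-parameter model and a flat ~12% overhead factor (the helper name and overhead value are illustrative, not how the table was actually computed):

```python
# Rough VRAM estimate for loading model weights at a given quantization.
# Real usage also depends on KV cache, context length, and framework overhead,
# which the flat `overhead` multiplier only approximates.
def estimate_vram_gb(params_billions: float, bits_per_weight: float,
                     overhead: float = 1.12) -> float:
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return round(bytes_total / 1e9 * overhead, 1)

# CodeLlama-7b-Python-hf has ~6.7B parameters.
for name, bits in [("FP16", 16), ("Q8", 8), ("Q6", 6), ("Q4", 4)]:
    print(f"{name}: ~{estimate_vram_gb(6.7, bits)} GB")
```

The results land within about 0.1 GB of the table for each quantization level.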

Compatible GPUs

Top 10 compatible GPUs for CodeLlama-7b-Python-hf, sorted by price (cheapest first)
| GPU | VRAM | Best Quant | From |
|---|---|---|---|
| L4 | 24 GB | FP16 | $0.20/hr |
| A40 | 48 GB | FP16 | $0.44/hr |
| A100 40GB | 40 GB | FP16 | $0.93/hr |
| A10G | 24 GB | FP16 | $1.01/hr |
| A10 | 24 GB | FP16 | $1.29/hr |
| A100 80GB PCIe | 80 GB | FP16 | $1.39/hr |
| A100 80GB | 80 GB | FP16 | $1.39/hr |
| H100 MEGA | 80 GB | FP16 | $1.80/hr |
| H100 SXM | 80 GB | FP16 | $1.80/hr |
| H100 NVL | 94 GB | FP16 | $1.80/hr |

Showing the top 10 results.
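Because every GPU in the list already fits the FP16 footprint, the cheapest compatible option is simply the lowest-priced entry. A minimal sketch of that selection, using the table data above (the `cheapest_gpu` helper is illustrative, not part of any site API):

```python
# (name, VRAM in GB, price in $/hr) rows mirroring the table above.
GPUS = [
    ("L4", 24, 0.20),
    ("A40", 48, 0.44),
    ("A100 40GB", 40, 0.93),
    ("A10G", 24, 1.01),
    ("A10", 24, 1.29),
    ("A100 80GB PCIe", 80, 1.39),
    ("A100 80GB", 80, 1.39),
    ("H100 MEGA", 80, 1.80),
    ("H100 SXM", 80, 1.80),
    ("H100 NVL", 94, 1.80),
]

def cheapest_gpu(required_vram_gb: float):
    """Return the lowest-priced GPU whose VRAM covers the footprint."""
    fits = [g for g in GPUS if g[1] >= required_vram_gb]
    return min(fits, key=lambda g: g[2], default=None)

print(cheapest_gpu(15.1))  # FP16 footprint -> ('L4', 24, 0.2)
```

For an FP16 footprint of 15.1 GB this picks the L4 at $0.20/hr, matching the top of the table.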

CodeLlama-7b-Python-hf FAQ