## VRAM Requirements

VRAM requirements for Llama-4-Maverick-17B-128E-Instruct at different quantization levels:

| Quantization | VRAM Required |
|---|---|
| FP16 | 897.6 GB |
| Q8 | 448.8 GB |
| Q6 | 336.6 GB |
| Q4 | 224.4 GB |
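The figures above follow the standard weight-storage estimate: total parameter count multiplied by bytes per weight at the given quantization. A minimal sketch, assuming the ~448.8B total parameter count implied by the FP16 figure (the function name and constant are illustrative, not from any library):

```python
def vram_gb(params_billions: float, bits_per_weight: float) -> float:
    """Estimate weight-storage VRAM in GB (decimal) at a given quantization.

    params_billions: total parameter count in billions
    bits_per_weight: 16 for FP16, 8 for Q8, 6 for Q6, 4 for Q4
    """
    # billions of params * bits / 8 bits-per-byte = gigabytes
    return params_billions * bits_per_weight / 8


# Total params (billions) implied by the FP16 row: 897.6 GB / 2 bytes = 448.8B
PARAMS_B = 448.8

for name, bits in [("FP16", 16), ("Q8", 8), ("Q6", 6), ("Q4", 4)]:
    print(f"{name}: {vram_gb(PARAMS_B, bits):.1f} GB")
```

Note that these estimates cover model weights only; the KV cache, activations, and framework overhead require additional VRAM on top of the listed figures.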
## Compatible GPUs

No single GPU in the database meets these VRAM requirements on its own; running this model requires a multi-GPU configuration.
## Llama-4-Maverick-17B-128E-Instruct FAQ