Llama-3.1-405B
by meta-llama
405B params · text-generation · 965 likes · 555.8k downloads
Llama-3.1-405B is a 405-billion-parameter model. At Q4 quantization its weights occupy roughly 202 GB, so running it requires a GPU (or multi-GPU setup) with at least 202 GB of VRAM.
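The 202 GB figure follows from a back-of-envelope calculation: at Q4, each weight takes about 4 bits (0.5 bytes). A minimal sketch of that estimate, assuming weights-only memory (real deployments also need room for the KV cache, activations, and quantization overhead):

```python
def estimated_vram_gb(params_billion: float, bits_per_weight: float) -> float:
    """Rough VRAM (decimal GB) needed to hold the weights alone."""
    bytes_per_weight = bits_per_weight / 8
    return params_billion * 1e9 * bytes_per_weight / 1e9

# 405B parameters at 4 bits per weight
print(estimated_vram_gb(405, 4))  # → 202.5
```

Actual Q4 formats (e.g. grouped quantization with scales) add a few percent on top of this floor.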