| Variant | Wiki | C4   |
|---------|------|------|
| W4G64   | 4.15 | 9.18 |
| W3G64   | 4.74 | 9.48 |

Revisions available in this repository:

  • main (W4G64, learned scales)
  • nfl_w3g64 (W3G64, learned scales)
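Each quantization variant lives on its own branch, so it can be selected with the standard `revision` argument in `transformers`. A minimal sketch, assuming the FLUTE kernel dependencies required for inference are installed (the repo id and revision names come from this card; everything else is illustrative):

```python
# Sketch: loading a specific quantized revision of this repository.
# "main" holds the W4G64 variant, "nfl_w3g64" the W3G64 variant,
# both with learned scales.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "radi-cho/Meta-Llama-3.1-70B-Instruct-FLUTE"

def load(revision="main"):
    # revision selects the branch holding the desired quantization variant
    tok = AutoTokenizer.from_pretrained(repo, revision=revision)
    model = AutoModelForCausalLM.from_pretrained(
        repo, revision=revision, device_map="auto"
    )
    return tok, model
```

For the 3-bit variant, call `load("nfl_w3g64")` instead of the default.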

Evaluations are provided for the models with learned scales. See the base Meta-Llama-3.1-70B-FLUTE repository for lm-eval-harness benchmarks.

Safetensors model size: 20.3B params (tensor types FP16 · F32 · I16).
