---
base_model: Menlo/Jan-nano
---

[EXL3](https://github.com/turboderp-org/exllamav3) quantization of [Jan-nano](https://huggingface.co/Menlo/Jan-nano), 8 bits per weight, including output layers.
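
To try the quant locally, the weights can be fetched from the Hub and then loaded with any EXL3-capable backend (exllamav3 itself, or a frontend built on it). Below is a minimal download sketch using `huggingface_hub`, assuming this card corresponds to the `isogen/Jan-nano-exl3-8bpw-h8` repository listed in the table below; the loader and generation API itself is documented in the exllamav3 repository and is not shown here.

```python
# Minimal sketch: fetch the EXL3 weights locally with huggingface_hub.
# Assumes this card corresponds to isogen/Jan-nano-exl3-8bpw-h8 (see the
# table below). Generation is then handled by an EXL3-capable backend
# such as exllamav3; see its repository for the current loader API.
from huggingface_hub import snapshot_download

model_dir = snapshot_download(repo_id="isogen/Jan-nano-exl3-8bpw-h8")
print(f"EXL3 weights downloaded to: {model_dir}")
```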

### HumanEval (argmax)

| Model | Q4 cache | Q6 cache | Q8 cache | FP16 cache |
| ---------------------------------------------------------------------------- | -------- | -------- | -------- | ---------- |
| [Jan-nano-exl3-4bpw](https://huggingface.co/isogen/Jan-nano-exl3-4bpw) | 79.9 | 81.7 | 82.9 | 82.9 |
| [Jan-nano-exl3-6bpw](https://huggingface.co/isogen/Jan-nano-exl3-6bpw) | 83.5 | 81.7 | 81.7 | 81.1 |
| [Jan-nano-exl3-8bpw-h8](https://huggingface.co/isogen/Jan-nano-exl3-8bpw-h8) | 84.8 | 82.9 | 83.5 | 82.9 |
| [Qwen3-4B-exl3-4bpw](https://huggingface.co/isogen/Qwen3-4B-exl3-4bpw) | 80.5 | 81.1 | 81.7 | 80.5 |
| [Qwen3-4B-exl3-6bpw](https://huggingface.co/isogen/Qwen3-4B-exl3-6bpw) | 80.5 | 85.4 | 86.0 | 86.0 |
| [Qwen3-4B-exl3-8bpw-h8](https://huggingface.co/isogen/Qwen3-4B-exl3-8bpw-h8) | 82.3 | 84.8 | 83.5 | 82.9 |