---
base_model: Menlo/Jan-nano
---

[EXL3](https://github.com/turboderp-org/exllamav3) quantization of [Jan-nano](https://huggingface.co/Menlo/Jan-nano), 6 bits per weight.
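To experiment with the quant locally, the files can be fetched with `huggingface_hub` and then loaded with an EXL3-capable backend such as [exllamav3](https://github.com/turboderp-org/exllamav3). The snippet below is a minimal download sketch; the repo id `isogen/Jan-nano-exl3-6bpw` is taken from the table below.

```python
# Minimal sketch: download this 6 bpw EXL3 quant to a local directory.
# The repo id is an assumption based on the benchmark table below; inference
# is then handled by an EXL3-capable backend such as exllamav3.
from huggingface_hub import snapshot_download

model_dir = snapshot_download("isogen/Jan-nano-exl3-6bpw")
print(model_dir)  # local path containing the EXL3 weights and config
```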

### HumanEval (argmax)

| Model                                                                        | Q4   | Q6   | Q8   | FP16 |
| ---------------------------------------------------------------------------- | ---- | ---- | ---- | ---- |
| [Jan-nano-exl3-4bpw](https://huggingface.co/isogen/Jan-nano-exl3-4bpw)       | 79.9 | 81.7 | 82.9 | 82.9 |
| [Jan-nano-exl3-6bpw](https://huggingface.co/isogen/Jan-nano-exl3-6bpw)       | 83.5 | 81.7 | 81.7 | 81.1 |
| [Jan-nano-exl3-8bpw-h8](https://huggingface.co/isogen/Jan-nano-exl3-8bpw-h8) | 84.8 | 82.9 | 83.5 | 82.9 |
| [Qwen3-4B-exl3-4bpw](https://huggingface.co/isogen/Qwen3-4B-exl3-4bpw)       | 80.5 | 81.1 | 81.7 | 80.5 |
| [Qwen3-4B-exl3-6bpw](https://huggingface.co/isogen/Qwen3-4B-exl3-6bpw)       | 80.5 | 85.4 | 86.0 | 86.0 |
| [Qwen3-4B-exl3-8bpw-h8](https://huggingface.co/isogen/Qwen3-4B-exl3-8bpw-h8) | 82.3 | 84.8 | 83.5 | 82.9 |