---
base_model: Menlo/Jan-nano
---

[EXL3](https://github.com/turboderp-org/exllamav3) quantization of [Jan-nano](https://huggingface.co/Menlo/Jan-nano), 8 bits per weight, including output layers.

### HumanEval (argmax)

| Model | Q4 | Q6 | Q8 | FP16 |
| ---------------------------------------------------------------------------- | ---- | ---- | ---- | ---- |
| [Jan-nano-exl3-4bpw](https://huggingface.co/isogen/Jan-nano-exl3-4bpw) | 79.9 | 81.7 | 82.9 | 82.9 |
| [Jan-nano-exl3-6bpw](https://huggingface.co/isogen/Jan-nano-exl3-6bpw) | 83.5 | 81.7 | 81.7 | 81.1 |
| [Jan-nano-exl3-8bpw-h8](https://huggingface.co/isogen/Jan-nano-exl3-8bpw-h8) | 84.8 | 82.9 | 83.5 | 82.9 |
| [Qwen3-4B-exl3-4bpw](https://huggingface.co/isogen/Qwen3-4B-exl3-4bpw) | 80.5 | 81.1 | 81.7 | 80.5 |
| [Qwen3-4B-exl3-6bpw](https://huggingface.co/isogen/Qwen3-4B-exl3-6bpw) | 80.5 | 85.4 | 86.0 | 86.0 |
| [Qwen3-4B-exl3-8bpw-h8](https://huggingface.co/isogen/Qwen3-4B-exl3-8bpw-h8) | 82.3 | 84.8 | 83.5 | 82.9 |