---
base_model: RekaAI/reka-flash-3
---
|
|
|
[EXL3](https://github.com/turboderp-org/exllamav3) quantization of [reka-flash-3](https://huggingface.co/RekaAI/reka-flash-3), 3 bits per weight.
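For intuition about what a 3-bits-per-weight budget means, here is a minimal sketch of plain uniform 3-bit quantization (8 levels per tensor). Note this is an illustration only, not EXL3's actual scheme, which is considerably more sophisticated:

```python
import numpy as np

def quantize_3bit(w: np.ndarray):
    """Uniform 3-bit quantization: map each weight to one of 2**3 = 8 levels.

    Illustration of the storage budget only; EXL3 does not quantize this way.
    """
    lo, hi = float(w.min()), float(w.max())
    scale = (hi - lo) / 7.0          # 7 steps between the 8 levels
    q = np.round((w - lo) / scale).astype(np.uint8)  # integer codes in [0, 7]
    return q, scale, lo

def dequantize(q: np.ndarray, scale: float, lo: float) -> np.ndarray:
    """Reconstruct approximate weights from the 3-bit codes."""
    return q.astype(np.float32) * scale + lo

w = np.linspace(-1.0, 1.0, 16, dtype=np.float32)
q, scale, lo = quantize_3bit(w)
w_hat = dequantize(q, scale, lo)    # max error is at most scale / 2
```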
|
|
|
### HumanEval (argmax)
|
|
|
| Model                                                                          | Q4   | Q8   | FP16 |
| ------------------------------------------------------------------------------ | ---- | ---- | ---- |
| [reka-flash-3-exl3-3bpw](https://huggingface.co/isogen/reka-flash-3-exl3-3bpw) | 87.8 | 90.2 | 90.9 |
| [reka-flash-3-exl3-4bpw](https://huggingface.co/isogen/reka-flash-3-exl3-4bpw) | 89.0 | 88.4 | 87.2 |
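"Argmax" in the heading presumably refers to greedy decoding: at each step the single highest-probability token is taken, with no sampling, so the evaluation is deterministic. A minimal sketch, using a toy stand-in for the model's logits function:

```python
import numpy as np

def greedy_decode(logits_fn, prompt: list[int], max_new_tokens: int) -> list[int]:
    """Greedy (argmax) decoding: append the most probable next token each step."""
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        logits = logits_fn(tokens)          # scores over the vocabulary
        tokens.append(int(np.argmax(logits)))
    return tokens

# Toy "model" (hypothetical, for illustration): always prefers the token
# numbered one higher than the last one, modulo a vocabulary of 5.
def toy_logits(tokens: list[int]) -> np.ndarray:
    logits = np.zeros(5)
    logits[(tokens[-1] + 1) % 5] = 1.0
    return logits

out = greedy_decode(toy_logits, [0], 4)  # → [0, 1, 2, 3, 4]
```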