---
license: llama3.1
language:
- en
quantized_by: TheMelonGod
pipeline_tag: text-generation
tags:
- quantized
- safetensors
- exllamav2
base_model:
- Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base
base_model_relation: quantized
---
|
ExLlamaV2 quantizations of: [Joseph717171 - Llama-3.1-SuperNova-8B-Lite_TIES_with_Base](https://huggingface.co/Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base)

**Quantizations (6hb)**

- [8.0bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/8.00bpw)
- [7.5bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/7.5bpw)
- [7.0bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/7.0bpw)
- [6.5bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/6.5bpw)
- [6.0bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/6.0bpw)
- [5.5bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/5.5bpw)
- [5.0bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/5.0bpw)
- [4.5bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/4.5bpw)
- [4.25bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/4.25bpw)
- [4.0bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/4.0bpw)
- [3.75bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/3.75bpw)
- [3.5bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/3.5bpw)
- [3.0bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/3.0bpw)
- [2.75bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/2.75bpw)
- [2.5bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/2.5bpw)
- [2.25bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/2.25bpw)
- [2.0bpw](https://huggingface.co/TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2/tree/2.0bpw)
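Each quantization above lives on its own branch of the repository, so a single branch can also be fetched programmatically. The sketch below assumes the `huggingface_hub` package is installed (`pip install huggingface_hub`); the helper simply maps a bits-per-weight string to the branch-naming pattern used by the links above (note the 8.0bpw quant's branch is named `8.00bpw`):

```python
REPO_ID = "TheMelonGod/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-exl2"

def quant_branch(bpw: str) -> str:
    """Map a bits-per-weight string to its branch name, e.g. "6.5" -> "6.5bpw"."""
    return f"{bpw}bpw"

if __name__ == "__main__":
    # Downloads one quantization branch into the local HF cache (network required).
    from huggingface_hub import snapshot_download  # assumed dependency
    local_path = snapshot_download(repo_id=REPO_ID, revision=quant_branch("6.5"))
    print(local_path)
```

From the shell, `huggingface-cli download <repo> --revision 6.5bpw` should achieve the same result.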
If you need a specific model quantization or a particular bits-per-weight setting, please let me know. I'm happy to quantize lesser-known models.

If you have any suggestions for improvements or other feedback, feel free to reach out. Your input is greatly appreciated and helps me make better quantizations for everyone.

Special thanks to [turboderp](https://huggingface.co/turboderp) for developing the tools that made these quantizations possible. Your contributions are greatly appreciated!