---
base_model: Goekdeniz-Guelmez/Josiefied-Qwen3-14B-abliterated-v3
---

[EXL3](https://github.com/turboderp-org/exllamav3) quantization of [Josiefied-Qwen3-14B-abliterated-v3](https://huggingface.co/Goekdeniz-Guelmez/Josiefied-Qwen3-14B-abliterated-v3), 6 bits per weight.