---
base_model:
- SicariusSicariiStuff/Negative_LLAMA_70B
datasets:
- SicariusSicariiStuff/UBW_Tapestries
language:
- en
library_name: transformers
license: apache-2.0
quantized_by: SicariusSicariiStuff
---
# Note
This quant was made to test extremely low-bit quantization. At 2.50 bpw, the 70B model is still quite usable. A custom quantization strategy was used; see the EXL3 repo for details. The goal was to fit the model on a single 32GB card such as the RTX 5090.
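As a rough sanity check on why 2.50 bpw fits a 32GB card, the weight footprint can be estimated from parameter count times bits per weight. This is a back-of-the-envelope sketch only; it ignores KV cache, activations, and any framework overhead, which consume part of the remaining headroom.

```python
def quant_weight_size_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight footprint in decimal GB for a quantized model."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# 70B parameters at 2.50 bits per weight:
size = quant_weight_size_gb(70, 2.50)
print(f"{size:.1f} GB")  # ~21.9 GB of weights, leaving ~10 GB for cache/overhead on a 32 GB card
```

The remaining ~10 GB is what the KV cache and runtime overhead have to fit into, which is why the quant targets 2.50 bpw rather than something higher.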