---
base_model:
- Delta-Vector/Rei-12B
- IlyaGusev/saiga_nemo_12b
- redrix/patricide-12B-Unslop-Mell
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [DARE TIES](https://arxiv.org/abs/2311.03099) merge method, with [Delta-Vector/Rei-12B](https://huggingface.co/Delta-Vector/Rei-12B) as the base.

### Models Merged

The following models were included in the merge:
* [IlyaGusev/saiga_nemo_12b](https://huggingface.co/IlyaGusev/saiga_nemo_12b)
* [redrix/patricide-12B-Unslop-Mell](https://huggingface.co/redrix/patricide-12B-Unslop-Mell)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Delta-Vector/Rei-12B
    parameters:
      weight: 1
      density: 0.85
  - model: redrix/patricide-12B-Unslop-Mell
    parameters:
      weight: 0.72
      density: 0.89
  - model: IlyaGusev/saiga_nemo_12b
    parameters:
      weight: 0.18
      density: 0.71
merge_method: dare_ties
base_model: Delta-Vector/Rei-12B
parameters:
  density: 0.9
  epsilon: 0.05
  lambda: 1.1
dtype: bfloat16
```
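
For readers unfamiliar with the method, the sketch below illustrates the core DARE TIES operations on a single tensor: DARE randomly drops a fraction of each model's task vector (its difference from the base) and rescales the surviving entries by `1/density`, and TIES resolves sign conflicts between the sparsified deltas before they are combined. This is a minimal, simplified NumPy illustration, not mergekit's actual implementation; the function names and the exact combination step are assumptions made for clarity.

```python
import numpy as np


def dare_sparsify(delta: np.ndarray, density: float, rng: np.random.Generator) -> np.ndarray:
    """DARE: randomly drop (1 - density) of the delta entries, rescale survivors by 1/density."""
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)


def dare_ties_merge(base: np.ndarray,
                    finetuned: list[np.ndarray],
                    weights: list[float],
                    densities: list[float],
                    seed: int = 0) -> np.ndarray:
    """Simplified DARE TIES merge of several fine-tunes back onto a shared base tensor."""
    rng = np.random.default_rng(seed)
    # Task vectors: how each fine-tune differs from the base, sparsified and weighted.
    deltas = [w * dare_sparsify(ft - base, d, rng)
              for ft, w, d in zip(finetuned, weights, densities)]
    stacked = np.stack(deltas)
    # TIES sign election: per parameter, keep only contributions that agree
    # with the dominant (summed) sign, then combine the survivors.
    elected = np.sign(stacked.sum(axis=0))
    agree = np.sign(stacked) == elected
    merged_delta = np.where(agree, stacked, 0.0).sum(axis=0)
    return base + merged_delta
```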