Nemoties-ChatML-12B
This is a merge of pre-trained language models created using mergekit.
Merge Details
Merge Method
This model was merged using the TIES merge method, with mistralai/Mistral-Nemo-Base-2407 as the base.
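TIES merging resolves sign conflicts between the fine-tuned models' parameter deltas before averaging them back onto the base model. The snippet below is a rough, self-contained sketch of that idea on plain tensors, not mergekit's actual implementation; the function name is my own, the sign election is simplified to the sign of the summed deltas, and per-model weights are omitted (they are all 1 in the configuration below).

```python
import torch

def ties_merge(base: torch.Tensor, finetuned: list[torch.Tensor], density: float = 1.0) -> torch.Tensor:
    """Illustrative TIES-style merge of one parameter tensor (sketch only)."""
    deltas = []
    for ft in finetuned:
        delta = ft - base  # task vector for this fine-tune
        if density < 1.0:
            # Trim: keep only the top-`density` fraction of entries by magnitude.
            k = max(1, int(density * delta.numel()))
            threshold = delta.abs().flatten().kthvalue(delta.numel() - k + 1).values
            delta = torch.where(delta.abs() >= threshold, delta, torch.zeros_like(delta))
        deltas.append(delta)
    stacked = torch.stack(deltas)                     # [n_models, ...]
    elected = torch.sign(stacked.sum(dim=0))          # elect one sign per parameter (simplified)
    agree = torch.sign(stacked) == elected            # mask deltas that disagree with the elected sign
    merged = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)
    return base + merged

# Toy example on a 4-element "parameter" tensor.
base = torch.zeros(4)
models = [torch.tensor([0.2, -0.1, 0.3, 0.0]), torch.tensor([0.1, 0.2, -0.3, 0.4])]
print(ties_merge(base, models, density=1.0))
```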
Models Merged
The following models were included in the merge:
- nbeerbower/Mistral-Nemo-Prism-12B-v7
- nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2
- nbeerbower/mistral-nemo-kartoffel-12B
- flammenai/Mahou-1.5-mistral-nemo-12B
- flammenai/Flammades-Mistral-Nemo-12B
- nbeerbower/mistral-nemo-bophades3-12B
Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: flammenai/Mahou-1.5-mistral-nemo-12B
    parameters:
      weight: 1
      density: 1
  - model: nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2
    parameters:
      weight: 1
      density: 1
  - model: flammenai/Flammades-Mistral-Nemo-12B
    parameters:
      weight: 1
      density: 1
  - model: nbeerbower/Mistral-Nemo-Prism-12B-v7
    parameters:
      weight: 1
      density: 1
  - model: nbeerbower/mistral-nemo-kartoffel-12B
    parameters:
      weight: 1
      density: 1
  - model: nbeerbower/mistral-nemo-bophades3-12B
    parameters:
      weight: 1
      density: 1
merge_method: ties
base_model: mistralai/Mistral-Nemo-Base-2407
parameters:
  weight: 1
  density: 1
  normalize: true
  int8_mask: true
dtype: bfloat16
tokenizer:
  source: nbeerbower/mistral-nemo-kartoffel-12B
```
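The merge can be re-run by saving this configuration to a file and passing it to mergekit's `mergekit-yaml` entry point. Below is a minimal inference sketch (not part of the original card), assuming the bundled tokenizer provides a ChatML-style chat template; the prompt contents are placeholders.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nbeerbower/Nemoties-ChatML-12B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the TIES merge method in one sentence."},
]
# Relies on the tokenizer's chat template to format the ChatML turns.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```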