# Plasma-8B / mergekit_config.yml
slices:
  - sources:
      - model: DoesntKnowAI/MentalNitrogenOxide-8B
        layer_range: [0, 32]
        weight: 0.86
      - model: arcee-ai/Llama-3.1-SuperNova-Lite
        layer_range: [0, 32]
        weight: 0.14
merge_method: slerp
parameters:
  t:
    - model: DoesntKnowAI/MentalNitrogenOxide-8B
      value: 1.0
    - model: arcee-ai/Llama-3.1-SuperNova-Lite
      value: 1.0
base_model: DoesntKnowAI/MentalNitrogenOxide-8B
dtype: bfloat16
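
# Reproducing the merge (a sketch, not the author's documented workflow): the
# config above slerp-merges the two source models over layer_range [0, 32],
# with DoesntKnowAI/MentalNitrogenOxide-8B as the base, in bfloat16. Assuming
# mergekit (https://github.com/arcee-ai/mergekit) is installed, its standard
# CLI invocation would be:
#
#   mergekit-yaml mergekit_config.yml ./Plasma-8B --cuda --lazy-unpickle
#
# The output directory name ./Plasma-8B is an assumption taken from this
# repository's name; any writable path works.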