---
base_model:
- unsloth/Llama-3.2-3B-Instruct
- unsloth/Llama-3.2-3B
library_name: transformers
tags:
- mergekit
- merge
---
# Details
This is an experimental merge I plan to use for future projects. It shows promising results in my limited testing, but further testing should be done; I just don't have the time or compute right now.
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: unsloth/Llama-3.2-3B
    parameters:
      weight: 0.5
      density: 0.7
  - model: unsloth/Llama-3.2-3B-Instruct
    parameters:
      weight: 0.5
      density: 0.6
merge_method: ties
base_model: unsloth/Llama-3.2-3B
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
tokenizer_source: unsloth/Llama-3.2-3B
```
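
To reproduce the merge, save the YAML above as `config.yml` and run `mergekit-yaml config.yml ./merged-model` with mergekit installed. Below is a minimal sketch of loading the resulting model with `transformers`; the repo id is a placeholder for wherever the merged weights are hosted:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder id; substitute this repo's actual path or a local merge output dir.
model_id = "your-username/Llama-3.2-3B-ties-merge"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Load in bfloat16 to match the merge's dtype.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

prompt = "Explain TIES merging in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```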