# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the DARE TIES merge method, with cognitivecomputations/Dolphin3.0-Mistral-24B as the base model.
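DARE TIES combines two ideas: each merged model's difference from the base (its "task vector") is randomly sparsified to the configured `density`, with the surviving entries rescaled to compensate (DARE), and the sparsified deltas are then reconciled by a per-parameter sign vote so that opposing updates do not cancel out (TIES). The NumPy sketch below is a simplified illustration of that arithmetic on toy tensors, not mergekit's implementation; the array shapes and example values are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def dare_prune(delta: np.ndarray, density: float) -> np.ndarray:
    """Randomly keep a `density` fraction of the delta entries, rescaling survivors by 1/density."""
    keep = rng.random(delta.shape) < density
    return np.where(keep, delta / density, 0.0)

def dare_ties_merge(base, tuned_models, densities, weights):
    # DARE: sparsify each model's task vector (tuned - base), then apply its merge weight.
    deltas = [dare_prune(tuned - base, d) * w
              for tuned, d, w in zip(tuned_models, densities, weights)]
    stacked = np.stack(deltas)
    # TIES: elect a per-parameter sign by majority of the weighted deltas,
    # drop entries that disagree with it, and add what remains onto the base.
    elected_sign = np.sign(stacked.sum(axis=0))
    agreeing = np.where(np.sign(stacked) == elected_sign, stacked, 0.0)
    return base + agreeing.sum(axis=0)

# The configuration below lists the base model itself plus Napoleon_24B_V0.0,
# so the Dolphin entry contributes a zero task vector and only Napoleon's
# (pruned, rescaled) delta shifts the merged parameters.
base = rng.normal(size=8)                        # stand-in for a Dolphin3.0 parameter tensor
napoleon = base + rng.normal(scale=0.1, size=8)  # stand-in for the fine-tuned counterpart
merged = dare_ties_merge(base, [base, napoleon], [0.4, 0.4], [0.6, 0.4])
print(merged)
```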

### Models Merged

The following models were included in the merge:

* baconnier/Napoleon_24B_V0.0

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: cognitivecomputations/Dolphin3.0-Mistral-24B
    parameters:
      density: 0.4
      weight: 0.6
  - model: baconnier/Napoleon_24B_V0.0
    parameters:
      density: 0.4
      weight: 0.4

merge_method: dare_ties
base_model: cognitivecomputations/Dolphin3.0-Mistral-24B

parameters:
  int8_mask: true
  normalize: true

dtype: bfloat16  # top-level key; valid options: float16 | bfloat16 | float32

tokenizer_source: baconnier/Napoleon_24B_V0.0
```
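A merge like this can be reproduced by saving the YAML above to a file and running it through mergekit. The sketch below assumes mergekit's documented Python entry point (`MergeConfiguration`, `run_merge`, `MergeOptions`); the output path and option values are illustrative, and the `mergekit-yaml` command-line tool is an equivalent route.

```python
# Sketch: run the merge defined above with mergekit's Python API.
# Assumes mergekit is installed and config.yaml contains the YAML from the
# Configuration section; the output path and options are example values.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Napoleon_24B_V0.2",      # directory for the merged weights
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # run the merge on GPU when available
        copy_tokenizer=True,             # copy the tokenizer_source tokenizer into the output
        lazy_unpickle=True,              # reduce peak memory while loading shards
    ),
)
```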