L3-Lunaris-Clouded-Vulca-8B

This is a merge of pre-trained language models created using mergekit.

Merge Details

Merge Method

This model was merged using the DARE TIES merge method, with Sao10K/L3-8B-Lunaris-v1 as the base. DARE TIES randomly drops and rescales each source model's delta weights (DARE), then resolves sign conflicts among the surviving deltas before adding them onto the base (TIES).

Models Merged

The following models were included in the merge:

* kromeurus/L3.1-Clouded-Uchtave-v0.1-8B
* kromeurus/L3.1-Aglow-Vulca-v0.1-8B

Configuration

The following YAML configuration was used to produce this model:

models:
  - model: Sao10K/L3-8B-Lunaris-v1
  - model: kromeurus/L3.1-Clouded-Uchtave-v0.1-8B
    parameters:
      density: [0.35, 0.45, 0.5, 0.55, 0.65, 0.55, 0.5, 0.45, 0.35]
      weight: [0.495, 0.165, 0.165, 0.495, 0.495, 0.165, 0.165, 0.495]
  - model: kromeurus/L3.1-Aglow-Vulca-v0.1-8B
    parameters:
      density: [0.65, 0.55, 0.5, 0.45, 0.35, 0.45, 0.5, 0.55, 0.65]
      weight: [0.165, 0.495, 0.495, 0.165, 0.165, 0.495, 0.495, 0.165]
merge_method: dare_ties
base_model: Sao10K/L3-8B-Lunaris-v1
parameters:
  normalize: false
  int8_mask: true
dtype: float32
out_dtype: bfloat16
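
The bracketed density and weight values are layer-wise gradients: mergekit interpolates each list across the model's transformer blocks, so the two kromeurus models alternate in how strongly they contribute at different depths. To reproduce or adjust the merge, the configuration above can be saved to a YAML file and passed back to mergekit. The sketch below uses mergekit's Python entry point; the function and option names are an assumption based on mergekit's documented usage and may differ between versions (the mergekit-yaml command-line tool is the more common route).

import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Sketch only: re-run the merge from the configuration shown above.
# MergeConfiguration / run_merge follow mergekit's documented Python usage,
# but verify them against your installed mergekit version.
with open("merge-config.yaml", "r") as f:  # the YAML above, saved to a file
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    merge_config,
    "./L3-Lunaris-Clouded-Vulca-8B",  # hypothetical output directory
    options=MergeOptions(cuda=True, copy_tokenizer=True),
)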
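The merged weights are published as bfloat16 safetensors (about 8.03B parameters) and are not currently served by any hosted inference provider, so the model is intended to be loaded locally. Below is a minimal sketch using Hugging Face transformers, assuming a recent transformers release and enough GPU or CPU memory for an 8B model:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DopeyGay/L3-Lunaris-Clouded-Vulca-8B"

# Load in bfloat16 to match the merge's out_dtype; device_map="auto"
# requires the accelerate package.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Write a short scene set on a storm-swept coast."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))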