Column schema (name · dtype · range or distinct values):
eval_name · stringlengths · 12 to 111
Precision · stringclasses · 3 values
Type · stringclasses · 7 values
T · stringclasses · 7 values
Weight type · stringclasses · 2 values
Architecture · stringclasses · 63 values
Model · stringlengths · 355 to 689
fullname · stringlengths · 4 to 102
Model sha · stringlengths · 0 to 40
Average ⬆️ · float64 · 0.74 to 52.1
Hub License · stringclasses · 27 values
Hub ❤️ · int64 · 0 to 6.04k
#Params (B) · float64 · -1 to 141
Available on the hub · bool · 2 classes
MoE · bool · 2 classes
Flagged · bool · 2 classes
Chat Template · bool · 2 classes
CO₂ cost (kg) · float64 · 0.04 to 187
IFEval Raw · float64 · 0 to 0.9
IFEval · float64 · 0 to 90
BBH Raw · float64 · 0.24 to 0.75
BBH · float64 · 0.25 to 64.1
MATH Lvl 5 Raw · float64 · 0 to 0.71
MATH Lvl 5 · float64 · 0 to 71.5
GPQA Raw · float64 · 0.21 to 0.47
GPQA · float64 · 0 to 29.4
MUSR Raw · float64 · 0.29 to 0.6
MUSR · float64 · 0 to 38.5
MMLU-PRO Raw · float64 · 0.1 to 0.73
MMLU-PRO · float64 · 0 to 70
Merged · bool · 2 classes
Official Providers · bool · 2 classes
Upload To Hub Date · stringclasses · 502 values
Submission Date · stringclasses · 241 values
Generation · int64 · 0 to 10
Base Model · stringlengths · 4 to 102
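The schema above describes the leaderboard table; the records that follow are flattened rows in the same column order (rows appear to omit empty cells, so some records are shorter). A minimal sketch of loading and filtering such a table with the Hugging Face `datasets` library follows; the dataset ID "open-llm-leaderboard/contents" and the "train" split are assumptions, not confirmed by this dump.

```python
# Sketch only: load the leaderboard table and rank models by their normalized average.
# Assumed (not confirmed by this dump): dataset ID "open-llm-leaderboard/contents"
# and a "train" split. Column names are taken from the schema above.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")
df = ds.to_pandas()

# Keep officially provided, non-flagged models and sort by the normalized average score.
official = df[df["Official Providers"] & ~df["Flagged"]]
top = official.sort_values("Average ⬆️", ascending=False)
print(top[["fullname", "Average ⬆️", "#Params (B)", "Hub License"]].head(10))
```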
BEE-spoke-data_tFINE-900m-e16-d32-instruct_2e_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
T5ForConditionalGeneration
https://huggingface.co/BEE-spoke-data/tFINE-900m-e16-d32-instruct_2e | details: https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__tFINE-900m-e16-d32-instruct_2e-details
BEE-spoke-data/tFINE-900m-e16-d32-instruct_2e
4c626138c9f4e0c3eafe74b2755eb89334c7ca59
5.908138
apache-2.0
0
0.887
true
false
false
false
5.033237
0.140286
14.028555
0.313457
5.01307
0.013595
1.359517
0.259228
1.230425
0.420698
11.18724
0.12367
2.630024
false
false
2024-09-17
2024-09-22
3
pszemraj/tFINE-900m-e16-d32-1024ctx
BEE-spoke-data_tFINE-900m-instruct-orpo_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
T5ForConditionalGeneration
https://huggingface.co/BEE-spoke-data/tFINE-900m-instruct-orpo | details: https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__tFINE-900m-instruct-orpo-details
BEE-spoke-data/tFINE-900m-instruct-orpo
e0a21c79bac74442252d36e2c01403afa3f0971b
3.696308
apache-2.0
0
0.887
true
false
false
true
5.149924
0.132992
13.299157
0.302209
3.267301
0.015861
1.586103
0.259228
1.230425
0.340854
1.106771
0.115193
1.688091
false
false
2024-09-22
2024-09-23
0
BEE-spoke-data/tFINE-900m-instruct-orpo
BSC-LT_salamandra-7b_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/BSC-LT/salamandra-7b | details: https://huggingface.co/datasets/open-llm-leaderboard/BSC-LT__salamandra-7b-details
BSC-LT/salamandra-7b
bf30739316ceac4b624583a27ec96dfc401179e8
5.704911
apache-2.0
26
7.768
true
false
false
false
0.378577
0.136738
13.67383
0.351661
10.157422
0.003776
0.377644
0.270134
2.684564
0.350094
1.861719
0.149269
5.474291
false
false
2024-09-30
2024-11-22
0
BSC-LT/salamandra-7b
BSC-LT_salamandra-7b-instruct_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/BSC-LT/salamandra-7b-instruct | details: https://huggingface.co/datasets/open-llm-leaderboard/BSC-LT__salamandra-7b-instruct-details
BSC-LT/salamandra-7b-instruct
77ddccbc7d9f9ffd55a8535365e8eebc493ccb8e
10.181244
apache-2.0
52
7.768
true
false
false
true
2.295008
0.245074
24.507418
0.385132
14.688129
0.008308
0.830816
0.264262
1.901566
0.413437
10.213021
0.180519
8.946513
false
false
2024-09-30
2024-11-22
1
BSC-LT/salamandra-7b-instruct (Merge)
Ba2han_Llama-Phi-3_DoRA_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/Ba2han/Llama-Phi-3_DoRA | details: https://huggingface.co/datasets/open-llm-leaderboard/Ba2han__Llama-Phi-3_DoRA-details
Ba2han/Llama-Phi-3_DoRA
36f99064a7be8ba475c2ee5c5424e95c263ccb87
25.469895
mit
6
3.821
true
false
false
true
1.066273
0.513053
51.305314
0.551456
37.249164
0.121601
12.160121
0.326342
10.178971
0.406927
9.532552
0.391539
32.393248
false
false
2024-05-15
2024-06-26
0
Ba2han/Llama-Phi-3_DoRA
BenevolenceMessiah_Qwen2.5-72B-2x-Instruct-TIES-v1.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
https://huggingface.co/BenevolenceMessiah/Qwen2.5-72B-2x-Instruct-TIES-v1.0 | details: https://huggingface.co/datasets/open-llm-leaderboard/BenevolenceMessiah__Qwen2.5-72B-2x-Instruct-TIES-v1.0-details
BenevolenceMessiah/Qwen2.5-72B-2x-Instruct-TIES-v1.0
459891ec78c9bbed2836a8bba706e1707db10231
42.26732
0
72.7
false
false
false
true
34.701784
0.54735
54.734992
0.727311
61.911495
0.57855
57.854985
0.36745
15.659955
0.420667
12.016667
0.562832
51.425827
false
false
2024-11-11
2024-11-24
1
BenevolenceMessiah/Qwen2.5-72B-2x-Instruct-TIES-v1.0 (Merge)
BenevolenceMessiah_Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
https://huggingface.co/BenevolenceMessiah/Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0 | details: https://huggingface.co/datasets/open-llm-leaderboard/BenevolenceMessiah__Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0-details
BenevolenceMessiah/Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0
d90f6e36584dc9b367461701e83c833bdeb736f2
15.071092
apache-2.0
0
28.309
true
true
false
false
6.669594
0.301153
30.115316
0.490867
26.877991
0.041541
4.154079
0.262584
1.677852
0.407979
8.930729
0.268035
18.670582
true
false
2024-09-21
2024-09-22
1
BenevolenceMessiah/Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0 (Merge)
BlackBeenie_Bloslain-8B-v0.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/BlackBeenie/Bloslain-8B-v0.2 | details: https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Bloslain-8B-v0.2-details
BlackBeenie/Bloslain-8B-v0.2
ebcb7f9f30bc172523a827d1ddefeb52b1aba494
23.803914
1
8.03
false
false
false
false
1.383526
0.502337
50.233713
0.511088
30.662902
0.145015
14.501511
0.306208
7.494407
0.407573
10.446615
0.365359
29.484338
false
false
2024-11-19
2024-11-19
1
BlackBeenie/Bloslain-8B-v0.2 (Merge)
BlackBeenie_Llama-3.1-8B-OpenO1-SFT-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1 | details: https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Llama-3.1-8B-OpenO1-SFT-v0.1-details
BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1
35e7781b9dff5aea29576709201d641e5f44440d
21.40041
apache-2.0
1
8.03
true
false
false
true
1.462856
0.512404
51.240376
0.478745
26.03429
0.152568
15.256798
0.268456
2.46085
0.361813
5.726563
0.349152
27.683585
false
false
2024-12-28
2024-12-29
1
BlackBeenie/Llama-3.1-8B-OpenO1-SFT-v0.1 (Merge)
BlackBeenie_Llama-3.1-8B-pythonic-passthrough-merge_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge | details: https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Llama-3.1-8B-pythonic-passthrough-merge-details
BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge
3ec46616f5b34821b3b928938931295f92e49213
7.399579
0
20.245
false
false
false
false
7.166581
0.231586
23.158553
0.345385
9.359905
0.011329
1.132931
0.268456
2.46085
0.377812
4.593229
0.133228
3.692007
false
false
2024-11-06
2024-11-06
1
BlackBeenie/Llama-3.1-8B-pythonic-passthrough-merge (Merge)
BlackBeenie_Neos-Gemma-2-9b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
https://huggingface.co/BlackBeenie/Neos-Gemma-2-9b | details: https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Neos-Gemma-2-9b-details
BlackBeenie/Neos-Gemma-2-9b
56dbbb4f972be887e5b57311a8a32e148e98d154
25.475663
apache-2.0
1
9.242
true
false
false
true
5.358184
0.587567
58.756655
0.550298
35.638851
0.098187
9.818731
0.322987
9.731544
0.36175
5.785417
0.398105
33.122784
false
false
2024-11-11
2024-11-11
1
BlackBeenie/Neos-Gemma-2-9b (Merge)
BlackBeenie_Neos-Llama-3.1-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/BlackBeenie/Neos-Llama-3.1-8B | details: https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Neos-Llama-3.1-8B-details
BlackBeenie/Neos-Llama-3.1-8B
9b48520ec1a777be0f1fd88f95454d85ac568407
19.512177
apache-2.0
1
8.03
true
false
false
true
1.587734
0.494394
49.439376
0.4425
21.080123
0.132175
13.217523
0.268456
2.46085
0.37499
5.740365
0.326213
25.134826
false
false
2024-11-12
2024-11-12
1
BlackBeenie/Neos-Llama-3.1-8B (Merge)
BlackBeenie_Neos-Llama-3.1-base_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/BlackBeenie/Neos-Llama-3.1-base | details: https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Neos-Llama-3.1-base-details
BlackBeenie/Neos-Llama-3.1-base
d4af4d73ba5fea0275fd1e3ba5102a79ac8009db
3.968795
0
4.65
false
false
false
true
2.818569
0.175082
17.508212
0.293034
2.221447
0
0
0.237416
0
0.349906
2.838281
0.111203
1.244829
false
false
2024-11-11
2024-11-11
0
BlackBeenie/Neos-Llama-3.1-base
BlackBeenie_Neos-Phi-3-14B-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
https://huggingface.co/BlackBeenie/Neos-Phi-3-14B-v0.1 | details: https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__Neos-Phi-3-14B-v0.1-details
BlackBeenie/Neos-Phi-3-14B-v0.1
0afb7cc74a94f11f2695dc92788cdc6e28325f9c
27.032307
apache-2.0
0
13.96
true
false
false
true
1.819252
0.402245
40.224493
0.621193
46.631387
0.178248
17.824773
0.305369
7.38255
0.412542
10.534375
0.456366
39.596262
false
false
2024-11-27
2024-11-27
1
BlackBeenie/Neos-Phi-3-14B-v0.1 (Merge)
BlackBeenie_llama-3-luminous-merged_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/BlackBeenie/llama-3-luminous-merged | details: https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__llama-3-luminous-merged-details
BlackBeenie/llama-3-luminous-merged
64288dd8e3305f2dc11d84fe0c653f351b2e8a9d
21.618577
0
8.03
false
false
false
false
1.527707
0.432345
43.234507
0.515392
30.643687
0.086858
8.685801
0.292785
5.704698
0.414896
10.628646
0.377327
30.814125
false
false
2024-09-15
2024-10-11
1
BlackBeenie/llama-3-luminous-merged (Merge)
BlackBeenie_llama-3.1-8B-Galore-openassistant-guanaco_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco | details: https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__llama-3.1-8B-Galore-openassistant-guanaco-details
BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco
828fa03c10e9085700b7abbe26f95067fab010fd
18.374215
1
8.03
false
false
false
false
1.71364
0.263484
26.348422
0.521337
31.444705
0.066465
6.646526
0.300336
6.711409
0.440625
14.578125
0.320645
24.516105
false
false
2024-10-16
2024-10-19
0
BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco
Bllossom_llama-3.2-Korean-Bllossom-AICA-5B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MllamaForConditionalGeneration
https://huggingface.co/Bllossom/llama-3.2-Korean-Bllossom-AICA-5B | details: https://huggingface.co/datasets/open-llm-leaderboard/Bllossom__llama-3.2-Korean-Bllossom-AICA-5B-details
Bllossom/llama-3.2-Korean-Bllossom-AICA-5B
4672b7de38c2cc390b146d6b6ce7a6dd295d8a0e
19.012852
llama3.2
61
5.199
true
false
false
true
1.220236
0.51725
51.724979
0.429307
18.650223
0.123867
12.386707
0.298658
6.487696
0.383396
5.824479
0.271027
19.003029
false
false
2024-12-12
2024-12-16
0
Bllossom/llama-3.2-Korean-Bllossom-AICA-5B
BoltMonkey_DreadMix_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/BoltMonkey/DreadMix | details: https://huggingface.co/datasets/open-llm-leaderboard/BoltMonkey__DreadMix-details
BoltMonkey/DreadMix
ab5dbaaff606538db73b6fd89aa169760104a566
28.761732
0
8.03
false
false
false
true
2.419428
0.709491
70.949082
0.54351
34.845015
0.155589
15.558912
0.299497
6.599553
0.421219
13.61901
0.378989
30.998818
false
false
2024-10-12
2024-10-13
1
BoltMonkey/DreadMix (Merge)
BoltMonkey_NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated | details: https://huggingface.co/datasets/open-llm-leaderboard/BoltMonkey__NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated-details
BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated
969e4c9b41e733a367f5ea18ed50a6171b5e2357
27.776634
llama3.1
2
8.03
true
false
false
true
2.486203
0.799891
79.989096
0.515199
30.7599
0.119335
11.933535
0.28104
4.138702
0.401875
9.467708
0.373338
30.370863
true
false
2024-10-01
2024-10-10
1
BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated (Merge)
BoltMonkey_NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated | details: https://huggingface.co/datasets/open-llm-leaderboard/BoltMonkey__NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated-details
BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated
969e4c9b41e733a367f5ea18ed50a6171b5e2357
21.345511
llama3.1
2
8.03
true
false
false
false
0.774319
0.459023
45.902317
0.518544
30.793785
0.093656
9.365559
0.274329
3.243848
0.40826
9.532552
0.363115
29.235003
true
false
2024-10-01
2024-10-01
1
BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated (Merge)
BoltMonkey_SuperNeuralDreadDevil-8b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/BoltMonkey/SuperNeuralDreadDevil-8b | details: https://huggingface.co/datasets/open-llm-leaderboard/BoltMonkey__SuperNeuralDreadDevil-8b-details
BoltMonkey/SuperNeuralDreadDevil-8b
804d5864127e603abec179a159b43f446246fafc
27.111055
1
8.03
false
false
false
true
3.281416
0.77099
77.098986
0.52862
32.612158
0.0929
9.29003
0.291946
5.592841
0.397687
8.310938
0.367852
29.761377
false
false
2024-10-13
2024-10-13
1
BoltMonkey/SuperNeuralDreadDevil-8b (Merge)
BrainWave-ML_llama3.2-3B-maths-orpo_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/BrainWave-ML/llama3.2-3B-maths-orpo | details: https://huggingface.co/datasets/open-llm-leaderboard/BrainWave-ML__llama3.2-3B-maths-orpo-details
BrainWave-ML/llama3.2-3B-maths-orpo
d149d83d8e8f3883421d800848fec85766181923
5.076083
apache-2.0
2
3
true
false
false
false
1.414438
0.204907
20.490742
0.291178
2.347041
0
0
0.259228
1.230425
0.357531
4.52474
0.116772
1.863549
false
false
2024-10-24
2024-10-24
2
meta-llama/Llama-3.2-3B-Instruct
BramVanroy_GEITje-7B-ultra_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/BramVanroy/GEITje-7B-ultra | details: https://huggingface.co/datasets/open-llm-leaderboard/BramVanroy__GEITje-7B-ultra-details
BramVanroy/GEITje-7B-ultra
d4552cdc6f015754646464d8411aa4f6bcdba8e8
11.022899
cc-by-nc-4.0
41
7.242
true
false
false
true
1.239046
0.372344
37.234427
0.377616
12.879913
0.015861
1.586103
0.262584
1.677852
0.328979
1.522396
0.20113
11.236702
false
false
2024-01-27
2024-10-28
3
mistralai/Mistral-7B-v0.1
BramVanroy_fietje-2_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
PhiForCausalLM
https://huggingface.co/BramVanroy/fietje-2 | details: https://huggingface.co/datasets/open-llm-leaderboard/BramVanroy__fietje-2-details
BramVanroy/fietje-2
3abe75d01094b713368e3d911ffb78a2d66ead22
9.1403
mit
9
2.78
true
false
false
false
0.625077
0.209803
20.980332
0.403567
15.603676
0.015861
1.586103
0.254195
0.559284
0.369563
5.161979
0.198554
10.950428
false
false
2024-04-09
2024-10-28
1
microsoft/phi-2
BramVanroy_fietje-2-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
PhiForCausalLM
https://huggingface.co/BramVanroy/fietje-2-chat | details: https://huggingface.co/datasets/open-llm-leaderboard/BramVanroy__fietje-2-chat-details
BramVanroy/fietje-2-chat
364e785d90438b787b94e33741a930c9932353c0
10.615455
mit
5
2.775
true
false
false
true
0.798065
0.291736
29.173593
0.414975
17.718966
0.018882
1.888218
0.239933
0
0.35276
3.195052
0.205452
11.716903
false
false
2024-04-29
2024-10-28
3
microsoft/phi-2
BramVanroy_fietje-2-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
PhiForCausalLM
https://huggingface.co/BramVanroy/fietje-2-instruct | details: https://huggingface.co/datasets/open-llm-leaderboard/BramVanroy__fietje-2-instruct-details
BramVanroy/fietje-2-instruct
b7b44797cd52eda1182667217e8371dbdfee4976
10.485718
mit
3
2.775
true
false
false
true
0.64879
0.278996
27.89964
0.413607
17.57248
0.022659
2.265861
0.233221
0
0.336917
2.914583
0.210356
12.261746
false
false
2024-04-27
2024-10-28
2
microsoft/phi-2
CarrotAI_Llama-3.2-Rabbit-Ko-3B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct | details: https://huggingface.co/datasets/open-llm-leaderboard/CarrotAI__Llama-3.2-Rabbit-Ko-3B-Instruct-details
CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct
5be46c768d800447b82de41fdc9df2f8c43ba3c0
23.509451
llama3.2
8
3.213
true
false
false
true
1.135907
0.719882
71.988213
0.442672
21.49731
0.205438
20.543807
0.270973
2.796421
0.364917
3.98125
0.282247
20.249704
false
false
2024-09-30
2024-12-20
1
CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct (Merge)
CarrotAI_Llama-3.2-Rabbit-Ko-3B-Instruct-2412_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct-2412 | details: https://huggingface.co/datasets/open-llm-leaderboard/CarrotAI__Llama-3.2-Rabbit-Ko-3B-Instruct-2412-details
CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct-2412
ac6f1c0b756412163e17cb05d9e2f7ced274dc12
20.301755
llama3.2
2
3.213
true
false
false
false
1.287179
0.478182
47.818233
0.435772
20.17568
0.175982
17.598187
0.292785
5.704698
0.387208
6.801042
0.313414
23.712692
false
false
2024-12-03
2024-12-19
1
CarrotAI/Llama-3.2-Rabbit-Ko-3B-Instruct-2412 (Merge)
Casual-Autopsy_L3-Umbral-Mind-RP-v2.0-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B | details: https://huggingface.co/datasets/open-llm-leaderboard/Casual-Autopsy__L3-Umbral-Mind-RP-v2.0-8B-details
Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B
b46c066ea8387264858dc3461f382e7b42fd9c48
25.899339
llama3
15
8.03
true
false
false
true
1.97677
0.712263
71.226346
0.526241
32.486278
0.109517
10.951662
0.286913
4.9217
0.368667
5.55
0.37234
30.260047
true
false
2024-06-26
2024-07-02
1
Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B (Merge)
CausalLM_14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/CausalLM/14B | details: https://huggingface.co/datasets/open-llm-leaderboard/CausalLM__14B-details
CausalLM/14B
cc054cf5953252d0709cb3267d1a85246e489e95
17.23558
wtfpl
302
14
true
false
false
false
1.992829
0.278821
27.882131
0.470046
24.780943
0.075529
7.55287
0.302852
7.04698
0.415479
11.468229
0.322141
24.682329
false
true
2023-10-22
2024-06-12
0
CausalLM/14B
CausalLM_34b-beta_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/CausalLM/34b-beta | details: https://huggingface.co/datasets/open-llm-leaderboard/CausalLM__34b-beta-details
CausalLM/34b-beta
0429951eb30ccdfff3515e711aaa7649a8a7364c
23.297833
gpl-3.0
63
34.389
true
false
false
false
5.853193
0.304325
30.432475
0.5591
36.677226
0.048338
4.833837
0.346477
12.863535
0.374865
6.92474
0.532497
48.055186
false
true
2024-02-06
2024-06-26
0
CausalLM/34b-beta
CausalLM_preview-1-hf_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GlmForCausalLM
https://huggingface.co/CausalLM/preview-1-hf | details: https://huggingface.co/datasets/open-llm-leaderboard/CausalLM__preview-1-hf-details
CausalLM/preview-1-hf
08e1e1ab428a591e74d849ff30bd8766474205bf
16.706753
0
9.543
true
false
false
true
2.557498
0.555893
55.589281
0.361457
10.100941
0.030211
3.021148
0.261745
1.565996
0.342188
1.106771
0.359707
28.856383
false
true
2025-01-26
0
Removed
Changgil_K2S3-14b-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/Changgil/K2S3-14b-v0.2 | details: https://huggingface.co/datasets/open-llm-leaderboard/Changgil__K2S3-14b-v0.2-details
Changgil/K2S3-14b-v0.2
b4f0e1eed2640df2b75847ff37e6ebb1be217b6c
15.275785
cc-by-nc-4.0
0
14.352
true
false
false
false
3.249261
0.324284
32.428401
0.461331
24.283947
0.057402
5.740181
0.28104
4.138702
0.39226
6.799219
0.264378
18.264258
false
false
2024-06-17
2024-06-27
0
Changgil/K2S3-14b-v0.2
Changgil_K2S3-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/Changgil/K2S3-v0.1 | details: https://huggingface.co/datasets/open-llm-leaderboard/Changgil__K2S3-v0.1-details
Changgil/K2S3-v0.1
d544e389f091983bb4f11314edb526d81753c919
14.839284
cc-by-nc-4.0
0
14.352
true
false
false
false
2.499765
0.327656
32.765617
0.465549
24.559558
0.046073
4.607251
0.264262
1.901566
0.401406
7.842448
0.256233
17.359264
false
false
2024-04-29
2024-06-27
0
Changgil/K2S3-v0.1
ClaudioItaly_Albacus_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/ClaudioItaly/Albacus | details: https://huggingface.co/datasets/open-llm-leaderboard/ClaudioItaly__Albacus-details
ClaudioItaly/Albacus
a53faf62d0f99b67478ed9d262872c821a3ba83c
20.505574
mit
1
8.987
true
false
false
false
1.507878
0.466742
46.674158
0.511304
31.638865
0.070997
7.099698
0.271812
2.908277
0.413531
10.658073
0.316489
24.054374
true
false
2024-09-08
2024-09-08
1
ClaudioItaly/Albacus (Merge)
ClaudioItaly_Book-Gut12B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/ClaudioItaly/Book-Gut12B | details: https://huggingface.co/datasets/open-llm-leaderboard/ClaudioItaly__Book-Gut12B-details
ClaudioItaly/Book-Gut12B
ae54351faca8170c93bf1de3a51bf16650f5bcf5
23.394098
mit
1
12.248
true
false
false
false
2.904496
0.399847
39.984685
0.541737
34.632193
0.101964
10.196375
0.307047
7.606264
0.463542
18.276042
0.367021
29.669031
true
false
2024-09-12
2024-09-17
1
ClaudioItaly/Book-Gut12B (Merge)
ClaudioItaly_Evolutionstory-7B-v2.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/ClaudioItaly/Evolutionstory-7B-v2.2 | details: https://huggingface.co/datasets/open-llm-leaderboard/ClaudioItaly__Evolutionstory-7B-v2.2-details
ClaudioItaly/Evolutionstory-7B-v2.2
9f838721d24a5195bed59a5ed8d9af536f7f2459
20.810835
mit
2
7.242
true
false
false
false
1.120464
0.481379
48.137941
0.510804
31.623865
0.070997
7.099698
0.275168
3.355705
0.413531
10.658073
0.315908
23.989731
true
false
2024-08-30
2024-09-01
1
ClaudioItaly/Evolutionstory-7B-v2.2 (Merge)
ClaudioItaly_intelligence-cod-rag-7b-v3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/ClaudioItaly/intelligence-cod-rag-7b-v3 | details: https://huggingface.co/datasets/open-llm-leaderboard/ClaudioItaly__intelligence-cod-rag-7b-v3-details
ClaudioItaly/intelligence-cod-rag-7b-v3
2b21473c8a086f8d0c54b82c3454bf5499cdde3a
31.836966
mit
0
7.616
true
false
false
true
1.320945
0.689782
68.9782
0.536634
34.776159
0.380665
38.066465
0.272651
3.020134
0.415271
10.675521
0.419548
35.505319
true
false
2024-11-29
2024-12-02
1
ClaudioItaly/intelligence-cod-rag-7b-v3 (Merge)
CohereForAI_aya-23-35B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
https://huggingface.co/CohereForAI/aya-23-35B | details: https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-23-35B-details
CohereForAI/aya-23-35B
31d6fd858f20539a55401c7ad913086f54d9ca2c
24.755408
cc-by-nc-4.0
271
34.981
true
false
false
true
33.970634
0.646193
64.619321
0.539955
34.85836
0.034743
3.47432
0.294463
5.928412
0.43099
13.473698
0.335605
26.178339
false
true
2024-05-19
2024-06-12
0
CohereForAI/aya-23-35B
CohereForAI_aya-23-8B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
https://huggingface.co/CohereForAI/aya-23-8B | details: https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-23-8B-details
CohereForAI/aya-23-8B
ec151d218a24031eb039d92fb83d10445427efc9
16.010983
cc-by-nc-4.0
405
8.028
true
false
false
true
2.390344
0.469889
46.988878
0.429616
20.203761
0.016616
1.661631
0.284396
4.58613
0.394063
8.424479
0.227809
14.20102
false
true
2024-05-19
2024-06-12
0
CohereForAI/aya-23-8B
CohereForAI_aya-expanse-32b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
https://huggingface.co/CohereForAI/aya-expanse-32b | details: https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-expanse-32b-details
CohereForAI/aya-expanse-32b
08b69cfa4240e2009c80ad304f000b491d1b8c38
29.71851
cc-by-nc-4.0
216
32.296
true
false
false
true
11.03547
0.730174
73.017372
0.564867
38.709611
0.153323
15.332326
0.325503
10.067114
0.387271
6.408854
0.412982
34.775783
false
true
2024-10-23
2024-10-24
0
CohereForAI/aya-expanse-32b
CohereForAI_aya-expanse-8b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
https://huggingface.co/CohereForAI/aya-expanse-8b | details: https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-expanse-8b-details
CohereForAI/aya-expanse-8b
b9848575c8731981dfcf2e1f3bfbcb917a2e585d
22.406574
cc-by-nc-4.0
335
8.028
true
false
false
true
2.339378
0.635852
63.585176
0.49772
28.523483
0.086103
8.610272
0.302852
7.04698
0.372885
4.410677
0.300366
22.262855
false
true
2024-10-23
2024-10-24
0
CohereForAI/aya-expanse-8b
CohereForAI_c4ai-command-r-plus_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
https://huggingface.co/CohereForAI/c4ai-command-r-plus | details: https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-plus-details
CohereForAI/c4ai-command-r-plus
fa1bd7fb1572ceb861bbbbecfa8af83b29fa8cca
30.936071
cc-by-nc-4.0
1,709
103.811
true
false
false
true
57.263063
0.766419
76.641866
0.581542
39.919954
0.08006
8.006042
0.305369
7.38255
0.480719
20.423177
0.399186
33.242834
false
true
2024-04-03
2024-06-13
0
CohereForAI/c4ai-command-r-plus
CohereForAI_c4ai-command-r-plus-08-2024_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
https://huggingface.co/CohereForAI/c4ai-command-r-plus-08-2024 | details: https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-plus-08-2024-details
CohereForAI/c4ai-command-r-plus-08-2024
2d8cf3ab0af78b9e43546486b096f86adf3ba4d0
33.647475
cc-by-nc-4.0
236
103.811
true
false
false
true
44.637753
0.753954
75.395395
0.5996
42.836865
0.123867
12.386707
0.350671
13.422819
0.482948
19.835156
0.442071
38.007905
false
true
2024-08-21
2024-09-19
0
CohereForAI/c4ai-command-r-plus-08-2024
CohereForAI_c4ai-command-r-v01_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
https://huggingface.co/CohereForAI/c4ai-command-r-v01 | details: https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-v01-details
CohereForAI/c4ai-command-r-v01
16881ccde1c68bbc7041280e6a66637bc46bfe88
25.929032
cc-by-nc-4.0
1,075
34.981
true
false
false
true
26.790875
0.674819
67.481948
0.540642
34.556659
0.034743
3.47432
0.307047
7.606264
0.451698
16.128906
0.336935
26.326093
false
true
2024-03-11
2024-06-13
0
CohereForAI/c4ai-command-r-v01
CohereForAI_c4ai-command-r7b-12-2024_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Cohere2ForCausalLM
https://huggingface.co/CohereForAI/c4ai-command-r7b-12-2024 | details: https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r7b-12-2024-details
CohereForAI/c4ai-command-r7b-12-2024
a9650f3bda8b0e00825ee36592e086b4ee621102
31.617529
cc-by-nc-4.0
360
8.028
true
false
false
true
4.909614
0.771315
77.131456
0.550264
36.024564
0.299094
29.909366
0.308725
7.829978
0.41251
10.230469
0.357214
28.579344
false
true
2024-12-11
2024-12-20
0
CohereForAI/c4ai-command-r7b-12-2024
Columbia-NLP_LION-Gemma-2b-dpo-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
https://huggingface.co/Columbia-NLP/LION-Gemma-2b-dpo-v1.0 | details: https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-dpo-v1.0-details
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
a5f780075831374f8850324448acf94976dea504
11.483995
0
2.506
false
false
false
true
0.979648
0.327831
32.783127
0.391996
14.585976
0.043051
4.305136
0.249161
0
0.41201
9.834635
0.166556
7.395095
false
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
Columbia-NLP_LION-Gemma-2b-dpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
https://huggingface.co/Columbia-NLP/LION-Gemma-2b-dpo-v1.0 | details: https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-dpo-v1.0-details
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
a5f780075831374f8850324448acf94976dea504
11.262093
0
2.506
false
false
false
true
1.989138
0.310246
31.02457
0.388103
14.243046
0.053625
5.362538
0.253356
0.447427
0.408073
9.109115
0.166473
7.38586
false
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
Columbia-NLP_LION-Gemma-2b-odpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
https://huggingface.co/Columbia-NLP/LION-Gemma-2b-odpo-v1.0 | details: https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-odpo-v1.0-details
Columbia-NLP/LION-Gemma-2b-odpo-v1.0
090d9f59c3b47ab8dd099ddd278c058aa6d2d529
11.897379
4
2.506
false
false
false
true
1.924136
0.306649
30.664858
0.389584
14.023922
0.069486
6.94864
0.24245
0
0.427917
12.05625
0.169215
7.690603
false
false
2024-06-28
2024-07-13
0
Columbia-NLP/LION-Gemma-2b-odpo-v1.0
Columbia-NLP_LION-Gemma-2b-sft-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
https://huggingface.co/Columbia-NLP/LION-Gemma-2b-sft-v1.0 | details: https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-sft-v1.0-details
Columbia-NLP/LION-Gemma-2b-sft-v1.0
44d6f26fa7e3b0d238064d844569bf8a07b7515e
12.60325
0
2.506
false
false
false
true
1.921618
0.369247
36.924693
0.387878
14.117171
0.067976
6.797583
0.255872
0.782998
0.40274
8.309115
0.178191
8.687943
false
false
2024-07-02
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-sft-v1.0
Columbia-NLP_LION-LLaMA-3-8b-dpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0 | details: https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-LLaMA-3-8b-dpo-v1.0-details
Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0
3cddd4a6f5939a0a4db1092a0275342b7b9912f3
21.785404
2
8.03
false
false
false
true
1.393698
0.495742
49.574241
0.502848
30.356399
0.117069
11.706949
0.28104
4.138702
0.409719
10.28151
0.321892
24.654625
false
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0
Columbia-NLP_LION-LLaMA-3-8b-odpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0 | details: https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-LLaMA-3-8b-odpo-v1.0-details
Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0
e2cec0d68a67092951e9205dfe634a59f2f4a2dd
19.853208
2
8.03
false
false
false
true
1.437394
0.396799
39.679938
0.502393
30.457173
0.106495
10.649547
0.285235
4.697987
0.40575
9.71875
0.315243
23.915854
false
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0
Columbia-NLP_LION-LLaMA-3-8b-sft-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0 | details: https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-LLaMA-3-8b-sft-v1.0-details
Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0
822eddb2fd127178d9fb7bb9f4fca0e93ada2836
20.748862
0
8.03
false
false
false
true
1.507226
0.381712
38.171164
0.508777
30.88426
0.114048
11.404834
0.277685
3.691275
0.450271
15.483854
0.32372
24.857787
false
false
2024-07-02
2024-07-04
0
Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0
CombinHorizon_Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
https://huggingface.co/CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES | details: https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES-details
CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES
881729709fbf263b75e0f7341b66b5a880b82d11
41.765081
apache-2.0
2
14.77
true
false
false
true
3.330704
0.823996
82.399589
0.637009
48.19595
0.531722
53.172205
0.324664
9.955257
0.426031
12.653906
0.497922
44.213579
true
false
2024-12-07
2024-12-07
1
CombinHorizon/Josiefied-abliteratedV4-Qwen2.5-14B-Inst-BaseMerge-TIES (Merge)
CombinHorizon_Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
https://huggingface.co/CombinHorizon/Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES | details: https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES-details
CombinHorizon/Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES
52d6f6308eba9c3a0b9116706fbb1ddc448e6101
35.36673
apache-2.0
1
7.616
true
false
false
true
2.091122
0.756402
75.64019
0.540209
34.95407
0.493202
49.320242
0.297819
6.375839
0.403302
8.779427
0.434176
37.130615
true
false
2024-10-29
2024-10-29
1
CombinHorizon/Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES (Merge)
CombinHorizon_YiSM-blossom5.1-34B-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/CombinHorizon/YiSM-blossom5.1-34B-SLERP | details: https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__YiSM-blossom5.1-34B-SLERP-details
CombinHorizon/YiSM-blossom5.1-34B-SLERP
ebd8d6507623008567a0548cd0ff9e28cbd6a656
31.37993
apache-2.0
0
34.389
true
false
false
true
6.141628
0.503311
50.331121
0.620755
46.397613
0.215257
21.52568
0.355705
14.09396
0.441344
14.367969
0.474069
41.563239
true
false
2024-08-27
2024-08-27
1
CombinHorizon/YiSM-blossom5.1-34B-SLERP (Merge)
CombinHorizon_huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
https://huggingface.co/CombinHorizon/huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES | details: https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES-details
CombinHorizon/huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES
3284c32f13733d1cd17c723ed754f2c01b65a15c
45.657847
apache-2.0
1
32.764
true
false
false
true
26.000843
0.820624
82.062372
0.692925
56.044782
0.594411
59.441088
0.338926
11.856823
0.420729
12.091146
0.572058
52.450872
true
false
2024-12-07
2024-12-07
1
CombinHorizon/huihui-ai-abliterated-Qwen2.5-32B-Inst-BaseMerge-TIES (Merge)
CombinHorizon_huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
https://huggingface.co/CombinHorizon/huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES | details: https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES-details
CombinHorizon/huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES
d92237b4b4deccb92a72b5209c79978f09fe3f08
41.466211
apache-2.0
2
14.77
true
false
false
true
3.334259
0.817576
81.757625
0.633589
47.767346
0.547583
54.758308
0.314597
8.612975
0.426031
12.453906
0.491024
43.447104
true
false
2024-12-07
2024-12-07
1
CombinHorizon/huihui-ai-abliteratedV2-Qwen2.5-14B-Inst-BaseMerge-TIES (Merge)
CombinHorizon_zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
https://huggingface.co/CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES | details: https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES-details
CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES
d976a5d6768d54c5e59a88fe63238a055c30c06a
46.763621
apache-2.0
10
32.764
true
false
false
true
7.366635
0.832814
83.28136
0.695517
56.827407
0.585347
58.534743
0.36745
15.659955
0.431396
14.224479
0.568484
52.053783
true
false
2024-12-07
2024-12-20
1
CombinHorizon/zetasepic-abliteratedV2-Qwen2.5-32B-Inst-BaseMerge-TIES (Merge)
ContactDoctor_Bio-Medical-3B-CoT-012025_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/ContactDoctor/Bio-Medical-3B-CoT-012025 | details: https://huggingface.co/datasets/open-llm-leaderboard/ContactDoctor__Bio-Medical-3B-CoT-012025-details
ContactDoctor/Bio-Medical-3B-CoT-012025
37e0ac4b64a82964af3b33324629324cbcbf7cda
18.730711
other
10
3.085
true
false
false
false
1.599689
0.360379
36.037935
0.438315
22.263528
0.221299
22.129909
0.30453
7.270694
0.33676
3.195052
0.293384
21.487145
false
false
2025-01-06
2025-01-15
2
Qwen/Qwen2.5-3B
ContactDoctor_Bio-Medical-Llama-3-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ContactDoctor/Bio-Medical-Llama-3-8B | details: https://huggingface.co/datasets/open-llm-leaderboard/ContactDoctor__Bio-Medical-Llama-3-8B-details
ContactDoctor/Bio-Medical-Llama-3-8B
5436cda92c65b0ef520d278d864305c0f429824b
19.917453
other
62
4.015
true
false
false
false
1.235117
0.442237
44.22366
0.486312
26.195811
0.067221
6.722054
0.333893
11.185682
0.351396
1.757812
0.364777
29.419696
false
false
2024-08-09
2024-12-24
1
meta-llama/Meta-Llama-3-8B-Instruct
CoolSpring_Qwen2-0.5B-Abyme_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme | details: https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-details
CoolSpring/Qwen2-0.5B-Abyme
a48b7c04b854e5c60fe3464f96904bfc53c8640c
4.999994
apache-2.0
0
0.494
true
false
false
true
2.355595
0.191519
19.15185
0.286183
2.276484
0.029456
2.945619
0.253356
0.447427
0.354219
1.477344
0.133311
3.701241
false
false
2024-07-18
2024-09-04
1
Qwen/Qwen2-0.5B
CoolSpring_Qwen2-0.5B-Abyme-merge2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme-merge2 | details: https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-merge2-details
CoolSpring/Qwen2-0.5B-Abyme-merge2
02c4c601453f7ecbfab5c95bf5afa889350026ba
6.320258
apache-2.0
0
0.63
true
false
false
true
1.219391
0.202185
20.218465
0.299427
3.709041
0.033233
3.323263
0.260067
1.342282
0.368729
3.891146
0.148936
5.437352
true
false
2024-07-27
2024-07-27
1
CoolSpring/Qwen2-0.5B-Abyme-merge2 (Merge)
CoolSpring_Qwen2-0.5B-Abyme-merge3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme-merge3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CoolSpring/Qwen2-0.5B-Abyme-merge3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-merge3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CoolSpring/Qwen2-0.5B-Abyme-merge3
86fed893893cc2a6240f0ea09ce2eeda1a5178cc
6.820196
apache-2.0
0
0.63
true
false
false
true
1.220343
0.238605
23.860468
0.300314
4.301149
0.031722
3.172205
0.264262
1.901566
0.350094
2.128385
0.150017
5.557402
true
false
2024-07-27
2024-07-27
1
CoolSpring/Qwen2-0.5B-Abyme-merge3 (Merge)
Corianas_Neural-Mistral-7B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Corianas/Neural-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Corianas/Neural-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Corianas__Neural-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Corianas/Neural-Mistral-7B
cde6f0126310f38b6781cc26cdb9a02416b896b9
18.200439
apache-2.0
0
7.242
true
false
false
true
0.923427
0.548924
54.892352
0.442802
22.431163
0.018882
1.888218
0.283557
4.474273
0.387271
6.208854
0.27377
19.307772
false
false
2024-03-05
2024-12-06
0
Corianas/Neural-Mistral-7B
Corianas_Quokka_2.7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/Corianas/Quokka_2.7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Corianas/Quokka_2.7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Corianas__Quokka_2.7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Corianas/Quokka_2.7b
d9b3274662c2ac6c6058daac90504b5a8ebcac3c
4.99525
apache-2.0
0
2.786
true
false
false
false
0.587383
0.174907
17.490702
0.305547
3.165268
0.008308
0.830816
0.255872
0.782998
0.390833
6.0875
0.114528
1.614214
false
false
2023-03-30
2024-12-05
0
Corianas/Quokka_2.7b
Corianas_llama-3-reactor_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Corianas/llama-3-reactor" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Corianas/llama-3-reactor</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Corianas__llama-3-reactor-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Corianas/llama-3-reactor
bef2eac42fd89baa0064badbc9c7958ad9ccbed3
13.99547
apache-2.0
0
-1
true
false
false
false
1.64233
0.230012
23.001192
0.445715
21.88856
0.046828
4.682779
0.297819
6.375839
0.397719
8.014844
0.280086
20.009604
false
false
2024-07-20
2024-07-23
0
Corianas/llama-3-reactor
CortexLM_btlm-7b-base-v0.2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CortexLM/btlm-7b-base-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CortexLM/btlm-7b-base-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CortexLM__btlm-7b-base-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CortexLM/btlm-7b-base-v0.2
eda8b4298365a26c8981316e09427c237b11217f
8.920255
mit
1
6.885
true
false
false
false
1.422717
0.148329
14.832866
0.400641
16.193277
0.015106
1.510574
0.253356
0.447427
0.384604
5.542188
0.234957
14.995198
false
false
2024-06-13
2024-06-26
0
CortexLM/btlm-7b-base-v0.2
Cran-May_SCE-2-24B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Cran-May/SCE-2-24B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Cran-May/SCE-2-24B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Cran-May__SCE-2-24B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Cran-May/SCE-2-24B
6a477b347fa6c0ce76bcaf353ddc282dd1cc75c3
31.95154
0
23.572
false
false
false
true
2.704235
0.586592
58.659246
0.626469
46.325746
0.189577
18.957704
0.337248
11.63311
0.452813
16.001562
0.461187
40.131871
false
false
2025-02-03
2025-02-04
1
Cran-May/SCE-2-24B (Merge)
Cran-May_SCE-3-24B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Cran-May/SCE-3-24B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Cran-May/SCE-3-24B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Cran-May__SCE-3-24B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Cran-May/SCE-3-24B
bf2b658dd404c423228e7001498bd69c2d147da2
30.62043
0
23.572
false
false
false
true
2.36317
0.546525
54.652544
0.597283
42.278565
0.188066
18.806647
0.346477
12.863535
0.443479
14.601562
0.464678
40.519725
false
false
2025-02-03
2025-02-04
1
Cran-May/SCE-3-24B (Merge)
Cran-May_T.E-8.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Cran-May/T.E-8.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Cran-May/T.E-8.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Cran-May__T.E-8.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Cran-May/T.E-8.1
5f84709710dcce7cc05fa12473e8bb207fe25849
35.699515
cc-by-nc-sa-4.0
3
7.616
true
false
false
true
2.181266
0.707692
70.769226
0.558175
37.024377
0.445619
44.561934
0.312919
8.389262
0.450521
15.315104
0.443235
38.13719
false
false
2024-09-27
2024-09-29
1
Cran-May/T.E-8.1 (Merge)
CultriX_Qwen2.5-14B-Broca_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Broca" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Broca</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Broca-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Broca
51204ee25a629abfd6d5e77a850b5e7a36c78462
37.924501
1
14.766
false
false
false
false
4.154003
0.560414
56.041415
0.652715
50.034412
0.358006
35.800604
0.386745
18.232662
0.476656
18.948698
0.536403
48.489214
false
false
2024-12-23
2024-12-23
1
CultriX/Qwen2.5-14B-Broca (Merge)
CultriX_Qwen2.5-14B-BrocaV9_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-BrocaV9" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-BrocaV9</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-BrocaV9-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-BrocaV9
883dafbff4edb8c83ef58a33413d4e09e922a53d
39.258747
2
14.766
false
false
false
false
3.548006
0.676293
67.629335
0.639138
48.053225
0.38142
38.141994
0.364094
15.212528
0.469031
18.395573
0.533078
48.119829
false
false
2025-01-02
2025-01-10
1
CultriX/Qwen2.5-14B-BrocaV9 (Merge)
CultriX_Qwen2.5-14B-Brocav3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Brocav3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Brocav3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Brocav3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Brocav3
6f3fe686a79dcbcd5835ca100e194c49f493167b
39.846832
2
14.766
false
false
false
false
3.633478
0.695178
69.517768
0.645235
49.049112
0.387462
38.746224
0.35906
14.541387
0.475635
19.254427
0.531749
47.972074
false
false
2024-12-23
2024-12-23
1
CultriX/Qwen2.5-14B-Brocav3 (Merge)
CultriX_Qwen2.5-14B-Brocav6_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Brocav6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Brocav6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Brocav6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Brocav6
bd981505b6950df69216b260c3c0d86124fded7b
39.84073
2
14.766
false
false
false
false
3.582802
0.699524
69.952393
0.638884
47.819225
0.387462
38.746224
0.36745
15.659955
0.474208
18.876042
0.531915
47.990544
false
false
2024-12-23
2024-12-23
1
CultriX/Qwen2.5-14B-Brocav6 (Merge)
CultriX_Qwen2.5-14B-Brocav7_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Brocav7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Brocav7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Brocav7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Brocav7
06acee7f6e9796081ced6201001784907c77f96f
39.61738
1
14.766
false
false
false
false
3.402699
0.672372
67.237153
0.644403
48.905361
0.384441
38.444109
0.36745
15.659955
0.479604
20.150521
0.525765
47.307181
false
false
2024-12-23
2024-12-23
1
CultriX/Qwen2.5-14B-Brocav7 (Merge)
CultriX_Qwen2.5-14B-Emerged_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Emerged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Emerged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Emerged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Emerged
8bf0e31b23ee22858bbde2cee44dde88963f5084
37.952143
1
14.766
false
false
false
false
3.61472
0.700024
70.002371
0.626003
45.932419
0.324773
32.477341
0.357383
14.317673
0.469094
18.470052
0.518617
46.513002
false
false
2024-12-19
2024-12-19
1
CultriX/Qwen2.5-14B-Emerged (Merge)
CultriX_Qwen2.5-14B-Emergedv3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Emergedv3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Emergedv3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Emergedv3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Emergedv3
f4df1b9c2bf37bbfd6b2e8f2ff244c6029a5d546
38.656292
1
14.766
false
false
false
false
3.837857
0.638849
63.884936
0.619073
44.731608
0.435801
43.58006
0.360738
14.765101
0.472813
18.601563
0.51737
46.374483
false
false
2024-12-21
2024-12-21
1
CultriX/Qwen2.5-14B-Emergedv3 (Merge)
CultriX_Qwen2.5-14B-FinalMerge_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-FinalMerge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-FinalMerge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-FinalMerge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-FinalMerge
8fd624d0d8989a312d344772814da3575423897a
32.23627
1
14.766
false
false
false
false
3.887883
0.489098
48.909782
0.571495
38.162479
0.38142
38.141994
0.354866
13.982103
0.437906
14.504948
0.457447
39.716312
false
false
2024-12-22
2024-12-23
1
CultriX/Qwen2.5-14B-FinalMerge (Merge)
CultriX_Qwen2.5-14B-Hyper_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Hyper" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Hyper</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Hyper-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Hyper
a6399c43f84736ed1b11d8cc7a25edf634781207
37.761935
1
14.766
false
false
false
false
7.678342
0.539132
53.913173
0.650745
49.759879
0.343656
34.365559
0.391779
18.903803
0.489833
21.029167
0.5374
48.60003
false
false
2025-01-19
2025-01-19
1
CultriX/Qwen2.5-14B-Hyper (Merge)
CultriX_Qwen2.5-14B-Hyperionv3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Hyperionv3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Hyperionv3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Hyperionv3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Hyperionv3
bc36be5b5ca3053ae96d85e962249efd0b283c82
39.762121
4
14.766
false
false
false
false
3.965711
0.683637
68.363719
0.652217
49.950055
0.370091
37.009063
0.370805
16.107383
0.472969
18.921094
0.533993
48.22141
false
false
2025-01-10
2025-01-19
1
CultriX/Qwen2.5-14B-Hyperionv3 (Merge)
CultriX_Qwen2.5-14B-Hyperionv4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Hyperionv4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Hyperionv4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Hyperionv4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Hyperionv4
60cc366b0648bcb40ed22ebc53d64cc5aca25550
37.670019
3
14.766
false
false
false
false
4.073614
0.54158
54.157968
0.647179
49.07652
0.347432
34.743202
0.397651
19.686801
0.483198
19.866406
0.536403
48.489214
false
false
2025-01-19
2025-01-19
1
CultriX/Qwen2.5-14B-Hyperionv4 (Merge)
CultriX_Qwen2.5-14B-Hyperionv5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Hyperionv5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Hyperionv5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Hyperionv5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Hyperionv5
e0f4941349664a75ddd03e4d2c190284c951e54b
39.72497
2
14.766
false
false
false
false
3.973468
0.672921
67.292118
0.644266
48.94828
0.382175
38.217523
0.371644
16.219239
0.479542
19.876042
0.53017
47.796616
false
false
2025-01-19
2025-01-19
1
CultriX/Qwen2.5-14B-Hyperionv5 (Merge)
CultriX_Qwen2.5-14B-MegaMerge-pt2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-MegaMerge-pt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-MegaMerge-pt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-MegaMerge-pt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-MegaMerge-pt2
20397f6cafc09c2cb74f105867cd99b3c68c71dc
38.79653
apache-2.0
2
14.766
true
false
false
false
4.500868
0.568308
56.830765
0.65777
50.907903
0.399547
39.954683
0.379195
17.225951
0.472875
18.742708
0.542055
49.117169
true
false
2024-10-24
2024-10-25
1
CultriX/Qwen2.5-14B-MegaMerge-pt2 (Merge)
CultriX_Qwen2.5-14B-MergeStock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-MergeStock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-MergeStock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-MergeStock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-MergeStock
fa00543296f2731793dfb0aac667571ccf1abb5b
38.744236
apache-2.0
2
14.766
true
false
false
false
6.645908
0.568533
56.85326
0.657934
51.009391
0.414653
41.465257
0.373322
16.442953
0.467635
17.854427
0.539561
48.84013
true
false
2024-10-23
2024-10-24
1
CultriX/Qwen2.5-14B-MergeStock (Merge)
CultriX_Qwen2.5-14B-Ultimav2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Ultimav2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Ultimav2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Ultimav2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Ultimav2
9c805171d56f5d8720c687084c1ffc26bdf0acba
38.8356
apache-2.0
4
14.766
true
false
false
false
5.907627
0.550023
55.002283
0.655503
50.441053
0.384441
38.444109
0.385067
18.008949
0.496563
22.036979
0.541722
49.08023
true
false
2025-02-04
2025-02-05
1
CultriX/Qwen2.5-14B-Ultimav2 (Merge)
CultriX_Qwen2.5-14B-Unity_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Unity" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Unity</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Unity-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Unity
1d15e7941e6ceff5d6e4f293378947bee721a24d
38.299229
3
14.766
false
false
false
false
3.827378
0.673895
67.389526
0.601996
42.258617
0.431269
43.126888
0.347315
12.975391
0.467948
18.760156
0.507563
45.284796
false
false
2024-12-21
2024-12-21
1
CultriX/Qwen2.5-14B-Unity (Merge)
CultriX_Qwen2.5-14B-Wernicke_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernicke" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernicke</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernicke-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Wernicke
622c0a58ecb0c0c679d7381a823d2ae5ac2b8ce1
37.943351
apache-2.0
6
14.77
true
false
false
false
4.444469
0.52347
52.346995
0.656836
50.642876
0.38142
38.141994
0.393456
19.127517
0.468906
18.246615
0.542387
49.154108
true
false
2024-10-21
2024-10-22
1
CultriX/Qwen2.5-14B-Wernicke (Merge)
CultriX_Qwen2.5-14B-Wernicke-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernicke-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernicke-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernicke-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Wernicke-SFT
3b68dfba2cf79e4a15e8f4271f7d4b62d2ab9f26
33.549512
apache-2.0
2
14.77
true
false
false
true
2.786025
0.493744
49.374438
0.646059
49.330572
0.359517
35.951662
0.354027
13.870246
0.39
7.55
0.506981
45.220154
true
false
2024-11-16
2024-11-17
1
CultriX/Qwen2.5-14B-Wernicke-SFT (Merge)
CultriX_Qwen2.5-14B-Wernicke-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernicke-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernicke-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernicke-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Wernicke-SLERP
180175561e8061be067fc349ad4491270f19976f
36.543652
0
14.491
false
false
false
true
4.311975
0.55889
55.889041
0.644093
49.372327
0.44864
44.864048
0.34396
12.527964
0.414031
11.120573
0.509392
45.487958
false
false
2024-10-25
0
Removed
CultriX_Qwen2.5-14B-Wernickev3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernickev3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernickev3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernickev3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-Wernickev3
bd141b0df78ad1f6e2938edf167c2305b395a2b2
38.381142
3
14.766
false
false
false
false
3.831269
0.70482
70.481988
0.618415
44.576275
0.35423
35.422961
0.362416
14.988814
0.471667
18.691667
0.515126
46.125148
false
false
2024-12-19
2024-12-19
1
CultriX/Qwen2.5-14B-Wernickev3 (Merge)
CultriX_Qwen2.5-14B-partialmergept1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-partialmergept1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-partialmergept1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-partialmergept1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwen2.5-14B-partialmergept1
02c6491a2affea23c1e5d89d324a90d24a0e5381
39.108717
0
14.766
false
false
false
false
4.018672
0.633729
63.372851
0.615118
44.594404
0.453927
45.392749
0.361577
14.876957
0.475698
19.66224
0.520778
46.753103
false
false
2025-01-02
2025-01-19
1
CultriX/Qwen2.5-14B-partialmergept1 (Merge)
CultriX_Qwenfinity-2.5-14B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwenfinity-2.5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwenfinity-2.5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwenfinity-2.5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwenfinity-2.5-14B
6acc1308274031b045f028b0a0290cdbe4243a04
32.322008
0
14.766
false
false
false
false
3.954133
0.481379
48.137941
0.565501
37.259942
0.410121
41.012085
0.348993
13.199105
0.450583
15.45625
0.449801
38.866726
false
false
2024-12-21
2024-12-23
1
CultriX/Qwenfinity-2.5-14B (Merge)
CultriX_Qwestion-14B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/Qwestion-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwestion-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwestion-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/Qwestion-14B
e286bfafbc28e36859202c9f06ed8287a4f1d8b6
38.549226
apache-2.0
1
14.766
true
false
false
false
3.707642
0.63178
63.178034
0.64501
48.757034
0.372356
37.23565
0.368289
15.771812
0.463604
17.217188
0.542221
49.135638
true
false
2024-11-21
2024-11-23
1
CultriX/Qwestion-14B (Merge)
CultriX_SeQwence-14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14B
f4a147b717ba0e9392f96e343250b00239196a22
36.886273
apache-2.0
3
14.766
true
false
false
false
3.592765
0.53516
53.516004
0.650567
50.163578
0.353474
35.347432
0.360738
14.765101
0.466615
18.426823
0.541888
49.0987
false
false
2024-11-20
2024-11-20
0
CultriX/SeQwence-14B
CultriX_SeQwence-14B-EvolMerge_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14B-EvolMerge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14B-EvolMerge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14B-EvolMerge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14B-EvolMerge
a98c932f0d71d76883fe9aa9d708af0506b01343
38.018641
apache-2.0
2
14.766
true
false
false
false
3.901652
0.538158
53.815764
0.657218
50.780351
0.367069
36.706949
0.380872
17.449664
0.482083
20.260417
0.541888
49.0987
true
false
2024-11-27
2024-11-27
1
CultriX/SeQwence-14B-EvolMerge (Merge)
CultriX_SeQwence-14B-EvolMergev1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14B-EvolMergev1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14B-EvolMergev1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14B-EvolMergev1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14B-EvolMergev1
6cc7116cdea757635dba52bb82a306654d118e77
38.463462
2
14.766
false
false
false
false
3.915792
0.555468
55.546838
0.654555
50.302259
0.42145
42.145015
0.376678
16.89038
0.462271
17.083854
0.539312
48.812426
false
false
2024-11-25
2024-11-27
1
CultriX/SeQwence-14B-EvolMergev1 (Merge)
CultriX_SeQwence-14B-v5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14B-v5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14B-v5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14B-v5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14B-v5
9f43ad41542be56f6a18f31bfa60086318735ed5
37.608542
0
14.766
false
false
false
false
3.73032
0.591988
59.198815
0.651709
49.995731
0.330816
33.081571
0.369966
15.995526
0.471417
18.327083
0.541473
49.052527
false
false
2024-11-18
0
Removed
CultriX_SeQwence-14Bv1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14Bv1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14Bv1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14Bv1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14Bv1
542bfbd2e6fb25ecd11b84d956764eb23233a034
38.625628
apache-2.0
2
14.766
true
false
false
false
3.660382
0.6678
66.780033
0.634467
47.190898
0.361027
36.102719
0.361577
14.876957
0.470427
18.803385
0.531998
47.999778
true
false
2024-11-24
2024-11-27
1
CultriX/SeQwence-14Bv1 (Merge)
CultriX_SeQwence-14Bv2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14Bv2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14Bv2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14Bv2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CultriX/SeQwence-14Bv2
674c6d49b604fdf26e327e1e86c4fde0724b98e8
38.740075
0
14.766
false
false
false
false
3.949787
0.578599
57.859923
0.630451
46.529224
0.475831
47.583082
0.360738
14.765101
0.460104
17.546354
0.533411
48.156767
false
false
2024-11-27
2024-12-08
1
CultriX/SeQwence-14Bv2 (Merge)