| eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| mlabonne_BigQwen2.5-Echo-47B-Instruct_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [mlabonne/BigQwen2.5-Echo-47B-Instruct](https://huggingface.co/mlabonne/BigQwen2.5-Echo-47B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__BigQwen2.5-Echo-47B-Instruct-details) | mlabonne/BigQwen2.5-Echo-47B-Instruct | f95fcf22f8ab87c2dbb1893b87c8a132820acb5e | 37.031895 | apache-2.0 | 3 | 47.392 | true | false | false | true | 17.046154 | 0.735669 | 73.566914 | 0.612511 | 44.522244 | 0.438066 | 43.806647 | 0.314597 | 8.612975 | 0.412479 | 10.193229 | 0.473404 | 41.489362 | true | true | 2024-09-23 | 2024-09-24 | 1 | mlabonne/BigQwen2.5-Echo-47B-Instruct (Merge) |
| mlabonne_ChimeraLlama-3-8B-v2_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [mlabonne/ChimeraLlama-3-8B-v2](https://huggingface.co/mlabonne/ChimeraLlama-3-8B-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__ChimeraLlama-3-8B-v2-details) | mlabonne/ChimeraLlama-3-8B-v2 | d90a12b1574d7be084e53e0ad610282638ab29cf | 20.120505 | other | 14 | 8.03 | true | false | false | false | 1.67482 | 0.446883 | 44.688316 | 0.50456 | 28.478796 | 0.090634 | 9.063444 | 0.285235 | 4.697987 | 0.379083 | 5.252083 | 0.356882 | 28.542405 | true | true | 2024-04-22 | 2024-08-25 | 1 | mlabonne/ChimeraLlama-3-8B-v2 (Merge) |
| mlabonne_ChimeraLlama-3-8B-v3_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [mlabonne/ChimeraLlama-3-8B-v3](https://huggingface.co/mlabonne/ChimeraLlama-3-8B-v3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__ChimeraLlama-3-8B-v3-details) | mlabonne/ChimeraLlama-3-8B-v3 | c8c1787e1426e3979ae82134f4eb7fa332f58ae0 | 20.69713 | other | 15 | 8.03 | true | false | false | false | 1.647479 | 0.440788 | 44.078822 | 0.497819 | 27.646094 | 0.088369 | 8.836858 | 0.291946 | 5.592841 | 0.400354 | 8.377604 | 0.366855 | 29.650561 | true | true | 2024-05-01 | 2024-08-25 | 1 | mlabonne/ChimeraLlama-3-8B-v3 (Merge) |
| mlabonne_Daredevil-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [mlabonne/Daredevil-8B](https://huggingface.co/mlabonne/Daredevil-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__Daredevil-8B-details) | mlabonne/Daredevil-8B | 717953c83631cc9adf2dddccfff06739308f10f7 | 22.409646 | other | 37 | 8.03 | true | false | false | true | 3.023829 | 0.454777 | 45.477666 | 0.519441 | 31.626855 | 0.106495 | 10.649547 | 0.307886 | 7.718121 | 0.393875 | 7.534375 | 0.383062 | 31.451315 | true | true | 2024-05-25 | 2024-07-02 | 1 | mlabonne/Daredevil-8B (Merge) |
| mlabonne_Daredevil-8B-abliterated_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [mlabonne/Daredevil-8B-abliterated](https://huggingface.co/mlabonne/Daredevil-8B-abliterated) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__Daredevil-8B-abliterated-details) | mlabonne/Daredevil-8B-abliterated | 034c0ce8ceeba075d1dff2bac1b113a017c79390 | 19.687996 | other | 38 | 8.03 | true | false | false | true | 2.396724 | 0.442637 | 44.263665 | 0.425427 | 19.865777 | 0.094411 | 9.441088 | 0.290268 | 5.369128 | 0.407021 | 9.177604 | 0.370096 | 30.010712 | false | true | 2024-05-26 | 2024-07-02 | 0 | mlabonne/Daredevil-8B-abliterated |
| mlabonne_Hermes-3-Llama-3.1-70B-lorablated_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [mlabonne/Hermes-3-Llama-3.1-70B-lorablated](https://huggingface.co/mlabonne/Hermes-3-Llama-3.1-70B-lorablated) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__Hermes-3-Llama-3.1-70B-lorablated-details) | mlabonne/Hermes-3-Llama-3.1-70B-lorablated | 4303ff3b524418e9aa5e787d60595a44a6173b02 | 31.745855 | | 29 | 70.554 | true | false | false | false | 51.325371 | 0.342444 | 34.244361 | 0.669317 | 52.750073 | 0.22432 | 22.432024 | 0.365772 | 15.436242 | 0.502927 | 24.732552 | 0.467919 | 40.879876 | true | true | 2024-08-16 | 2024-11-27 | 1 | mlabonne/Hermes-3-Llama-3.1-70B-lorablated (Merge) |
| mlabonne_Meta-Llama-3.1-8B-Instruct-abliterated_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated](https://huggingface.co/mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__Meta-Llama-3.1-8B-Instruct-abliterated-details) | mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated | aef878bdf42c119d007322967006fcdef5ae6ee1 | 23.202552 | llama3.1 | 153 | 8.03 | true | false | false | true | 3.273969 | 0.732946 | 73.294636 | 0.487406 | 27.129165 | 0.068731 | 6.873112 | 0.256711 | 0.894855 | 0.364885 | 3.210677 | 0.350316 | 27.812869 | false | true | 2024-07-24 | 2024-10-13 | 2 | meta-llama/Meta-Llama-3.1-8B |
| mlabonne_NeuralBeagle14-7B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [mlabonne/NeuralBeagle14-7B](https://huggingface.co/mlabonne/NeuralBeagle14-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__NeuralBeagle14-7B-details) | mlabonne/NeuralBeagle14-7B | 1567ad618a0998139654cb355738bb9bc018ca64 | 18.910076 | cc-by-nc-4.0 | 158 | 7.242 | true | false | false | true | 1.343414 | 0.493519 | 49.351942 | 0.462787 | 23.959695 | 0.052115 | 5.21148 | 0.281879 | 4.250559 | 0.431948 | 12.89349 | 0.26014 | 17.793292 | true | true | 2024-01-15 | 2024-06-27 | 2 | mlabonne/Beagle14-7B (Merge) |
| mlabonne_NeuralDaredevil-8B-abliterated_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [mlabonne/NeuralDaredevil-8B-abliterated](https://huggingface.co/mlabonne/NeuralDaredevil-8B-abliterated) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__NeuralDaredevil-8B-abliterated-details) | mlabonne/NeuralDaredevil-8B-abliterated | 2f4a5e8a8522f19dff345c7189b7891468763061 | 27.186741 | llama3 | 196 | 8.03 | true | false | false | true | 3.438846 | 0.756077 | 75.607721 | 0.511057 | 30.307986 | 0.090634 | 9.063444 | 0.306208 | 7.494407 | 0.401938 | 9.075521 | 0.384142 | 31.571365 | false | true | 2024-05-27 | 2024-07-25 | 0 | mlabonne/NeuralDaredevil-8B-abliterated |
| mlabonne_NeuralDaredevil-8B-abliterated_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [mlabonne/NeuralDaredevil-8B-abliterated](https://huggingface.co/mlabonne/NeuralDaredevil-8B-abliterated) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__NeuralDaredevil-8B-abliterated-details) | mlabonne/NeuralDaredevil-8B-abliterated | 89b01e3292e031ed85ad21545849182f5627021e | 21.499914 | llama3 | 196 | 8.03 | true | false | false | false | 0.985007 | 0.416233 | 41.623337 | 0.512396 | 29.763198 | 0.085347 | 8.534743 | 0.302852 | 7.04698 | 0.414958 | 10.903125 | 0.380153 | 31.128103 | false | true | 2024-05-27 | 2024-06-27 | 0 | mlabonne/NeuralDaredevil-8B-abliterated |
| mlabonne_OrpoLlama-3-8B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [mlabonne/OrpoLlama-3-8B](https://huggingface.co/mlabonne/OrpoLlama-3-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__OrpoLlama-3-8B-details) | mlabonne/OrpoLlama-3-8B | 7f200e4c84ad0daa3ff6bc414012d8d0bacbf90e | 15.157037 | other | 53 | 8.03 | true | false | false | true | 1.7806 | 0.365275 | 36.527525 | 0.442408 | 21.954108 | 0.055891 | 5.589124 | 0.279362 | 3.914989 | 0.357938 | 4.008854 | 0.270529 | 18.947621 | false | true | 2024-04-18 | 2024-06-12 | 1 | meta-llama/Meta-Llama-3-8B |
| mlabonne_phixtral-2x2_8_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | PhiForCausalLM | [mlabonne/phixtral-2x2_8](https://huggingface.co/mlabonne/phixtral-2x2_8) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__phixtral-2x2_8-details) | mlabonne/phixtral-2x2_8 | 7744a977d83f132ae5808d8c3b70157031f7de44 | 15.553114 | mit | 148 | 4.458 | true | true | false | true | 1.921902 | 0.343118 | 34.311848 | 0.488859 | 28.502645 | 0.035498 | 3.549849 | 0.265101 | 2.013423 | 0.364354 | 7.710938 | 0.25507 | 17.229979 | false | true | 2024-01-07 | 2024-06-12 | 0 | mlabonne/phixtral-2x2_8 |
| mlx-community_Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1-float32_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | [mlx-community/Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1-float32](https://huggingface.co/mlx-community/Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1-float32) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mlx-community__Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1-float32-details) | mlx-community/Josiefied-Qwen2.5-0.5B-Instruct-abliterated-v1-float32 | 78ba059029cfbdc819ee80f1e91827b9d3ba1620 | 9.708088 | apache-2.0 | 1 | 0.494 | true | false | false | true | 1.000347 | 0.336898 | 33.689832 | 0.32921 | 7.221169 | 0.084592 | 8.459215 | 0.25755 | 1.006711 | 0.324917 | 0.78125 | 0.163813 | 7.090352 | false | false | 2024-11-17 | 2025-01-07 | 3 | Qwen/Qwen2.5-0.5B |
| mlx-community_Mistral-Small-24B-Instruct-2501-bf16_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [mlx-community/Mistral-Small-24B-Instruct-2501-bf16](https://huggingface.co/mlx-community/Mistral-Small-24B-Instruct-2501-bf16) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mlx-community__Mistral-Small-24B-Instruct-2501-bf16-details) | mlx-community/Mistral-Small-24B-Instruct-2501-bf16 | 92ae924591721abf40ae8dbebb7f37f10a518448 | 38.669424 | apache-2.0 | 6 | 23.572 | true | false | false | false | 2.578045 | 0.628283 | 62.828296 | 0.671327 | 52.392869 | 0.322508 | 32.250755 | 0.395134 | 19.35123 | 0.461833 | 16.3625 | 0.539478 | 48.830895 | false | false | 2025-01-30 | 2025-02-06 | 2 | mistralai/Mistral-Small-24B-Instruct-2501 (Merge) |
| mmnga_Llama-3-70B-japanese-suzume-vector-v0.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [mmnga/Llama-3-70B-japanese-suzume-vector-v0.1](https://huggingface.co/mmnga/Llama-3-70B-japanese-suzume-vector-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mmnga__Llama-3-70B-japanese-suzume-vector-v0.1-details) | mmnga/Llama-3-70B-japanese-suzume-vector-v0.1 | 16f98b2d45684af2c4a9ff5da75b00ef13cca808 | 30.380044 | llama3 | 4 | 70.554 | true | false | false | true | 32.19425 | 0.464893 | 46.489315 | 0.654176 | 50.022661 | 0.232628 | 23.26284 | 0.286074 | 4.809843 | 0.414063 | 10.757813 | 0.52244 | 46.937796 | false | false | 2024-04-28 | 2024-09-19 | 0 | mmnga/Llama-3-70B-japanese-suzume-vector-v0.1 |
| mobiuslabsgmbh_DeepSeek-R1-ReDistill-Llama3-8B-v1.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [mobiuslabsgmbh/DeepSeek-R1-ReDistill-Llama3-8B-v1.1](https://huggingface.co/mobiuslabsgmbh/DeepSeek-R1-ReDistill-Llama3-8B-v1.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mobiuslabsgmbh__DeepSeek-R1-ReDistill-Llama3-8B-v1.1-details) | mobiuslabsgmbh/DeepSeek-R1-ReDistill-Llama3-8B-v1.1 | 16f6691b234d868a71b2addfc237a6c5088ecb48 | 15.806892 | mit | 10 | 8.03 | true | false | false | true | 0.732228 | 0.370396 | 37.03961 | 0.347303 | 7.891833 | 0.32855 | 32.854985 | 0.270973 | 2.796421 | 0.339552 | 0.94401 | 0.21983 | 13.314495 | false | false | 2025-01-29 | 2025-02-16 | 1 | mobiuslabsgmbh/DeepSeek-R1-ReDistill-Llama3-8B-v1.1 (Merge) |
| mobiuslabsgmbh_DeepSeek-R1-ReDistill-Qwen-7B-v1.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [mobiuslabsgmbh/DeepSeek-R1-ReDistill-Qwen-7B-v1.1](https://huggingface.co/mobiuslabsgmbh/DeepSeek-R1-ReDistill-Qwen-7B-v1.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mobiuslabsgmbh__DeepSeek-R1-ReDistill-Qwen-7B-v1.1-details) | mobiuslabsgmbh/DeepSeek-R1-ReDistill-Qwen-7B-v1.1 | 40f505b1ec4f6008fd9e6867bbe0d338addcafbd | 17.737905 | mit | 15 | 7.616 | true | false | false | true | 0.675267 | 0.347315 | 34.731512 | 0.369838 | 11.5654 | 0.349698 | 34.969789 | 0.265101 | 2.013423 | 0.400885 | 8.410677 | 0.23263 | 14.736628 | false | false | 2025-01-27 | 2025-02-16 | 1 | mobiuslabsgmbh/DeepSeek-R1-ReDistill-Qwen-7B-v1.1 (Merge) |
| moeru-ai_L3.1-Moe-2x8B-v0.2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | [moeru-ai/L3.1-Moe-2x8B-v0.2](https://huggingface.co/moeru-ai/L3.1-Moe-2x8B-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/moeru-ai__L3.1-Moe-2x8B-v0.2-details) | moeru-ai/L3.1-Moe-2x8B-v0.2 | 1a0b4d4d1e839e332c67c9c16a2fc1f7ccc7f81e | 28.878094 | llama3.1 | 6 | 13.668 | true | true | false | true | 3.853136 | 0.734795 | 73.479479 | 0.525569 | 32.945891 | 0.16994 | 16.993958 | 0.300336 | 6.711409 | 0.419854 | 11.381771 | 0.385805 | 31.756058 | true | false | 2024-10-25 | 2024-10-25 | 1 | moeru-ai/L3.1-Moe-2x8B-v0.2 (Merge) |
| moeru-ai_L3.1-Moe-4x8B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | [moeru-ai/L3.1-Moe-4x8B-v0.1](https://huggingface.co/moeru-ai/L3.1-Moe-4x8B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/moeru-ai__L3.1-Moe-4x8B-v0.1-details) | moeru-ai/L3.1-Moe-4x8B-v0.1 | f8d477fad4c02c099c80ef38865c01e2c882e996 | 19.441557 | llama3.1 | 3 | 24.942 | true | true | false | true | 8.704718 | 0.433219 | 43.321941 | 0.493928 | 27.856765 | 0.129909 | 12.990937 | 0.259228 | 1.230425 | 0.360917 | 3.98125 | 0.345412 | 27.268026 | true | false | 2024-10-23 | 2024-10-23 | 1 | moeru-ai/L3.1-Moe-4x8B-v0.1 (Merge) |
| moeru-ai_L3.1-Moe-4x8B-v0.2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | [moeru-ai/L3.1-Moe-4x8B-v0.2](https://huggingface.co/moeru-ai/L3.1-Moe-4x8B-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/moeru-ai__L3.1-Moe-4x8B-v0.2-details) | moeru-ai/L3.1-Moe-4x8B-v0.2 | fab49d865eb51f00e955c5624712184c39d207c9 | 18.310513 | llama3.1 | 2 | 24.942 | true | true | false | true | 6.732654 | 0.540655 | 54.065546 | 0.446626 | 21.337007 | 0.103474 | 10.347432 | 0.266779 | 2.237136 | 0.323396 | 2.291146 | 0.276263 | 19.584811 | true | false | 2024-10-30 | 2024-10-30 | 1 | moeru-ai/L3.1-Moe-4x8B-v0.2 (Merge) |
| monsterapi_Llama-3_1-8B-Instruct-orca-ORPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Adapter | ? | [monsterapi/Llama-3_1-8B-Instruct-orca-ORPO](https://huggingface.co/monsterapi/Llama-3_1-8B-Instruct-orca-ORPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/monsterapi__Llama-3_1-8B-Instruct-orca-ORPO-details) | monsterapi/Llama-3_1-8B-Instruct-orca-ORPO | 5206a32e0bd3067aef1ce90f5528ade7d866253f | 4.832138 | apache-2.0 | 2 | 16.061 | true | false | false | true | 3.065361 | 0.227289 | 22.728915 | 0.286536 | 1.340469 | 0 | 0 | 0.249161 | 0 | 0.344479 | 3.059896 | 0.116772 | 1.863549 | false | false | 2024-08-01 | 2024-08-30 | 2 | meta-llama/Meta-Llama-3.1-8B |
| monsterapi_gemma-2-2b-LoRA-MonsterInstruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | [monsterapi/gemma-2-2b-LoRA-MonsterInstruct](https://huggingface.co/monsterapi/gemma-2-2b-LoRA-MonsterInstruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/monsterapi__gemma-2-2b-LoRA-MonsterInstruct-details) | monsterapi/gemma-2-2b-LoRA-MonsterInstruct | 6422e27e96e15cf93b966c973aacc15f8a27a458 | 12.519873 | gemma | 0 | 2.614 | true | false | false | true | 2.677375 | 0.390255 | 39.025452 | 0.364969 | 11.965057 | 0.050604 | 5.060423 | 0.270134 | 2.684564 | 0.364385 | 5.414844 | 0.19872 | 10.968898 | false | false | 2024-08-03 | 2024-08-05 | 0 | monsterapi/gemma-2-2b-LoRA-MonsterInstruct |
| mosaicml_mpt-7b_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | MPTForCausalLM | [mosaicml/mpt-7b](https://huggingface.co/mosaicml/mpt-7b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mosaicml__mpt-7b-details) | mosaicml/mpt-7b | 039e37745f00858f0e01e988383a8c4393b1a4f5 | 6.032029 | apache-2.0 | 1,167 | 7 | true | false | false | false | 1.287007 | 0.215199 | 21.519901 | 0.329974 | 6.550601 | 0.015861 | 1.586103 | 0.260067 | 1.342282 | 0.36724 | 2.904948 | 0.120595 | 2.288342 | false | true | 2023-05-05 | 2024-06-08 | 0 | mosaicml/mpt-7b |
| mosama_Qwen2.5-1.5B-Instruct-CoT-Reflection_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [mosama/Qwen2.5-1.5B-Instruct-CoT-Reflection](https://huggingface.co/mosama/Qwen2.5-1.5B-Instruct-CoT-Reflection) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mosama__Qwen2.5-1.5B-Instruct-CoT-Reflection-details) | mosama/Qwen2.5-1.5B-Instruct-CoT-Reflection | 0dd9c511521b05a2eb70d1dfb102c1766be3ae26 | 11.862792 | apache-2.0 | 1 | 1.544 | true | false | false | true | 1.180676 | 0.287039 | 28.70395 | 0.410937 | 17.973738 | 0.02719 | 2.719033 | 0.261745 | 1.565996 | 0.321198 | 1.866667 | 0.265126 | 18.34737 | false | false | 2024-12-22 | 2024-12-22 | 1 | mosama/Qwen2.5-1.5B-Instruct-CoT-Reflection (Merge) |
| mrdayl_OpenCogito_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [mrdayl/OpenCogito](https://huggingface.co/mrdayl/OpenCogito) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mrdayl__OpenCogito-details) | mrdayl/OpenCogito | f2a54023d176e00311001a73e609fb10ef7416fc | 22.158547 | | 0 | 3.086 | false | false | false | false | 6.847273 | 0.393377 | 39.337735 | 0.47197 | 26.33272 | 0.218278 | 21.827795 | 0.300336 | 6.711409 | 0.42401 | 11.501302 | 0.345163 | 27.240322 | false | false | 2025-03-07 | | 0 | Removed |
| mrdayl_OpenCognito_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [mrdayl/OpenCognito](https://huggingface.co/mrdayl/OpenCognito) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mrdayl__OpenCognito-details) | mrdayl/OpenCognito | af631967849155d520801331fb1aca9ac5c6055e | 22.285664 | apache-2.0 | 0 | 3.086 | true | false | false | false | 0.773126 | 0.406217 | 40.621662 | 0.470561 | 25.985836 | 0.21148 | 21.148036 | 0.297819 | 6.375839 | 0.429344 | 12.434635 | 0.344332 | 27.147976 | false | false | 2025-03-07 | 2025-03-07 | 1 | Removed |
| mrdayl_OpenCognito-r1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [mrdayl/OpenCognito-r1](https://huggingface.co/mrdayl/OpenCognito-r1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mrdayl__OpenCognito-r1-details) | mrdayl/OpenCognito-r1 | 1dd62e221fee56966697cc391eeba52beea726f4 | 22.152615 | apache-2.0 | 0 | 3.086 | true | false | false | false | 3.837372 | 0.424127 | 42.412687 | 0.467335 | 25.595544 | 0.190332 | 19.033233 | 0.299497 | 6.599553 | 0.424073 | 11.775781 | 0.34749 | 27.498892 | false | false | 2025-03-08 | 2025-03-11 | 3 | Qwen/Qwen2.5-3B |
| mrdayl_OpenCognito-r2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [mrdayl/OpenCognito-r2](https://huggingface.co/mrdayl/OpenCognito-r2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mrdayl__OpenCognito-r2-details) | mrdayl/OpenCognito-r2 | 0d3e3cd82084e6b1ff1b68a4b9732c1fb2c2efd3 | 21.96747 | apache-2.0 | 0 | 3.086 | true | false | false | false | 1.57563 | 0.395875 | 39.587517 | 0.468828 | 25.775898 | 0.202417 | 20.241692 | 0.306208 | 7.494407 | 0.420167 | 11.354167 | 0.34616 | 27.351138 | false | false | 2025-03-11 | 2025-03-13 | 4 | Qwen/Qwen2.5-3B |
| mrdayl_OpenThink_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [mrdayl/OpenThink](https://huggingface.co/mrdayl/OpenThink) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mrdayl__OpenThink-details) | mrdayl/OpenThink | d23dff000bc8faba0b83a5bec444ff9afc6a000a | 12.397923 | apache-2.0 | 0 | 1.777 | true | false | false | false | 1.767873 | 0.205407 | 20.540721 | 0.345979 | 9.176575 | 0.28852 | 28.851964 | 0.282718 | 4.362416 | 0.328885 | 2.010677 | 0.185007 | 9.445183 | false | false | 2025-02-22 | 2025-02-27 | 1 | mrdayl/OpenThink (Merge) |
| mrm8488_phi-4-14B-grpo-gsm8k-3e_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [mrm8488/phi-4-14B-grpo-gsm8k-3e](https://huggingface.co/mrm8488/phi-4-14B-grpo-gsm8k-3e) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mrm8488__phi-4-14B-grpo-gsm8k-3e-details) | mrm8488/phi-4-14B-grpo-gsm8k-3e | d8874d5f1dc81b1c251ebb9ccd492d95c25a86b5 | 39.20729 | apache-2.0 | 0 | 14.66 | true | false | false | true | 0.951835 | 0.688533 | 68.853309 | 0.680542 | 54.020966 | 0.452417 | 45.241692 | 0.33557 | 11.409396 | 0.399396 | 8.291146 | 0.526845 | 47.427231 | false | false | 2025-02-11 | 2025-02-13 | 2 | microsoft/phi-4 |
| mrm8488_phi-4-14B-grpo-limo_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [mrm8488/phi-4-14B-grpo-limo](https://huggingface.co/mrm8488/phi-4-14B-grpo-limo) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mrm8488__phi-4-14B-grpo-limo-details) | mrm8488/phi-4-14B-grpo-limo | 68d12df3c3240529fc9f6ce5a226e7c0d2d3d245 | 39.064088 | apache-2.0 | 0 | 14.66 | true | false | false | true | 0.948357 | 0.681239 | 68.123911 | 0.678485 | 53.675902 | 0.456949 | 45.694864 | 0.336409 | 11.521253 | 0.398063 | 8.024479 | 0.526097 | 47.344119 | false | false | 2025-02-12 | 2025-02-13 | 2 | microsoft/phi-4 |
| mukaj_Llama-3.1-Hawkish-8B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [mukaj/Llama-3.1-Hawkish-8B](https://huggingface.co/mukaj/Llama-3.1-Hawkish-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/mukaj__Llama-3.1-Hawkish-8B-details) | mukaj/Llama-3.1-Hawkish-8B | bd4968f565d94e3595e41a260f5550888df3fc85 | 26.581501 | other | 42 | 8.03 | true | false | false | true | 1.364197 | 0.672047 | 67.204684 | 0.488382 | 28.135841 | 0.243202 | 24.320242 | 0.290268 | 5.369128 | 0.396729 | 8.557812 | 0.333112 | 25.9013 | false | false | 2024-10-26 | 2024-12-18 | 0 | mukaj/Llama-3.1-Hawkish-8B |
| natong19_Mistral-Nemo-Instruct-2407-abliterated_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [natong19/Mistral-Nemo-Instruct-2407-abliterated](https://huggingface.co/natong19/Mistral-Nemo-Instruct-2407-abliterated) [📑](https://huggingface.co/datasets/open-llm-leaderboard/natong19__Mistral-Nemo-Instruct-2407-abliterated-details) | natong19/Mistral-Nemo-Instruct-2407-abliterated | 9c7087f62e6ab10ec4aeeb268e25cb3d4000696b | 25.017625 | apache-2.0 | 15 | 12.248 | true | false | false | true | 2.475714 | 0.639224 | 63.922393 | 0.504845 | 29.915044 | 0.132175 | 13.217523 | 0.286913 | 4.9217 | 0.403333 | 10.15 | 0.351812 | 27.979093 | false | false | 2024-08-15 | 2024-09-21 | 0 | natong19/Mistral-Nemo-Instruct-2407-abliterated |
| natong19_Qwen2-7B-Instruct-abliterated_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [natong19/Qwen2-7B-Instruct-abliterated](https://huggingface.co/natong19/Qwen2-7B-Instruct-abliterated) [📑](https://huggingface.co/datasets/open-llm-leaderboard/natong19__Qwen2-7B-Instruct-abliterated-details) | natong19/Qwen2-7B-Instruct-abliterated | 127962453ae87879719a82a97384ac1859787a25 | 28.515342 | apache-2.0 | 5 | 7.616 | true | false | false | true | 2.151671 | 0.583695 | 58.36946 | 0.555304 | 37.746834 | 0.276435 | 27.643505 | 0.301174 | 6.823266 | 0.403427 | 8.928385 | 0.384225 | 31.5806 | false | false | 2024-06-14 | 2024-07-29 | 0 | natong19/Qwen2-7B-Instruct-abliterated |
| nazimali_Mistral-Nemo-Kurdish_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | MistralForCausalLM | [nazimali/Mistral-Nemo-Kurdish](https://huggingface.co/nazimali/Mistral-Nemo-Kurdish) [📑](https://huggingface.co/datasets/open-llm-leaderboard/nazimali__Mistral-Nemo-Kurdish-details) | nazimali/Mistral-Nemo-Kurdish | 1eb544577a2874d8df0b77ca83ff1c88dd20f481 | 19.482238 | apache-2.0 | 3 | 12.248 | true | false | false | false | 3.699454 | 0.340121 | 34.012088 |
| 0.513332
| 29.855897
| 0.095921
| 9.592145
| 0.301174
| 6.823266
| 0.411573
| 11.779948
| 0.323471
| 24.830083
| false
| false
|
2024-10-09
|
2024-10-14
| 1
|
nazimali/Mistral-Nemo-Kurdish (Merge)
|
nazimali_Mistral-Nemo-Kurdish-Instruct_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nazimali/Mistral-Nemo-Kurdish-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nazimali/Mistral-Nemo-Kurdish-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nazimali__Mistral-Nemo-Kurdish-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nazimali/Mistral-Nemo-Kurdish-Instruct
|
512140572f11203441e60ca26b5ede2b9979cb1d
| 18.555958
|
apache-2.0
| 2
| 12.248
| true
| false
| false
| true
| 1.702117
| 0.496392
| 49.63918
| 0.469942
| 25.561423
| 0.004532
| 0.453172
| 0.282718
| 4.362416
| 0.397875
| 8.401042
| 0.306267
| 22.918514
| false
| false
|
2024-10-09
|
2024-10-14
| 1
|
nazimali/Mistral-Nemo-Kurdish-Instruct (Merge)
|
nazimali_Mistral-Nemo-Kurdish-Instruct_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nazimali/Mistral-Nemo-Kurdish-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nazimali/Mistral-Nemo-Kurdish-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nazimali__Mistral-Nemo-Kurdish-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nazimali/Mistral-Nemo-Kurdish-Instruct
|
512140572f11203441e60ca26b5ede2b9979cb1d
| 19.948622
|
apache-2.0
| 2
| 12.248
| true
| false
| false
| true
| 3.503441
| 0.486
| 48.600048
| 0.472144
| 26.021741
| 0.084592
| 8.459215
| 0.284396
| 4.58613
| 0.400573
| 8.838281
| 0.308677
| 23.186318
| false
| false
|
2024-10-09
|
2024-10-14
| 1
|
nazimali/Mistral-Nemo-Kurdish-Instruct (Merge)
|
nbeerbower_BigKartoffel-mistral-nemo-20B_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/BigKartoffel-mistral-nemo-20B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/BigKartoffel-mistral-nemo-20B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__BigKartoffel-mistral-nemo-20B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/BigKartoffel-mistral-nemo-20B
|
a552090b42c2cb6ed573fc12cf9571eb0faa8174
| 23.763763
|
apache-2.0
| 3
| 20.427
| true
| false
| false
| true
| 2.457534
| 0.585718
| 58.571812
| 0.551483
| 35.798644
| 0.026435
| 2.643505
| 0.286913
| 4.9217
| 0.428042
| 12.538542
| 0.352975
| 28.108378
| true
| false
|
2025-03-04
|
2025-03-05
| 1
|
nbeerbower/BigKartoffel-mistral-nemo-20B (Merge)
|
nbeerbower_DoppelKartoffel-Mistral-Nemo-23B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/DoppelKartoffel-Mistral-Nemo-23B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/DoppelKartoffel-Mistral-Nemo-23B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__DoppelKartoffel-Mistral-Nemo-23B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/DoppelKartoffel-Mistral-Nemo-23B
|
c1b2106456bea0b5172d34c8c4b0818bb79a0429
| 19.833865
|
apache-2.0
| 0
| 23.153
| true
| false
| false
| true
| 3.060304
| 0.519148
| 51.914808
| 0.521793
| 31.920697
| 0.030967
| 3.096677
| 0.275168
| 3.355705
| 0.37949
| 5.602865
| 0.308012
| 23.112441
| false
| false
|
2025-02-15
|
2025-02-15
| 1
|
nbeerbower/DoppelKartoffel-Mistral-Nemo-23B (Merge)
|
nbeerbower_DoublePotato-Mistral-Nemo-13B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/DoublePotato-Mistral-Nemo-13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/DoublePotato-Mistral-Nemo-13B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__DoublePotato-Mistral-Nemo-13B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/DoublePotato-Mistral-Nemo-13B
|
7c00ffb0a327260101eb2957a8f5af63443870bf
| 26.796264
| 1
| 13.338
| false
| false
| false
| true
| 1.061673
| 0.679616
| 67.961564
| 0.543792
| 35.211853
| 0.04003
| 4.003021
| 0.301174
| 6.823266
| 0.459979
| 17.930729
| 0.359624
| 28.847148
| false
| false
|
2025-02-13
|
2025-02-21
| 1
|
nbeerbower/DoublePotato-Mistral-Nemo-13B (Merge)
|
|
nbeerbower_Dumpling-Qwen2.5-1.5B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Dumpling-Qwen2.5-1.5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Dumpling-Qwen2.5-1.5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Dumpling-Qwen2.5-1.5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Dumpling-Qwen2.5-1.5B
|
ce189fd8d1d8daec8ccffdd293bfbda81c34a524
| 15.625094
|
apache-2.0
| 1
| 1.544
| true
| false
| false
| true
| 1.205201
| 0.369896
| 36.989632
| 0.415974
| 18.178358
| 0.117069
| 11.706949
| 0.268456
| 2.46085
| 0.37276
| 4.728385
| 0.277178
| 19.686392
| false
| false
|
2025-01-29
|
2025-01-29
| 1
|
nbeerbower/Dumpling-Qwen2.5-1.5B (Merge)
|
nbeerbower_Dumpling-Qwen2.5-14B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Dumpling-Qwen2.5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Dumpling-Qwen2.5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Dumpling-Qwen2.5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Dumpling-Qwen2.5-14B
|
4ce3f034a983caaf7a0aa49fd3aab2f531e9f2eb
| 34.798721
|
apache-2.0
| 3
| 14.77
| true
| false
| false
| true
| 1.380583
| 0.606401
| 60.640102
| 0.645064
| 49.666836
| 0.309668
| 30.966767
| 0.301174
| 6.823266
| 0.435396
| 14.357812
| 0.517038
| 46.337544
| false
| false
|
2025-02-14
|
2025-02-15
| 1
|
nbeerbower/Dumpling-Qwen2.5-14B (Merge)
|
nbeerbower_Dumpling-Qwen2.5-7B-1k-r16_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Dumpling-Qwen2.5-7B-1k-r16" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Dumpling-Qwen2.5-7B-1k-r16</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Dumpling-Qwen2.5-7B-1k-r16-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Dumpling-Qwen2.5-7B-1k-r16
|
5295fc4f2ebe7124ce53808dfe8796e2ede7b53c
| 25.295662
|
apache-2.0
| 2
| 7.616
| true
| false
| false
| true
| 0.633828
| 0.486
| 48.600048
| 0.521423
| 32.10173
| 0.236405
| 23.640483
| 0.270134
| 2.684564
| 0.42299
| 11.873698
| 0.395861
| 32.873449
| false
| false
|
2025-01-31
|
2025-02-21
| 1
|
nbeerbower/Dumpling-Qwen2.5-7B-1k-r16 (Merge)
|
nbeerbower_Dumpling-Qwen2.5-7B-1k-r64-2e-5_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Dumpling-Qwen2.5-7B-1k-r64-2e-5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Dumpling-Qwen2.5-7B-1k-r64-2e-5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Dumpling-Qwen2.5-7B-1k-r64-2e-5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Dumpling-Qwen2.5-7B-1k-r64-2e-5
|
e8de9ede88cc20ec887b80a9f722f562cc5a065f
| 25.052668
|
apache-2.0
| 0
| 7.616
| true
| false
| false
| true
| 0.677231
| 0.417907
| 41.790671
| 0.530055
| 33.867115
| 0.21148
| 21.148036
| 0.270134
| 2.684564
| 0.448604
| 16.142188
| 0.412151
| 34.683437
| false
| false
|
2025-02-02
|
2025-02-21
| 1
|
nbeerbower/Dumpling-Qwen2.5-7B-1k-r64-2e-5 (Merge)
|
nbeerbower_EVA-abliterated-TIES-Qwen2.5-1.5B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/EVA-abliterated-TIES-Qwen2.5-1.5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/EVA-abliterated-TIES-Qwen2.5-1.5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__EVA-abliterated-TIES-Qwen2.5-1.5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/EVA-abliterated-TIES-Qwen2.5-1.5B
|
dc0152d7482236dacce20f1f5a3d184073ff01b6
| 15.375831
|
apache-2.0
| 0
| 1.777
| true
| false
| false
| true
| 1.230459
| 0.411487
| 41.148708
| 0.399656
| 15.218366
| 0.137462
| 13.746224
| 0.265101
| 2.013423
| 0.350188
| 1.106771
| 0.271193
| 19.021498
| true
| false
|
2025-01-29
|
2025-01-29
| 1
|
nbeerbower/EVA-abliterated-TIES-Qwen2.5-1.5B (Merge)
|
nbeerbower_EVA-abliterated-TIES-Qwen2.5-14B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/EVA-abliterated-TIES-Qwen2.5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/EVA-abliterated-TIES-Qwen2.5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__EVA-abliterated-TIES-Qwen2.5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/EVA-abliterated-TIES-Qwen2.5-14B
|
610b00c3b61b7ff606bbd0fb3608d8236c825869
| 42.164426
|
apache-2.0
| 1
| 14.77
| true
| false
| false
| true
| 1.395556
| 0.783554
| 78.35543
| 0.637202
| 48.522478
| 0.504532
| 50.453172
| 0.354866
| 13.982103
| 0.440667
| 14.883333
| 0.52111
| 46.790041
| true
| false
|
2025-02-08
|
2025-02-09
| 1
|
nbeerbower/EVA-abliterated-TIES-Qwen2.5-14B (Merge)
|
nbeerbower_Flammades-Mistral-Nemo-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Flammades-Mistral-Nemo-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Flammades-Mistral-Nemo-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Flammades-Mistral-Nemo-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Flammades-Mistral-Nemo-12B
|
ddc76d1976af06aedc7f06bbffcaa34166c1cbdd
| 22.566724
|
apache-2.0
| 2
| 12.248
| true
| false
| false
| false
| 3.25484
| 0.38416
| 38.415959
| 0.529961
| 32.393772
| 0.075529
| 7.55287
| 0.303691
| 7.158837
| 0.480625
| 20.311458
| 0.366107
| 29.56745
| false
| false
|
2024-10-05
|
2024-10-06
| 1
|
nbeerbower/Flammades-Mistral-Nemo-12B (Merge)
|
nbeerbower_Gemma2-Gutenberg-Doppel-9B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Gemma2-Gutenberg-Doppel-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Gemma2-Gutenberg-Doppel-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Gemma2-Gutenberg-Doppel-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Gemma2-Gutenberg-Doppel-9B
|
f425bc69783891088e89e0afe44ec62b730567ba
| 32.542444
|
gemma
| 4
| 9.242
| true
| false
| false
| false
| 3.894631
| 0.717109
| 71.710949
| 0.587011
| 41.083063
| 0.197885
| 19.78852
| 0.329698
| 10.626398
| 0.460781
| 17.297656
| 0.412733
| 34.748079
| false
| false
|
2024-09-29
|
2024-10-01
| 1
|
nbeerbower/Gemma2-Gutenberg-Doppel-9B (Merge)
|
nbeerbower_Gutensuppe-mistral-nemo-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Gutensuppe-mistral-nemo-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Gutensuppe-mistral-nemo-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Gutensuppe-mistral-nemo-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Gutensuppe-mistral-nemo-12B
|
6ee13f347071bc3c4ee95c9dc3488a4093927143
| 22.294222
| 6
| 12.248
| false
| false
| false
| false
| 3.112499
| 0.291611
| 29.16107
| 0.548683
| 35.569348
| 0.132931
| 13.293051
| 0.337248
| 11.63311
| 0.429031
| 14.328906
| 0.368019
| 29.779846
| false
| false
|
2024-08-23
|
2024-09-03
| 1
|
nbeerbower/Gutensuppe-mistral-nemo-12B (Merge)
|
|
nbeerbower_Hermes2-Gutenberg2-Mistral-7B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Hermes2-Gutenberg2-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Hermes2-Gutenberg2-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Hermes2-Gutenberg2-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Hermes2-Gutenberg2-Mistral-7B
|
5eec0dfd29999ef1d7775010b7e9c7be9ed89bfd
| 19.363861
|
apache-2.0
| 2
| 7.242
| true
| false
| false
| false
| 1.162242
| 0.372145
| 37.21448
| 0.498145
| 28.907335
| 0.057402
| 5.740181
| 0.28943
| 5.257271
| 0.462302
| 16.921094
| 0.299285
| 22.142804
| false
| false
|
2024-09-30
|
2024-10-01
| 1
|
nbeerbower/Hermes2-Gutenberg2-Mistral-7B (Merge)
|
nbeerbower_Kartoffel-Deepfry-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Kartoffel-Deepfry-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Kartoffel-Deepfry-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Kartoffel-Deepfry-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Kartoffel-Deepfry-12B
|
8c4da9fb61da7561424f8a20a6196a8a817b7430
| 24.147488
|
apache-2.0
| 1
| 12.248
| true
| false
| false
| true
| 0.802052
| 0.502162
| 50.216204
| 0.536537
| 33.754974
| 0.060423
| 6.042296
| 0.296141
| 6.152125
| 0.479167
| 20.029167
| 0.358211
| 28.69016
| false
| false
|
2025-03-01
|
2025-03-01
| 1
|
nbeerbower/Kartoffel-Deepfry-12B (Merge)
|
nbeerbower_Llama-3.1-Nemotron-lorablated-70B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Llama-3.1-Nemotron-lorablated-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Llama-3.1-Nemotron-lorablated-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Llama-3.1-Nemotron-lorablated-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Llama-3.1-Nemotron-lorablated-70B
|
f335a582cdb7fb0e63a7343a908766ebd0ed9882
| 40.87645
|
llama3.1
| 15
| 70.554
| true
| false
| false
| false
| 70.178639
| 0.72288
| 72.287974
| 0.682505
| 54.182581
| 0.333837
| 33.383686
| 0.39094
| 18.791946
| 0.468167
| 18.354167
| 0.534325
| 48.258348
| true
| false
|
2024-10-17
|
2024-11-27
| 1
|
nbeerbower/Llama-3.1-Nemotron-lorablated-70B (Merge)
|
nbeerbower_Llama3.1-Gutenberg-Doppel-70B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Llama3.1-Gutenberg-Doppel-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Llama3.1-Gutenberg-Doppel-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Llama3.1-Gutenberg-Doppel-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Llama3.1-Gutenberg-Doppel-70B
|
5de156e97f776ce1b88ce5b2e2dc1e7709205a82
| 36.92339
|
llama3.1
| 5
| 70.554
| true
| false
| false
| true
| 19.987186
| 0.709216
| 70.921599
| 0.666089
| 52.556779
| 0.212236
| 21.223565
| 0.344799
| 12.639821
| 0.489719
| 22.68151
| 0.473654
| 41.517066
| false
| false
|
2024-10-11
|
2024-10-12
| 1
|
nbeerbower/Llama3.1-Gutenberg-Doppel-70B (Merge)
|
nbeerbower_Lyra-Gutenberg-mistral-nemo-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Lyra-Gutenberg-mistral-nemo-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Lyra-Gutenberg-mistral-nemo-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Lyra-Gutenberg-mistral-nemo-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Lyra-Gutenberg-mistral-nemo-12B
|
5c506391eb02075e02f4cf5953b443505d646bce
| 22.867364
|
cc-by-nc-4.0
| 20
| 12.248
| true
| false
| false
| true
| 3.837204
| 0.349488
| 34.948825
| 0.558625
| 36.992432
| 0.101208
| 10.120846
| 0.333893
| 11.185682
| 0.435667
| 14.758333
| 0.362783
| 29.198064
| false
| false
|
2024-08-23
|
2024-09-03
| 1
|
nbeerbower/Lyra-Gutenberg-mistral-nemo-12B (Merge)
|
nbeerbower_Lyra4-Gutenberg-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Lyra4-Gutenberg-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Lyra4-Gutenberg-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Lyra4-Gutenberg-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Lyra4-Gutenberg-12B
|
cb6911be3475da99a810071c04803d6edfb5965b
| 19.844119
|
cc-by-nc-4.0
| 20
| 12.248
| true
| false
| false
| false
| 3.381067
| 0.221219
| 22.121859
| 0.538669
| 34.235593
| 0.129909
| 12.990937
| 0.318792
| 9.17226
| 0.403792
| 11.973958
| 0.357131
| 28.570109
| false
| false
|
2024-09-09
|
2024-09-12
| 1
|
nbeerbower/Lyra4-Gutenberg-12B (Merge)
|
nbeerbower_Lyra4-Gutenberg2-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Lyra4-Gutenberg2-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Lyra4-Gutenberg2-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Lyra4-Gutenberg2-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Lyra4-Gutenberg2-12B
|
6a5f117695cc729de16da87654b979e6df72ed2f
| 19.944882
|
cc-by-nc-4.0
| 12
| 12.248
| true
| false
| false
| false
| 3.618679
| 0.258513
| 25.851297
| 0.534453
| 33.73064
| 0.117069
| 11.706949
| 0.312919
| 8.389262
| 0.397219
| 11.485677
| 0.356549
| 28.505467
| false
| false
|
2024-09-29
|
2024-10-01
| 1
|
nbeerbower/Lyra4-Gutenberg2-12B (Merge)
|
nbeerbower_Mahou-1.5-mistral-nemo-12B-lorablated_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mahou-1.5-mistral-nemo-12B-lorablated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mahou-1.5-mistral-nemo-12B-lorablated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mahou-1.5-mistral-nemo-12B-lorablated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mahou-1.5-mistral-nemo-12B-lorablated
|
8c9eecaace50659647c7d8b569237ad068a6c837
| 27.050923
|
apache-2.0
| 3
| 12.248
| true
| false
| false
| true
| 2.810849
| 0.682488
| 68.248802
| 0.549604
| 36.077381
| 0.089124
| 8.912387
| 0.279362
| 3.914989
| 0.452167
| 16.554167
| 0.35738
| 28.597813
| true
| false
|
2024-10-19
|
2024-10-19
| 1
|
nbeerbower/Mahou-1.5-mistral-nemo-12B-lorablated (Merge)
|
nbeerbower_Mistral-Gutenberg-Doppel-7B-FFT_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Gutenberg-Doppel-7B-FFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Gutenberg-Doppel-7B-FFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Gutenberg-Doppel-7B-FFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Gutenberg-Doppel-7B-FFT
|
5735876465b6f2523fdedb73120c3f97d04556d3
| 18.338276
|
apache-2.0
| 2
| 7.242
| true
| false
| false
| true
| 0.873633
| 0.57168
| 57.167981
| 0.407625
| 17.346575
| 0.024924
| 2.492447
| 0.283557
| 4.474273
| 0.405938
| 9.342188
| 0.272856
| 19.206191
| false
| false
|
2024-11-18
|
2024-11-18
| 1
|
nbeerbower/Mistral-Gutenberg-Doppel-7B-FFT (Merge)
|
nbeerbower_Mistral-Nemo-Gutenberg-Doppel-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Nemo-Gutenberg-Doppel-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B
|
0eaaac89d4b53e94d5b78220b24439a026ee29e6
| 21.537987
|
apache-2.0
| 5
| 12.248
| true
| false
| false
| false
| 3.553543
| 0.356707
| 35.670687
| 0.527461
| 32.421527
| 0.121601
| 12.160121
| 0.316275
| 8.836689
| 0.413219
| 11.485677
| 0.357879
| 28.653221
| false
| false
|
2024-09-26
|
2024-09-26
| 1
|
nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B (Merge)
|
nbeerbower_Mistral-Nemo-Gutenberg-Doppel-12B-v2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Nemo-Gutenberg-Doppel-12B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2
|
adc1ccd9d83d24e41bed895f989803af87ea2d2c
| 25.901264
|
apache-2.0
| 9
| 12.248
| true
| false
| false
| true
| 2.809713
| 0.653587
| 65.358693
| 0.53745
| 34.357413
| 0.115559
| 11.555891
| 0.270973
| 2.796421
| 0.423302
| 13.046094
| 0.354638
| 28.29307
| false
| false
|
2024-10-04
|
2024-10-09
| 1
|
nbeerbower/Mistral-Nemo-Gutenberg-Doppel-12B-v2 (Merge)
|
nbeerbower_Mistral-Nemo-Moderne-12B-FFT-experimental_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Nemo-Moderne-12B-FFT-experimental" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Nemo-Moderne-12B-FFT-experimental</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Nemo-Moderne-12B-FFT-experimental-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Nemo-Moderne-12B-FFT-experimental
|
e52f7b7c3ade2a6212f29dd1054332cee21ab85d
| 18.119466
|
apache-2.0
| 1
| 12.248
| true
| false
| false
| true
| 2.434315
| 0.335225
| 33.522498
| 0.523409
| 32.07154
| 0.077039
| 7.703927
| 0.28104
| 4.138702
| 0.37149
| 4.002865
| 0.345495
| 27.277261
| false
| false
|
2024-11-19
|
2024-11-26
| 1
|
nbeerbower/Mistral-Nemo-Moderne-12B-FFT-experimental (Merge)
|
nbeerbower_Mistral-Nemo-Prism-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Nemo-Prism-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Nemo-Prism-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Nemo-Prism-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Nemo-Prism-12B
|
a39e1c8c083c172aaa3ca81faf8ba3b4799a888f
| 27.924007
|
apache-2.0
| 3
| 12.248
| true
| false
| false
| true
| 1.918576
| 0.68581
| 68.581032
| 0.547519
| 35.918008
| 0.086858
| 8.685801
| 0.307886
| 7.718121
| 0.462615
| 17.960156
| 0.358128
| 28.680925
| false
| false
|
2024-11-12
|
2024-11-12
| 1
|
nbeerbower/Mistral-Nemo-Prism-12B (Merge)
|
nbeerbower_Mistral-Nemo-Prism-12B-v2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Nemo-Prism-12B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Nemo-Prism-12B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Nemo-Prism-12B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Nemo-Prism-12B-v2
|
d7545999274cb56b5f961580b5234e8a647e023a
| 28.070465
|
apache-2.0
| 4
| 12.248
| true
| false
| false
| true
| 1.871104
| 0.697401
| 69.740067
| 0.549188
| 36.199788
| 0.089124
| 8.912387
| 0.305369
| 7.38255
| 0.459979
| 17.664063
| 0.356715
| 28.523936
| false
| false
|
2024-11-12
|
2024-11-26
| 1
|
nbeerbower/Mistral-Nemo-Prism-12B-v2 (Merge)
|
nbeerbower_Mistral-Nemo-Prism-12B-v7_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Nemo-Prism-12B-v7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Nemo-Prism-12B-v7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Nemo-Prism-12B-v7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Nemo-Prism-12B-v7
|
0c9da9f3903be14fda1fcae245c22f873442b86f
| 28.034788
|
apache-2.0
| 6
| 12.248
| true
| false
| false
| true
| 1.951022
| 0.696152
| 69.615177
| 0.55211
| 36.440017
| 0.086858
| 8.685801
| 0.299497
| 6.599553
| 0.463885
| 18.085677
| 0.359043
| 28.782506
| false
| false
|
2024-11-13
|
2024-11-26
| 1
|
nbeerbower/Mistral-Nemo-Prism-12B-v7 (Merge)
|
nbeerbower_Mistral-Small-Drummer-22B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Small-Drummer-22B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Small-Drummer-22B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Small-Drummer-22B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Small-Drummer-22B
|
53b21ece0c64ffc8aba81f294ad19e2c06e9852c
| 29.819409
|
other
| 14
| 22.247
| true
| false
| false
| false
| 3.225443
| 0.633129
| 63.312899
| 0.57932
| 40.12177
| 0.188822
| 18.882175
| 0.343121
| 12.416107
| 0.406365
| 9.795573
| 0.409491
| 34.387928
| false
| false
|
2024-09-26
|
2024-10-01
| 1
|
nbeerbower/Mistral-Small-Drummer-22B (Merge)
|
nbeerbower_Mistral-Small-Gutenberg-Doppel-22B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Small-Gutenberg-Doppel-22B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Small-Gutenberg-Doppel-22B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Small-Gutenberg-Doppel-22B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Mistral-Small-Gutenberg-Doppel-22B
|
d8091aad5f882b714321e4d51f504cc61996ee67
| 27.97204
|
other
| 11
| 22.247
| true
| false
| false
| false
| 3.177206
| 0.489323
| 48.932277
| 0.585893
| 40.931345
| 0.218278
| 21.827795
| 0.346477
| 12.863535
| 0.397063
| 8.566146
| 0.4124
| 34.711141
| false
| false
|
2024-09-25
|
2024-09-25
| 1
|
nbeerbower/Mistral-Small-Gutenberg-Doppel-22B (Merge)
|
nbeerbower_Nemo-Loony-12B-experimental_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Nemo-Loony-12B-experimental" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Nemo-Loony-12B-experimental</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Nemo-Loony-12B-experimental-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Nemo-Loony-12B-experimental
|
7b06f30502a9b58c028ac1079e1b3d2988b76866
| 10.469567
| 0
| 12.248
| false
| false
| false
| true
| 2.475163
| 0.373444
| 37.344357
| 0.382222
| 12.974588
| 0.015106
| 1.510574
| 0.270134
| 2.684564
| 0.334063
| 1.757812
| 0.15891
| 6.545508
| false
| false
|
2024-11-26
|
2024-11-26
| 1
|
nbeerbower/Nemo-Loony-12B-experimental (Merge)
|
nbeerbower_Nemoties-ChatML-12B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Nemoties-ChatML-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Nemoties-ChatML-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Nemoties-ChatML-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Nemoties-ChatML-12B
|
5e088ff3e7e9d09a868be5db3a8d5a03c2e7dd16
| 26.439235
| 1
| 12.248
| false
| false
| false
| true
| 1.605363
| 0.6382
| 63.819998
| 0.547025
| 35.765795
| 0.07855
| 7.854985
| 0.29698
| 6.263982
| 0.450865
| 16.591406
| 0.355053
| 28.339243
| false
| false
|
2025-02-21
|
2025-02-21
| 1
|
nbeerbower/Nemoties-ChatML-12B (Merge)
|
nbeerbower_Qwen2.5-Gutenberg-Doppel-14B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Qwen2.5-Gutenberg-Doppel-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Qwen2.5-Gutenberg-Doppel-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Qwen2.5-Gutenberg-Doppel-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Qwen2.5-Gutenberg-Doppel-14B
|
11a5060f9e7315ea07241106f086ac4694dded60
| 41.327795
|
apache-2.0
| 12
| 14.77
| true
| false
| false
| true
| 3.381224
| 0.809083
| 80.908323
| 0.638174
| 48.238909
| 0.541541
| 54.154079
| 0.333054
| 11.073826
| 0.410063
| 10.024479
| 0.492104
| 43.567154
| false
| false
|
2024-11-11
|
2024-11-11
| 1
|
nbeerbower/Qwen2.5-Gutenberg-Doppel-14B (Merge)
|
nbeerbower_SmolNemo-12B-FFT-experimental_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/SmolNemo-12B-FFT-experimental" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/SmolNemo-12B-FFT-experimental</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__SmolNemo-12B-FFT-experimental-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/SmolNemo-12B-FFT-experimental
|
d8d7a90ae9b9cb79cdc0912a685c3cb8d7a25560
| 8.496288
|
apache-2.0
| 0
| 12.248
| true
| false
| false
| true
| 2.45083
| 0.334801
| 33.480055
| 0.333609
| 6.542439
| 0.01284
| 1.283988
| 0.260067
| 1.342282
| 0.384698
| 5.920573
| 0.121676
| 2.408392
| false
| false
|
2024-11-25
|
2024-11-26
| 1
|
nbeerbower/SmolNemo-12B-FFT-experimental (Merge)
|
nbeerbower_Stella-mistral-nemo-12B-v2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/Stella-mistral-nemo-12B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Stella-mistral-nemo-12B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Stella-mistral-nemo-12B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/Stella-mistral-nemo-12B-v2
|
b81bab28f7dcb25a0aa0fe4dcf957f3083ee6b43
| 22.49331
| 5
| 12.248
| false
| false
| false
| false
| 3.481744
| 0.327431
| 32.743122
| 0.548375
| 35.364516
| 0.116314
| 11.63142
| 0.332215
| 10.961969
| 0.430396
| 14.432812
| 0.368434
| 29.82602
| false
| false
|
2024-09-07
|
2024-09-14
| 1
|
nbeerbower/Stella-mistral-nemo-12B-v2 (Merge)
|
nbeerbower_gemma2-gutenberg-27B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/gemma2-gutenberg-27B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/gemma2-gutenberg-27B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__gemma2-gutenberg-27B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/gemma2-gutenberg-27B
|
d4febe52e8b7b13a98126dbf1716ed1329f48922
| 10.423664
|
gemma
| 6
| 27.227
| true
| false
| false
| false
| 15.390917
| 0.294708
| 29.470804
| 0.379657
| 13.091525
| 0.018882
| 1.888218
| 0.272651
| 3.020134
| 0.372729
| 4.157813
| 0.198221
| 10.91349
| false
| false
|
2024-09-09
|
2024-09-23
| 1
|
nbeerbower/gemma2-gutenberg-27B (Merge)
|
nbeerbower_gemma2-gutenberg-9B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Gemma2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/gemma2-gutenberg-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/gemma2-gutenberg-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__gemma2-gutenberg-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/gemma2-gutenberg-9B
|
ebdab2d41f257fc9e7c858498653644d13386ce5
| 23.719246
|
gemma
| 12
| 9.242
| true
| false
| false
| false
| 5.619218
| 0.279595
| 27.959481
| 0.59509
| 42.355611
| 0.080816
| 8.081571
| 0.338087
| 11.744966
| 0.45951
| 16.705469
| 0.419215
| 35.468381
| false
| false
|
2024-07-14
|
2024-08-03
| 1
|
nbeerbower/gemma2-gutenberg-9B (Merge)
|
nbeerbower_llama-3-gutenberg-8B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/llama-3-gutenberg-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/llama-3-gutenberg-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__llama-3-gutenberg-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/llama-3-gutenberg-8B
|
4ed3aac5e30c078bee79ae193c2d301d38860b20
| 21.308817
|
other
| 8
| 8.03
| true
| false
| false
| false
| 1.767139
| 0.437191
| 43.71911
| 0.49936
| 27.958133
| 0.07855
| 7.854985
| 0.301174
| 6.823266
| 0.407302
| 10.046094
| 0.383062
| 31.451315
| false
| false
|
2024-05-05
|
2024-07-10
| 1
|
nbeerbower/llama-3-gutenberg-8B (Merge)
|
nbeerbower_llama3.1-cc-8B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/llama3.1-cc-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/llama3.1-cc-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__llama3.1-cc-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/llama3.1-cc-8B
|
5269bb26f1afe005f144564f484e7554f185239f
| 20.256042
|
llama3
| 1
| 8.03
| true
| false
| false
| false
| 1.874475
| 0.506809
| 50.68086
| 0.487119
| 26.483812
| 0.070997
| 7.099698
| 0.285235
| 4.697987
| 0.38851
| 6.497135
| 0.334691
| 26.076758
| false
| false
|
2024-08-18
|
2024-09-14
| 1
|
nbeerbower/llama3.1-cc-8B (Merge)
|
nbeerbower_llama3.1-kartoffeldes-70B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/llama3.1-kartoffeldes-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/llama3.1-kartoffeldes-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__llama3.1-kartoffeldes-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/llama3.1-kartoffeldes-70B
|
4377c98f475f2af018fa4fef77f12106001bc1bf
| 41.11056
|
llama3.1
| 0
| 70.554
| true
| false
| false
| true
| 23.335034
| 0.823022
| 82.30218
| 0.689388
| 55.593933
| 0.321752
| 32.175227
| 0.35151
| 13.534676
| 0.464604
| 18.742187
| 0.498836
| 44.31516
| false
| false
|
2025-01-19
|
2025-01-19
| 1
|
nbeerbower/llama3.1-kartoffeldes-70B (Merge)
|
nbeerbower_mistral-nemo-bophades-12B_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-bophades-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-bophades-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-bophades-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/mistral-nemo-bophades-12B
|
252a358e099f77a0a28125e00a57aa3a107b3910
| 25.728602
|
apache-2.0
| 9
| 12.248
| true
| false
| false
| true
| 4.104693
| 0.679441
| 67.944055
| 0.498847
| 29.543905
| 0.123112
| 12.311178
| 0.285235
| 4.697987
| 0.417781
| 12.089323
| 0.350066
| 27.785165
| false
| false
|
2024-08-13
|
2024-09-03
| 1
|
nbeerbower/mistral-nemo-bophades-12B (Merge)
|
nbeerbower_mistral-nemo-bophades3-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-bophades3-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-bophades3-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-bophades3-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/mistral-nemo-bophades3-12B
|
0cb218eece0991c7ff585fb31fe82f883fc71a55
| 27.165587
|
apache-2.0
| 2
| 12.248
| true
| false
| false
| true
| 1.877824
| 0.657784
| 65.778357
| 0.544933
| 35.244657
| 0.084592
| 8.459215
| 0.312081
| 8.277405
| 0.460448
| 18.889323
| 0.337101
| 26.344563
| false
| false
|
2025-01-13
|
2025-01-13
| 1
|
nbeerbower/mistral-nemo-bophades3-12B (Merge)
|
nbeerbower_mistral-nemo-cc-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-cc-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-cc-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-cc-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/mistral-nemo-cc-12B
|
fc32293e0b022d6daef9bfdb0c54d57a5226bf9a
| 17.20341
|
apache-2.0
| 1
| 12.248
| true
| false
| false
| false
| 2.989245
| 0.143532
| 14.353249
| 0.539941
| 34.446547
| 0.02568
| 2.567976
| 0.315436
| 8.724832
| 0.442365
| 14.26224
| 0.359791
| 28.865618
| false
| false
|
2024-08-18
|
2024-09-14
| 1
|
nbeerbower/mistral-nemo-cc-12B (Merge)
|
nbeerbower_mistral-nemo-gutades-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-gutades-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-gutades-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-gutades-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/mistral-nemo-gutades-12B
|
5689f929808a6165f94ba43f872b944a4bdaaea3
| 21.075925
|
apache-2.0
| 2
| 12.248
| true
| false
| false
| false
| 3.64924
| 0.342519
| 34.251896
| 0.540719
| 34.574408
| 0.117825
| 11.782477
| 0.315436
| 8.724832
| 0.404042
| 8.671875
| 0.356051
| 28.450059
| false
| false
|
2024-09-17
|
2024-09-23
| 1
|
nbeerbower/mistral-nemo-gutades-12B (Merge)
|
nbeerbower_mistral-nemo-gutenberg-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-gutenberg-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-gutenberg-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-gutenberg-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/mistral-nemo-gutenberg-12B
|
6aeb6f769a53eb111839db8f439b614730e39593
| 21.024155
|
apache-2.0
| 8
| 12.248
| true
| false
| false
| false
| 3.149629
| 0.350387
| 35.038697
| 0.528136
| 32.433874
| 0.116314
| 11.63142
| 0.307047
| 7.606264
| 0.417063
| 10.966146
| 0.356217
| 28.468528
| false
| false
|
2024-08-12
|
2024-09-03
| 1
|
nbeerbower/mistral-nemo-gutenberg-12B (Merge)
|
nbeerbower_mistral-nemo-gutenberg-12B-v2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-gutenberg-12B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-gutenberg-12B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-gutenberg-12B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/mistral-nemo-gutenberg-12B-v2
|
86bf9c105ff40835132e41699ac1a76ee0e5b683
| 25.51432
|
apache-2.0
| 34
| 12.248
| true
| false
| false
| true
| 4.211238
| 0.62034
| 62.033959
| 0.53972
| 34.730616
| 0.108761
| 10.876133
| 0.277685
| 3.691275
| 0.428698
| 13.98724
| 0.3499
| 27.766696
| false
| false
|
2024-08-13
|
2024-09-03
| 1
|
nbeerbower/mistral-nemo-gutenberg-12B-v2 (Merge)
|
nbeerbower_mistral-nemo-gutenberg-12B-v3_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-gutenberg-12B-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-gutenberg-12B-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-gutenberg-12B-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/mistral-nemo-gutenberg-12B-v3
|
3e1a716281f23280abd72e402139c578faca175a
| 19.290512
|
apache-2.0
| 11
| 12.248
| true
| false
| false
| false
| 3.670738
| 0.218271
| 21.827085
| 0.544066
| 34.957915
| 0.059668
| 5.966767
| 0.314597
| 8.612975
| 0.445031
| 14.995573
| 0.364445
| 29.382757
| false
| false
|
2024-08-15
|
2024-09-03
| 1
|
nbeerbower/mistral-nemo-gutenberg-12B-v3 (Merge)
|
nbeerbower_mistral-nemo-gutenberg-12B-v4_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-gutenberg-12B-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-gutenberg-12B-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-gutenberg-12B-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/mistral-nemo-gutenberg-12B-v4
|
59409afe585ae6945a588c867f879a9d31e571e6
| 19.838981
|
apache-2.0
| 20
| 12.248
| true
| false
| false
| false
| 3.520923
| 0.23793
| 23.79298
| 0.526903
| 31.971258
| 0.126133
| 12.613293
| 0.316275
| 8.836689
| 0.410427
| 13.203385
| 0.357547
| 28.616283
| false
| false
|
2024-08-22
|
2024-09-03
| 1
|
nbeerbower/mistral-nemo-gutenberg-12B-v4 (Merge)
|
nbeerbower_mistral-nemo-gutenberg2-12B-test_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-gutenberg2-12B-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-gutenberg2-12B-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-gutenberg2-12B-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/mistral-nemo-gutenberg2-12B-test
|
10da6150b0bedf8fd59206d72c4c0335ac665df3
| 20.97058
|
apache-2.0
| 1
| 12.248
| true
| false
| false
| false
| 3.350054
| 0.338472
| 33.847192
| 0.525478
| 32.044759
| 0.116314
| 11.63142
| 0.317114
| 8.948546
| 0.415729
| 10.966146
| 0.355469
| 28.385417
| false
| false
|
2024-09-24
|
2024-09-25
| 1
|
nbeerbower/mistral-nemo-gutenberg2-12B-test (Merge)
|
nbeerbower_mistral-nemo-kartoffel-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-kartoffel-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-kartoffel-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-kartoffel-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/mistral-nemo-kartoffel-12B
|
4e0cfd2462bd5761ec200a4bd6ebb20e5fa4a9ad
| 28.219958
|
apache-2.0
| 4
| 12.248
| true
| false
| false
| true
| 1.529124
| 0.703171
| 70.317092
| 0.54838
| 36.052533
| 0.085347
| 8.534743
| 0.30453
| 7.270694
| 0.465281
| 18.426823
| 0.358461
| 28.717863
| false
| false
|
2025-01-12
|
2025-01-12
| 1
|
nbeerbower/mistral-nemo-kartoffel-12B (Merge)
|
nbeerbower_mistral-nemo-narwhal-12B_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-narwhal-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-narwhal-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-narwhal-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/mistral-nemo-narwhal-12B
|
9384d7e572a09181c79e19d934ab865f7a7d4efc
| 21.240878
|
apache-2.0
| 1
| 12.248
| true
| false
| false
| true
| 1.88525
| 0.554919
| 55.491873
| 0.505737
| 29.56279
| 0.058157
| 5.81571
| 0.270973
| 2.796421
| 0.384698
| 6.18724
| 0.348321
| 27.591238
| false
| false
|
2025-01-13
|
2025-01-13
| 1
|
nbeerbower/mistral-nemo-narwhal-12B (Merge)
|
nbeerbower_mistral-nemo-wissenschaft-12B_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-wissenschaft-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-wissenschaft-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-wissenschaft-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbeerbower/mistral-nemo-wissenschaft-12B
|
2480f9924415c72fe00ae9391bb15a6d05c889eb
| 25.509926
|
apache-2.0
| 8
| 12.248
| true
| false
| false
| true
| 2.858747
| 0.652013
| 65.201332
| 0.504031
| 29.567999
| 0.121601
| 12.160121
| 0.292785
| 5.704698
| 0.417781
| 12.289323
| 0.353225
| 28.136082
| false
| false
|
2024-08-12
|
2024-08-30
| 1
|
nbeerbower/mistral-nemo-wissenschaft-12B (Merge)
|
nbrahme_IndusQ_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
GPT2LMHeadModel
|
<a target="_blank" href="https://huggingface.co/nbrahme/IndusQ" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbrahme/IndusQ</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbrahme__IndusQ-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
nbrahme/IndusQ
|
d4224f753e6a2d6e7476752fb927c26c55ec9467
| 5.636134
|
osl-3.0
| 0
| 1.176
| true
| false
| false
| true
| 0.301234
| 0.243975
| 24.397488
| 0.30624
| 3.747096
| 0.000755
| 0.075529
| 0.265101
| 2.013423
| 0.336635
| 2.246094
| 0.112035
| 1.337175
| false
| false
|
2024-09-16
|
2024-09-18
| 0
|
nbrahme/IndusQ
|
necva_IE-cont-Llama3.1-8B_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/necva/IE-cont-Llama3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">necva/IE-cont-Llama3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/necva__IE-cont-Llama3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
necva/IE-cont-Llama3.1-8B
|
7c56e751113c503f77205f2ac70b52bd5918a15d
| 5.093187
|
mit
| 0
| 8.03
| true
| false
| false
| false
| 0.732557
| 0.204907
| 20.490742
| 0.291178
| 2.347041
| 0
| 0
| 0.260067
| 1.342282
| 0.357531
| 4.52474
| 0.116689
| 1.854314
| false
| false
|
2025-03-05
|
2025-03-06
| 1
|
necva/IE-cont-Llama3.1-8B (Merge)
|
necva_replica-IEPile_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/necva/replica-IEPile" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">necva/replica-IEPile</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/necva__replica-IEPile-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
necva/replica-IEPile
|
5f6ea567a0bd2addd2d4eb3c9d17ef40b18aab05
| 21.560585
| 0
| 4.65
| false
| false
| false
| false
| 1.203073
| 0.467791
| 46.779102
| 0.477858
| 25.316519
| 0.123867
| 12.386707
| 0.306208
| 7.494407
| 0.39976
| 8.936719
| 0.356051
| 28.450059
| false
| false
|
2025-03-01
| 0
|
Removed
|
||
neopolita_jessi-v0.1-bf16-falcon3-7b-instruct_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/neopolita/jessi-v0.1-bf16-falcon3-7b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">neopolita/jessi-v0.1-bf16-falcon3-7b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/neopolita__jessi-v0.1-bf16-falcon3-7b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
neopolita/jessi-v0.1-bf16-falcon3-7b-instruct
|
da9877089d4975e84e10cbb026125aa5574a4dc7
| 34.924792
| 0
| 7.456
| false
| false
| false
| true
| 1.22835
| 0.752705
| 75.270504
| 0.551613
| 36.134679
| 0.380665
| 38.066465
| 0.302852
| 7.04698
| 0.48249
| 20.544531
| 0.39237
| 32.485594
| false
| false
|
2025-01-08
| 0
|
Removed
|
||
neopolita_jessi-v0.1-falcon3-10b-instruct_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/neopolita/jessi-v0.1-falcon3-10b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">neopolita/jessi-v0.1-falcon3-10b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/neopolita__jessi-v0.1-falcon3-10b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
neopolita/jessi-v0.1-falcon3-10b-instruct
|
f50e54e6bf8af96e770139a2ae67ce51d5c00f14
| 32.335608
| 0
| 10.306
| false
| false
| false
| true
| 1.581637
| 0.755153
| 75.515299
| 0.595288
| 41.440336
| 0.200151
| 20.015106
| 0.318792
| 9.17226
| 0.427854
| 12.448437
| 0.4188
| 35.422207
| false
| false
|
2025-01-19
| 0
|
Removed
|
||
neopolita_jessi-v0.1-qwen2.5-7b-instruct_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/neopolita/jessi-v0.1-qwen2.5-7b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">neopolita/jessi-v0.1-qwen2.5-7b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/neopolita__jessi-v0.1-qwen2.5-7b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
neopolita/jessi-v0.1-qwen2.5-7b-instruct
|
e73d1545a57a4c8e20df50d6641b08015fb039a3
| 32.775633
|
apache-2.0
| 0
| 7.616
| true
| false
| false
| false
| 5.537725
| 0.732672
| 73.267153
| 0.529232
| 33.342259
| 0.40861
| 40.861027
| 0.29698
| 6.263982
| 0.391365
| 7.053906
| 0.422789
| 35.86547
| false
| false
|
2025-01-08
|
2025-01-26
| 1
|
neopolita/jessi-v0.1-qwen2.5-7b-instruct (Merge)
|
neopolita_jessi-v0.1-virtuoso-small_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/neopolita/jessi-v0.1-virtuoso-small" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">neopolita/jessi-v0.1-virtuoso-small</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/neopolita__jessi-v0.1-virtuoso-small-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
neopolita/jessi-v0.1-virtuoso-small
|
b7cd58e26d637faf1576bee68bfa35432fa3ce11
| 38.869604
|
apache-2.0
| 1
| 14.77
| true
| false
| false
| true
| 3.123383
| 0.795919
| 79.591927
| 0.644286
| 48.860315
| 0.339879
| 33.987915
| 0.330537
| 10.738255
| 0.436167
| 14.154167
| 0.512965
| 45.885047
| false
| false
|
2025-01-15
|
2025-01-15
| 1
|
neopolita/jessi-v0.1-virtuoso-small (Merge)
|
neopolita_jessi-v0.2-falcon3-10b-instruct_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/neopolita/jessi-v0.2-falcon3-10b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">neopolita/jessi-v0.2-falcon3-10b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/neopolita__jessi-v0.2-falcon3-10b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
neopolita/jessi-v0.2-falcon3-10b-instruct
|
94f3286c5724f51b319ff19ed1befd55341e34c2
| 34.104501
| 0
| 10.306
| false
| false
| false
| true
| 1.7127
| 0.77681
| 77.680998
| 0.620485
| 45.021844
| 0.212236
| 21.223565
| 0.328859
| 10.514541
| 0.428135
| 12.916927
| 0.435422
| 37.269134
| false
| false
|
2025-01-21
| 0
|
Removed
|
||
neopolita_jessi-v0.2-falcon3-7b-instruct_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
| eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
|  |  |  |  |  |  | <a target="_blank" href="https://huggingface.co/neopolita/jessi-v0.2-falcon3-7b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">neopolita/jessi-v0.2-falcon3-7b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/neopolita__jessi-v0.2-falcon3-7b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | neopolita/jessi-v0.2-falcon3-7b-instruct | efd74223ccaf140aab43df0c9a271007e826124b | 29.040362 | other | 0 | 7.456 | true | false | false | true | 1.691995 | 0.577075 | 57.707549 | 0.536308 | 34.382892 | 0.253776 | 25.377644 | 0.317114 | 8.948546 | 0.447885 | 15.552344 | 0.390459 | 32.273197 | false | false | 2025-01-08 | 2025-01-08 | 1 | neopolita/jessi-v0.2-falcon3-7b-instruct (Merge) |
| neopolita_jessi-v0.3-falcon3-7b-instruct_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/neopolita/jessi-v0.3-falcon3-7b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">neopolita/jessi-v0.3-falcon3-7b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/neopolita__jessi-v0.3-falcon3-7b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | neopolita/jessi-v0.3-falcon3-7b-instruct | 6f95f4e365f8f087f6d037e72493ecff68d9298f | 31.561579 | other | 0 | 7.456 | true | false | false | true | 1.822888 | 0.750906 | 75.090648 | 0.538794 | 34.565268 | 0.188822 | 18.882175 | 0.319631 | 9.284116 | 0.469156 | 18.544531 | 0.397025 | 33.002733 | false | false | 2025-01-08 | 2025-01-08 | 1 | neopolita/jessi-v0.3-falcon3-7b-instruct (Merge) |
| neopolita_jessi-v0.4-falcon3-7b-instruct_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/neopolita/jessi-v0.4-falcon3-7b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">neopolita/jessi-v0.4-falcon3-7b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/neopolita__jessi-v0.4-falcon3-7b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | neopolita/jessi-v0.4-falcon3-7b-instruct | c27fe9b5f1c31a3c7e8a7f19be037a6d1fb1b090 | 35.582653 | other | 0 | 7.456 | true | false | false | true | 1.321472 | 0.760374 | 76.037359 | 0.552167 | 36.167444 | 0.376888 | 37.688822 | 0.302852 | 7.04698 | 0.497125 | 23.173958 | 0.400432 | 33.381353 | false | false | 2025-01-17 | 2025-01-17 | 1 | neopolita/jessi-v0.4-falcon3-7b-instruct (Merge) |
| neopolita_jessi-v0.5-falcon3-7b-instruct_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/neopolita/jessi-v0.5-falcon3-7b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">neopolita/jessi-v0.5-falcon3-7b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/neopolita__jessi-v0.5-falcon3-7b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | neopolita/jessi-v0.5-falcon3-7b-instruct | 1b70a742251a75c8a5fd047f0c9cdd5bffc27a43 | 35.17363 | other | 0 | 7.456 | true | false | false | true | 1.266255 | 0.741165 | 74.116455 | 0.558963 | 37.168069 | 0.373867 | 37.386707 | 0.311242 | 8.165548 | 0.486521 | 21.248437 | 0.396609 | 32.95656 | false | false | 2025-01-09 | 2025-01-09 | 1 | neopolita/jessi-v0.5-falcon3-7b-instruct (Merge) |
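Each benchmark appears twice in these rows: a Raw accuracy in [0, 1] and a scaled score in [0, 100]. For the leaderboard's multiple-choice benchmarks, the scaled score rescales the raw accuracy so that random guessing maps to 0 and a perfect score maps to 100. The sketch below illustrates that relationship, checked against the jessi-v0.2 row; the random-guess baselines (0.25 for 4-option GPQA, 0.1 for 10-option MMLU-PRO, 0 for IFEval) are assumptions based on the leaderboard's documented normalization, and BBH/MUSR aggregate per-subtask baselines, so a single formula does not reproduce those two columns exactly.

```python
def normalize(raw_score: float, random_baseline: float) -> float:
    """Rescale a raw accuracy to 0-100, where the random-guess
    baseline maps to 0 and a perfect score maps to 100."""
    score = (raw_score - random_baseline) / (1.0 - random_baseline) * 100.0
    return max(score, 0.0)  # below-chance results are clamped to 0

# Values taken from the jessi-v0.2 row above.
# GPQA is 4-way multiple choice, so the assumed baseline is 0.25:
print(normalize(0.317114, 0.25))   # ≈ 8.9485 (scaled column shows 8.948546)
# MMLU-PRO has 10 answer options, so the assumed baseline is 0.1:
print(normalize(0.390459, 0.1))    # ≈ 32.2732 (scaled column shows 32.273197)
# IFEval has no random baseline, so the raw value is simply x100:
print(normalize(0.577075, 0.0))    # ≈ 57.7075
```

Note how this explains why a raw GPQA accuracy of ~0.32 yields a scaled score below 9: the model is only slightly above the 25% chance floor.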