| Column | Dtype | Values / range |
|---|---|---|
| eval_name | string | length 12–111 |
| Precision | string | 3 distinct values |
| Type | string | 7 distinct values |
| T | string | 7 distinct values |
| Weight type | string | 2 distinct values |
| Architecture | string | 64 distinct values |
| Model | string | length 355–689 |
| fullname | string | length 4–102 |
| Model sha | string | length 0–40 |
| Average ⬆️ | float64 | 0.74–52.1 |
| Hub License | string | 27 distinct values |
| Hub ❤️ | int64 | 0–6.09k |
| #Params (B) | float64 | -1 to 141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04–187 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.22–0.83 |
| BBH | float64 | 0.25–76.7 |
| MATH Lvl 5 Raw | float64 | 0–0.71 |
| MATH Lvl 5 | float64 | 0–71.5 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.7 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 525 distinct values |
| Submission Date | string | 263 distinct values |
| Generation | int64 | 0–10 |
| Base Model | string | length 4–102 |
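Each record below corresponds to one evaluated model, with its fields given in the column order listed above. As a minimal sketch of how this table can be analyzed programmatically, the snippet below loads it with the `datasets` library and ranks models by the normalized average; the dataset repo id `open-llm-leaderboard/contents` is an assumption and is not confirmed by this page.

```python
# Minimal sketch for loading and ranking the leaderboard table.
# Assumption: the aggregated results are published as the Hugging Face
# dataset "open-llm-leaderboard/contents" (repo id assumed, not confirmed here).
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")
df = ds.to_pandas()

# Drop flagged entries and models no longer on the Hub, then sort by the
# leaderboard's normalized average score (column names taken from the schema above).
keep = (~df["Flagged"]) & (df["Available on the hub"])
cols = ["fullname", "#Params (B)", "Average ⬆️", "IFEval", "BBH",
        "MATH Lvl 5", "GPQA", "MUSR", "MMLU-PRO"]
top = df.loc[keep, cols].sort_values("Average ⬆️", ascending=False)

print(top.head(10).to_string(index=False))
```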
nvidia_Hymba-1.5B-Instruct_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
HymbaForCausalLM
|
[nvidia/Hymba-1.5B-Instruct](https://huggingface.co/nvidia/Hymba-1.5B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Hymba-1.5B-Instruct-details)
|
nvidia/Hymba-1.5B-Instruct
|
ffc758eefef247c0ee4d7ce41636562759027ce6
| 14.192384 |
other
| 224 | 1.523 | true | false | false | true | 13.425332 | 0.600906 | 60.09056 | 0.306713 | 4.591464 | 0.02719 | 2.719033 | 0.288591 | 5.145414 | 0.331583 | 1.047917 | 0.204039 | 11.559914 | false | true |
2024-10-31
|
2024-12-06
| 1 |
nvidia/Hymba-1.5B-Instruct (Merge)
|
nvidia_Llama-3.1-Minitron-4B-Depth-Base_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
[nvidia/Llama-3.1-Minitron-4B-Depth-Base](https://huggingface.co/nvidia/Llama-3.1-Minitron-4B-Depth-Base) [📑](https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Llama-3.1-Minitron-4B-Depth-Base-details)
|
nvidia/Llama-3.1-Minitron-4B-Depth-Base
|
40d82bc951b4f39e9c9e11176334250c30975098
| 11.658051 |
other
| 21 | 4.02 | true | false | false | false | 0.935381 | 0.160694 | 16.069363 | 0.41707 | 19.44411 | 0.019637 | 1.963746 | 0.263423 | 1.789709 | 0.401063 | 10.699479 | 0.279837 | 19.9819 | false | true |
2024-08-13
|
2024-09-25
| 0 |
nvidia/Llama-3.1-Minitron-4B-Depth-Base
|
nvidia_Llama-3.1-Nemotron-70B-Instruct-HF_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
[nvidia/Llama-3.1-Nemotron-70B-Instruct-HF](https://huggingface.co/nvidia/Llama-3.1-Nemotron-70B-Instruct-HF) [📑](https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Llama-3.1-Nemotron-70B-Instruct-HF-details)
|
nvidia/Llama-3.1-Nemotron-70B-Instruct-HF
|
250db5cf2323e04a6d2025a2ca2b94a95c439e88
| 36.907173 |
llama3.1
| 2,028 | 70.554 | true | false | false | true | 27.257495 | 0.738067 | 73.806722 | 0.6316 | 47.10953 | 0.426737 | 42.673716 | 0.258389 | 1.118568 | 0.43276 | 13.195052 | 0.491855 | 43.53945 | false | true |
2024-10-12
|
2024-10-16
| 2 |
meta-llama/Meta-Llama-3.1-70B
|
nvidia_Minitron-4B-Base_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
NemotronForCausalLM
|
[nvidia/Minitron-4B-Base](https://huggingface.co/nvidia/Minitron-4B-Base) [📑](https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Minitron-4B-Base-details)
|
nvidia/Minitron-4B-Base
|
d6321f64412982046a32d761701167e752fedc02
| 11.977737 |
other
| 133 | 4 | true | false | false | false | 2.378534 | 0.221794 | 22.179373 | 0.408388 | 17.215601 | 0.019637 | 1.963746 | 0.269295 | 2.572707 | 0.413375 | 9.938542 | 0.261968 | 17.996454 | false | true |
2024-07-19
|
2024-09-25
| 0 |
nvidia/Minitron-4B-Base
|
nvidia_Minitron-8B-Base_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
NemotronForCausalLM
|
[nvidia/Minitron-8B-Base](https://huggingface.co/nvidia/Minitron-8B-Base) [📑](https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Minitron-8B-Base-details)
|
nvidia/Minitron-8B-Base
|
70fa5997afc42807f41eebd5d481f040556fdf97
| 14.216491 |
other
| 64 | 7.22 | true | false | false | false | 2.825041 | 0.242427 | 24.242676 | 0.439506 | 22.040793 | 0.02568 | 2.567976 | 0.27349 | 3.131991 | 0.402552 | 9.085677 | 0.318068 | 24.229832 | false | true |
2024-07-19
|
2024-09-25
| 0 |
nvidia/Minitron-8B-Base
|
nvidia_Mistral-NeMo-Minitron-8B-Base_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
[nvidia/Mistral-NeMo-Minitron-8B-Base](https://huggingface.co/nvidia/Mistral-NeMo-Minitron-8B-Base) [📑](https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Mistral-NeMo-Minitron-8B-Base-details)
|
nvidia/Mistral-NeMo-Minitron-8B-Base
|
cc94637b669b62c4829b1e0c3b9074fecd883b74
| 17.697926 |
other
| 171 | 7.88 | true | false | false | false | 5.115429 | 0.194566 | 19.456597 | 0.52191 | 31.822015 | 0.048338 | 4.833837 | 0.325503 | 10.067114 | 0.409156 | 8.944531 | 0.379571 | 31.06346 | false | true |
2024-08-19
|
2024-08-22
| 0 |
nvidia/Mistral-NeMo-Minitron-8B-Base
|
nvidia_Mistral-NeMo-Minitron-8B-Instruct_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
[nvidia/Mistral-NeMo-Minitron-8B-Instruct](https://huggingface.co/nvidia/Mistral-NeMo-Minitron-8B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Mistral-NeMo-Minitron-8B-Instruct-details)
|
nvidia/Mistral-NeMo-Minitron-8B-Instruct
|
27964e305f862f9947f577332a943d7013abc30f
| 23.572596 |
other
| 74 | 8.414 | true | false | false | true | 3.987796 | 0.500389 | 50.038897 | 0.532092 | 34.126491 | 0.116314 | 11.63142 | 0.287752 | 5.033557 | 0.388573 | 7.371615 | 0.399102 | 33.233599 | false | true |
2024-10-02
|
2024-10-04
| 1 |
nvidia/Mistral-NeMo-Minitron-8B-Instruct (Merge)
|
nvidia_Nemotron-Mini-4B-Instruct_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
NemotronForCausalLM
|
[nvidia/Nemotron-Mini-4B-Instruct](https://huggingface.co/nvidia/Nemotron-Mini-4B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Nemotron-Mini-4B-Instruct-details)
|
nvidia/Nemotron-Mini-4B-Instruct
|
6a417790c444fd65a3da6a5c8821de6afc9654a6
| 18.363511 |
other
| 159 | 4 | true | false | false | true | 2.234628 | 0.666876 | 66.687611 | 0.386484 | 14.203825 | 0.02568 | 2.567976 | 0.280201 | 4.026846 | 0.376729 | 4.624479 | 0.262633 | 18.070331 | false | true |
2024-09-10
|
2024-09-25
| 0 |
nvidia/Nemotron-Mini-4B-Instruct
|
nvidia_OpenMath2-Llama3.1-8B_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
[nvidia/OpenMath2-Llama3.1-8B](https://huggingface.co/nvidia/OpenMath2-Llama3.1-8B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/nvidia__OpenMath2-Llama3.1-8B-details)
|
nvidia/OpenMath2-Llama3.1-8B
|
4187cd28e77e76367261992b3274c77ffcbfd3d5
| 12.751665 |
llama3.1
| 28 | 8.03 | true | false | false | false | 2.552212 | 0.233059 | 23.305939 | 0.409552 | 16.29437 | 0.267372 | 26.73716 | 0.265101 | 2.013423 | 0.343552 | 2.010677 | 0.155336 | 6.148419 | false | true |
2024-09-30
|
2024-12-07
| 1 |
nvidia/OpenMath2-Llama3.1-8B (Merge)
|
nxmwxm_Beast-Soul-new_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
MistralForCausalLM
|
[nxmwxm/Beast-Soul-new](https://huggingface.co/nxmwxm/Beast-Soul-new) [📑](https://huggingface.co/datasets/open-llm-leaderboard/nxmwxm__Beast-Soul-new-details)
|
nxmwxm/Beast-Soul-new
|
dd2ae8a96b7d088eb94a1cfa6ff84c3489e8c010
| 21.817673 | 0 | 7.242 | false | false | false | false | 1.314046 | 0.486875 | 48.687483 | 0.522714 | 33.072759 | 0.074018 | 7.401813 | 0.281879 | 4.250559 | 0.445927 | 14.140885 | 0.310173 | 23.352541 | false | false |
2024-08-07
|
2024-08-07
| 1 |
nxmwxm/Beast-Soul-new (Merge)
|
|
occiglot_occiglot-7b-es-en-instruct_float16
|
float16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
[occiglot/occiglot-7b-es-en-instruct](https://huggingface.co/occiglot/occiglot-7b-es-en-instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/occiglot__occiglot-7b-es-en-instruct-details)
|
occiglot/occiglot-7b-es-en-instruct
|
5858f6ee118eef70896f1870fd61052348ff571e
| 12.457904 |
apache-2.0
| 2 | 7.242 | true | false | false | true | 1.377476 | 0.348514 | 34.851416 | 0.411097 | 17.23541 | 0.024169 | 2.416918 | 0.259228 | 1.230425 | 0.37375 | 4.452083 | 0.231051 | 14.56117 | false | false |
2024-03-05
|
2024-09-02
| 0 |
occiglot/occiglot-7b-es-en-instruct
|
odyssey-labs_Astral-1-10B_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
[odyssey-labs/Astral-1-10B](https://huggingface.co/odyssey-labs/Astral-1-10B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/odyssey-labs__Astral-1-10B-details)
|
odyssey-labs/Astral-1-10B
|
00b55cd83fb4b97cd2d83604c04bd0b96da4b26f
| 18.690197 | 0 | 10.732 | false | false | false | true | 1.513799 | 0.387807 | 38.780658 | 0.487256 | 28.313232 | 0.034743 | 3.47432 | 0.305369 | 7.38255 | 0.427979 | 12.130729 | 0.298537 | 22.059693 | false | false |
2025-01-28
| 0 |
Removed
|
||
olabs-ai_reflection_model_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Adapter
|
?
|
[olabs-ai/reflection_model](https://huggingface.co/olabs-ai/reflection_model) [📑](https://huggingface.co/datasets/open-llm-leaderboard/olabs-ai__reflection_model-details)
|
olabs-ai/reflection_model
|
a8b0fc584b10e0110e04f9d21c7f10d24391c1d5
| 14.079166 |
apache-2.0
| 1 | 9.3 | true | false | false | false | 4.815087 | 0.159869 | 15.986915 | 0.471251 | 25.206882 | 0.05136 | 5.135952 | 0.300336 | 6.711409 | 0.350833 | 5.754167 | 0.331117 | 25.679669 | false | false |
2024-09-08
|
2024-09-08
| 0 |
olabs-ai/reflection_model
|
ontocord_Llama_3.2_1b-autoredteam_helpfulness-train_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
[ontocord/Llama_3.2_1b-autoredteam_helpfulness-train](https://huggingface.co/ontocord/Llama_3.2_1b-autoredteam_helpfulness-train) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__Llama_3.2_1b-autoredteam_helpfulness-train-details)
|
ontocord/Llama_3.2_1b-autoredteam_helpfulness-train
|
3115f5fa8573b9766a25a0e5e966b99652ecb77c
| 6.603005 |
llama3.2
| 0 | 1.498 | true | false | false | true | 0.762954 | 0.276548 | 27.654845 | 0.311508 | 4.336962 | 0.016616 | 1.661631 | 0.259228 | 1.230425 | 0.345875 | 3.267708 | 0.113198 | 1.46646 | false | false |
2025-01-30
|
2025-01-31
| 0 |
ontocord/Llama_3.2_1b-autoredteam_helpfulness-train
|
ontocord_RedPajama-3B-v1-AutoRedteam_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
GPTNeoXForCausalLM
|
[ontocord/RedPajama-3B-v1-AutoRedteam](https://huggingface.co/ontocord/RedPajama-3B-v1-AutoRedteam) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__RedPajama-3B-v1-AutoRedteam-details)
|
ontocord/RedPajama-3B-v1-AutoRedteam
|
abfffba25b38db573761a30ee5cb2238224d3d35
| 3.563282 | 0 | 2.776 | false | false | false | false | 0.353992 | 0.13434 | 13.434022 | 0.302568 | 2.949522 | 0.009063 | 0.906344 | 0.24245 | 0 | 0.366062 | 2.891146 | 0.110788 | 1.198655 | false | false |
2025-01-14
| 0 |
Removed
|
||
ontocord_RedPajama-3B-v1-AutoRedteam-Harmless-only_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
GPTNeoXForCausalLM
|
[ontocord/RedPajama-3B-v1-AutoRedteam-Harmless-only](https://huggingface.co/ontocord/RedPajama-3B-v1-AutoRedteam-Harmless-only) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__RedPajama-3B-v1-AutoRedteam-Harmless-only-details)
|
ontocord/RedPajama-3B-v1-AutoRedteam-Harmless-only
|
39445554944ee8b7b135c177d96348b5be4cea11
| 3.950544 |
apache-2.0
| 0 | 2.776 | true | false | false | true | 0.445491 | 0.152475 | 15.247543 | 0.312367 | 3.779556 | 0.006042 | 0.60423 | 0.231544 | 0 | 0.366125 | 2.965625 | 0.109957 | 1.106309 | false | false |
2025-01-15
|
2025-01-28
| 0 |
ontocord/RedPajama-3B-v1-AutoRedteam-Harmless-only
|
ontocord_RedPajama3b_v1-autoredteam_helpfulness-train_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
GPTNeoXForCausalLM
|
[ontocord/RedPajama3b_v1-autoredteam_helpfulness-train](https://huggingface.co/ontocord/RedPajama3b_v1-autoredteam_helpfulness-train) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__RedPajama3b_v1-autoredteam_helpfulness-train-details)
|
ontocord/RedPajama3b_v1-autoredteam_helpfulness-train
|
624ffe9d59c306768a13ae6953be54a04501f272
| 6.038455 |
apache-2.0
| 0 | 2.776 | true | false | false | true | 0.664259 | 0.284767 | 28.476664 | 0.309274 | 3.372125 | 0.006798 | 0.679758 | 0.245805 | 0 | 0.357969 | 2.51276 | 0.110705 | 1.189421 | false | false |
2025-01-26
|
2025-01-26
| 0 |
ontocord/RedPajama3b_v1-autoredteam_helpfulness-train
|
ontocord_merged_0.2_expert_0.8_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/merged_0.2_expert_0.8](https://huggingface.co/ontocord/merged_0.2_expert_0.8) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__merged_0.2_expert_0.8-details)
|
ontocord/merged_0.2_expert_0.8
|
dccbe9510988e82eb0025b8c02f6e866a4d90223
| 4.808387 | 0 | 3.759 | false | false | false | false | 0.266009 | 0.174258 | 17.425764 | 0.3046 | 3.288316 | 0.026435 | 2.643505 | 0.261745 | 1.565996 | 0.362062 | 2.691146 | 0.11112 | 1.235594 | false | false |
2025-03-10
|
2025-03-10
| 0 |
ontocord/merged_0.2_expert_0.8
|
|
ontocord_merged_0.2_expert_0.8-stack_2x_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/merged_0.2_expert_0.8-stack_2x](https://huggingface.co/ontocord/merged_0.2_expert_0.8-stack_2x) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__merged_0.2_expert_0.8-stack_2x-details)
|
ontocord/merged_0.2_expert_0.8-stack_2x
|
2cf083ad639ee2dfe56225af554af62e3922357a
| 4.697285 | 0 | 6.512 | false | false | false | false | 0.4856 | 0.179603 | 17.960345 | 0.300613 | 2.818672 | 0.024924 | 2.492447 | 0.262584 | 1.677852 | 0.354063 | 2.091146 | 0.110289 | 1.143248 | false | false |
2025-03-10
|
2025-03-10
| 0 |
ontocord/merged_0.2_expert_0.8-stack_2x
|
|
ontocord_merged_0.5_expert_0.5_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/merged_0.5_expert_0.5](https://huggingface.co/ontocord/merged_0.5_expert_0.5) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__merged_0.5_expert_0.5-details)
|
ontocord/merged_0.5_expert_0.5
|
2cb783ca7d4cdc171c583fbe94348d7417c2ce78
| 4.592378 | 0 | 3.759 | false | false | false | false | 0.269819 | 0.178729 | 17.872911 | 0.301701 | 3.102805 | 0.019637 | 1.963746 | 0.264262 | 1.901566 | 0.35425 | 1.514583 | 0.110788 | 1.198655 | false | false |
2025-03-10
|
2025-03-10
| 0 |
ontocord/merged_0.5_expert_0.5
|
|
ontocord_ontocord_wide_3b-stage1_shuf_sample1_jsonl-pretrained-autoredteam_helpful-0.25_helpful_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/ontocord_wide_3b-stage1_shuf_sample1_jsonl-pretrained-autoredteam_helpful-0.25_helpful](https://huggingface.co/ontocord/ontocord_wide_3b-stage1_shuf_sample1_jsonl-pretrained-autoredteam_helpful-0.25_helpful) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__ontocord_wide_3b-stage1_shuf_sample1_jsonl-pretrained-autoredteam_helpful-0.25_helpful-details)
|
ontocord/ontocord_wide_3b-stage1_shuf_sample1_jsonl-pretrained-autoredteam_helpful-0.25_helpful
|
b48b9e6ace48c16205a8d09ccda47d9ed7cbe97b
| 4.001014 | 0 | 3.759 | false | false | false | true | 0.275225 | 0.131842 | 13.18424 | 0.300447 | 2.348853 | 0.010574 | 1.057402 | 0.267617 | 2.348993 | 0.363115 | 3.489323 | 0.114195 | 1.577275 | false | false |
2025-02-23
|
2025-02-27
| 0 |
ontocord/ontocord_wide_3b-stage1_shuf_sample1_jsonl-pretrained-autoredteam_helpful-0.25_helpful
|
|
ontocord_ontocord_wide_7b-stacked-stage1_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/ontocord_wide_7b-stacked-stage1](https://huggingface.co/ontocord/ontocord_wide_7b-stacked-stage1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__ontocord_wide_7b-stacked-stage1-details)
|
ontocord/ontocord_wide_7b-stacked-stage1
|
791875c466470bb40c0d90297395a066c04c5029
| 3.907682 | 0 | 7.888 | false | false | false | true | 0.582657 | 0.148454 | 14.845388 | 0.289652 | 1.565046 | 0.009063 | 0.906344 | 0.253356 | 0.447427 | 0.360354 | 4.510938 | 0.110539 | 1.170952 | false | false |
2025-02-27
|
2025-02-27
| 0 |
ontocord/ontocord_wide_7b-stacked-stage1
|
|
ontocord_ontocord_wide_7b-stacked-stage1-instruct_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/ontocord_wide_7b-stacked-stage1-instruct](https://huggingface.co/ontocord/ontocord_wide_7b-stacked-stage1-instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__ontocord_wide_7b-stacked-stage1-instruct-details)
|
ontocord/ontocord_wide_7b-stacked-stage1-instruct
|
a421c77d76f14dcd8add189ccae2bc15d1a63dd0
| 3.665682 | 0 | 7.888 | false | false | false | true | 0.596375 | 0.153025 | 15.302508 | 0.285391 | 1.488934 | 0.006798 | 0.679758 | 0.246644 | 0 | 0.353781 | 3.222656 | 0.111702 | 1.300236 | false | false |
2025-02-27
|
2025-02-27
| 0 |
ontocord/ontocord_wide_7b-stacked-stage1-instruct
|
|
ontocord_starcoder2-29b-ls_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Starcoder2ForCausalLM
|
[ontocord/starcoder2-29b-ls](https://huggingface.co/ontocord/starcoder2-29b-ls) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__starcoder2-29b-ls-details)
|
ontocord/starcoder2-29b-ls
|
5218578d74cbd7ca3a573ce2acc8f1d61e061f13
| 8.448974 | 0 | 29.009 | false | false | false | false | 2.598554 | 0.214924 | 21.492418 | 0.373498 | 10.973636 | 0.018882 | 1.888218 | 0.27349 | 3.131991 | 0.37 | 3.55 | 0.186918 | 9.65758 | false | false |
2025-02-12
|
2025-02-12
| 0 |
ontocord/starcoder2-29b-ls
|
|
ontocord_starcoder2_3b-AutoRedteam_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Starcoder2ForCausalLM
|
[ontocord/starcoder2_3b-AutoRedteam](https://huggingface.co/ontocord/starcoder2_3b-AutoRedteam) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__starcoder2_3b-AutoRedteam-details)
|
ontocord/starcoder2_3b-AutoRedteam
|
5369e550124b39b17a6350d3e77a696329460c7b
| 5.416511 |
bigscience-openrail-m
| 0 | 3.181 | true | false | false | false | 0.479409 | 0.157371 | 15.737133 | 0.349764 | 8.637687 | 0.010574 | 1.057402 | 0.251678 | 0.223714 | 0.364573 | 3.104948 | 0.133644 | 3.73818 | false | false |
2025-01-16
|
2025-01-27
| 0 |
ontocord/starcoder2_3b-AutoRedteam
|
ontocord_wide_3b-merge_test_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b-merge_test](https://huggingface.co/ontocord/wide_3b-merge_test) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b-merge_test-details)
|
ontocord/wide_3b-merge_test
|
8d49cfe516485cc9dc177c0319736902c2eaa09b
| 3.941643 | 0 | 3.759 | false | false | false | true | 0.272965 | 0.176281 | 17.628116 | 0.301147 | 2.934818 | 0 | 0 | 0.239933 | 0 | 0.342 | 2.35 | 0.106632 | 0.736924 | false | false |
2025-03-06
|
2025-03-06
| 0 |
ontocord/wide_3b-merge_test
|
|
ontocord_wide_3b-stage1_shuf_sample1_jsonl-pretrained_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b-stage1_shuf_sample1_jsonl-pretrained](https://huggingface.co/ontocord/wide_3b-stage1_shuf_sample1_jsonl-pretrained) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b-stage1_shuf_sample1_jsonl-pretrained-details)
|
ontocord/wide_3b-stage1_shuf_sample1_jsonl-pretrained
|
e92c4a6984f7ef2c36338e8c21b55f0017fb7102
| 4.323686 | 0 | 3.759 | false | false | false | true | 0.273509 | 0.139461 | 13.946107 | 0.300361 | 2.582585 | 0.016616 | 1.661631 | 0.26594 | 2.12528 | 0.363208 | 4.067708 | 0.114029 | 1.558806 | false | false |
2025-02-23
|
2025-02-27
| 0 |
ontocord/wide_3b-stage1_shuf_sample1_jsonl-pretrained
|
|
ontocord_wide_3b_sft_stag1.2-lyrical_law_news_software_howto_formattedtext_math_wiki-merge_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stag1.2-lyrical_law_news_software_howto_formattedtext_math_wiki-merge](https://huggingface.co/ontocord/wide_3b_sft_stag1.2-lyrical_law_news_software_howto_formattedtext_math_wiki-merge) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stag1.2-lyrical_law_news_software_howto_formattedtext_math_wiki-merge-details)
|
ontocord/wide_3b_sft_stag1.2-lyrical_law_news_software_howto_formattedtext_math_wiki-merge
|
e413857a7181e403b1ab0a444d626d3211cfd417
| 4.866411 | 0 | 3.759 | false | false | false | false | 0.268132 | 0.166364 | 16.636414 | 0.303091 | 3.183536 | 0.011329 | 1.132931 | 0.260067 | 1.342282 | 0.384542 | 5.667708 | 0.11112 | 1.235594 | false | false |
2025-03-06
|
2025-03-06
| 0 |
ontocord/wide_3b_sft_stag1.2-lyrical_law_news_software_howto_formattedtext_math_wiki-merge
|
|
ontocord_wide_3b_sft_stag1.2-lyrical_news_software_howto_formattedtext-merge_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stag1.2-lyrical_news_software_howto_formattedtext-merge](https://huggingface.co/ontocord/wide_3b_sft_stag1.2-lyrical_news_software_howto_formattedtext-merge) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stag1.2-lyrical_news_software_howto_formattedtext-merge-details)
|
ontocord/wide_3b_sft_stag1.2-lyrical_news_software_howto_formattedtext-merge
|
195789ff05c22b511c05cd851666fad67ccce173
| 4.700635 | 0 | 3.759 | false | false | false | false | 0.266949 | 0.169736 | 16.97363 | 0.297513 | 2.551802 | 0.013595 | 1.359517 | 0.260067 | 1.342282 | 0.377812 | 4.593229 | 0.11245 | 1.383348 | false | false |
2025-03-06
|
2025-03-06
| 0 |
ontocord/wide_3b_sft_stag1.2-lyrical_news_software_howto_formattedtext-merge
|
|
ontocord_wide_3b_sft_stage1.1-ss1-no_redteam_skg_poem.no_issue_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stage1.1-ss1-no_redteam_skg_poem.no_issue](https://huggingface.co/ontocord/wide_3b_sft_stage1.1-ss1-no_redteam_skg_poem.no_issue) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.1-ss1-no_redteam_skg_poem.no_issue-details)
|
ontocord/wide_3b_sft_stage1.1-ss1-no_redteam_skg_poem.no_issue
|
10b3121e0ea5a94c5ac712522441d59cdb5beb0b
| 4.636126 | 0 | 3.759 | false | false | false | true | 0.277305 | 0.148004 | 14.800396 | 0.309534 | 3.351678 | 0.020393 | 2.039275 | 0.270134 | 2.684564 | 0.357938 | 3.742188 | 0.110788 | 1.198655 | false | false |
2025-03-06
|
2025-03-06
| 0 |
ontocord/wide_3b_sft_stage1.1-ss1-no_redteam_skg_poem.no_issue
|
|
ontocord_wide_3b_sft_stage1.1-ss1-with_generics_intr.no_issue_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr.no_issue](https://huggingface.co/ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr.no_issue) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.1-ss1-with_generics_intr.no_issue-details)
|
ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr.no_issue
|
53427374dcdc3b0c6a67daf89e6653e1733d9b51
| 4.124471 | 0 | 3.759 | false | false | false | true | 0.277485 | 0.123674 | 12.367407 | 0.306009 | 3.367053 | 0.010574 | 1.057402 | 0.274329 | 3.243848 | 0.367271 | 3.475521 | 0.11112 | 1.235594 | false | false |
2025-03-06
|
2025-03-06
| 0 |
ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr.no_issue
|
|
ontocord_wide_3b_sft_stage1.1-ss1-with_generics_intr_math.no_issue_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_math.no_issue](https://huggingface.co/ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_math.no_issue) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.1-ss1-with_generics_intr_math.no_issue-details)
|
ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_math.no_issue
|
b8e967c6f14b5a16eacdd25d62bead5eb4a34f07
| 3.633704 | 0 | 3.759 | false | false | false | true | 0.278736 | 0.119153 | 11.915274 | 0.295559 | 2.229757 | 0.006798 | 0.679758 | 0.264262 | 1.901566 | 0.355302 | 3.046094 | 0.118268 | 2.029772 | false | false |
2025-03-06
|
2025-03-06
| 0 |
ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_math.no_issue
|
|
ontocord_wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories.no_issue_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories.no_issue](https://huggingface.co/ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories.no_issue) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories.no_issue-details)
|
ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories.no_issue
|
14c37892622d92668836b3fdd54af2eed0acf1c1
| 3.864133 | 0 | 3.759 | false | false | false | true | 0.311055 | 0.112833 | 11.283284 | 0.317144 | 4.415069 | 0.011329 | 1.132931 | 0.268456 | 2.46085 | 0.346031 | 2.453906 | 0.112949 | 1.438756 | false | false |
2025-03-02
|
2025-03-02
| 0 |
ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories.no_issue
|
|
ontocord_wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories.no_issue_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories.no_issue](https://huggingface.co/ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories.no_issue) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories.no_issue-details)
|
ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories.no_issue
|
0d15e00653a0c3f9e7994873d1ffbbc7580f051a
| 3.712731 | 0 | 3.759 | false | false | false | true | 0.556019 | 0.116155 | 11.615514 | 0.318434 | 4.55452 | 0.007553 | 0.755287 | 0.263423 | 1.789709 | 0.344698 | 2.18724 | 0.112367 | 1.374113 | false | false |
2025-03-02
|
2025-03-06
| 0 |
ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories.no_issue
|
|
ontocord_wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories_no_orig_instr.no_issue_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories_no_orig_instr.no_issue](https://huggingface.co/ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories_no_orig_instr.no_issue) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories_no_orig_instr.no_issue-details)
|
ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories_no_orig_instr.no_issue
|
09658cf698c378e6c02afdf867024a8529c53cbc
| 4.284722 | 0 | 3.759 | false | false | false | true | 0.279815 | 0.131693 | 13.16928 | 0.306401 | 2.938783 | 0.009063 | 0.906344 | 0.265101 | 2.013423 | 0.344604 | 5.075521 | 0.114445 | 1.604979 | false | false |
2025-03-06
|
2025-03-06
| 0 |
ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_math_stories_no_orig_instr.no_issue
|
|
ontocord_wide_3b_sft_stage1.1-ss1-with_generics_intr_stories.no_issue_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_stories.no_issue](https://huggingface.co/ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_stories.no_issue) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.1-ss1-with_generics_intr_stories.no_issue-details)
|
ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_stories.no_issue
|
55a5b5212c6be3133e04626dfef338c2c7bd9e52
| 3.670762 | 0 | 3.759 | false | false | false | true | 0.272359 | 0.118179 | 11.817865 | 0.30375 | 2.9978 | 0.008308 | 0.830816 | 0.26594 | 2.12528 | 0.356698 | 2.453906 | 0.11619 | 1.798907 | false | false |
2025-03-06
|
2025-03-06
| 0 |
ontocord/wide_3b_sft_stage1.1-ss1-with_generics_intr_stories.no_issue
|
|
ontocord_wide_3b_sft_stage1.1-ss1-with_generics_math.no_issue_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stage1.1-ss1-with_generics_math.no_issue](https://huggingface.co/ontocord/wide_3b_sft_stage1.1-ss1-with_generics_math.no_issue) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.1-ss1-with_generics_math.no_issue-details)
|
ontocord/wide_3b_sft_stage1.1-ss1-with_generics_math.no_issue
|
1608275d180108e1b692fbaecae7cd19d0a48445
| 3.526158 | 0 | 3.759 | false | false | false | true | 0.272252 | 0.123999 | 12.399877 | 0.303244 | 3.27569 | 0.007553 | 0.755287 | 0.258389 | 1.118568 | 0.348698 | 2.18724 | 0.112783 | 1.420287 | false | false |
2025-03-06
|
2025-03-06
| 0 |
ontocord/wide_3b_sft_stage1.1-ss1-with_generics_math.no_issue
|
|
ontocord_wide_3b_sft_stage1.1-ss1-with_math.no_issue_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stage1.1-ss1-with_math.no_issue](https://huggingface.co/ontocord/wide_3b_sft_stage1.1-ss1-with_math.no_issue) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.1-ss1-with_math.no_issue-details)
|
ontocord/wide_3b_sft_stage1.1-ss1-with_math.no_issue
|
411920369fdde84e168c9821ffb3a9cc1a260d0c
| 4.7675 | 0 | 3.759 | false | false | false | true | 0.270395 | 0.129819 | 12.981888 | 0.30519 | 3.133656 | 0.015861 | 1.586103 | 0.260067 | 1.342282 | 0.39276 | 7.928385 | 0.114694 | 1.632683 | false | false |
2025-03-06
|
2025-03-06
| 0 |
ontocord/wide_3b_sft_stage1.1-ss1-with_math.no_issue
|
|
ontocord_wide_3b_sft_stage1.1-ss1-with_r1_generics_intr_math_stories.no_issue_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stage1.1-ss1-with_r1_generics_intr_math_stories.no_issue](https://huggingface.co/ontocord/wide_3b_sft_stage1.1-ss1-with_r1_generics_intr_math_stories.no_issue) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.1-ss1-with_r1_generics_intr_math_stories.no_issue-details)
|
ontocord/wide_3b_sft_stage1.1-ss1-with_r1_generics_intr_math_stories.no_issue
|
83824d233440e268368bc7b2ce41fb0f2c939574
| 5.093187 | 0 | 3.759 | false | false | false | true | 0.292391 | 0.204907 | 20.490742 | 0.291178 | 2.347041 | 0 | 0 | 0.260067 | 1.342282 | 0.357531 | 4.52474 | 0.116689 | 1.854314 | false | false |
2025-03-02
| 0 |
Removed
|
||
ontocord_wide_3b_sft_stage1.2-ss1-expert_fictional_lyrical_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stage1.2-ss1-expert_fictional_lyrical](https://huggingface.co/ontocord/wide_3b_sft_stage1.2-ss1-expert_fictional_lyrical) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.2-ss1-expert_fictional_lyrical-details)
|
ontocord/wide_3b_sft_stage1.2-ss1-expert_fictional_lyrical
|
efe7ecbb26b6e3a02645587382532d5869325506
| 4.837021 | 0 | 3.759 | true | false | false | false | 0.275867 | 0.146106 | 14.610567 | 0.299812 | 2.47749 | 0.013595 | 1.359517 | 0.264262 | 1.901566 | 0.392573 | 7.104948 | 0.114112 | 1.568041 | false | false |
2025-03-07
|
2025-03-05
| 0 |
ontocord/wide_3b_sft_stage1.2-ss1-expert_fictional_lyrical
|
|
ontocord_wide_3b_sft_stage1.2-ss1-expert_formatted_text_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stage1.2-ss1-expert_formatted_text](https://huggingface.co/ontocord/wide_3b_sft_stage1.2-ss1-expert_formatted_text) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.2-ss1-expert_formatted_text-details)
|
ontocord/wide_3b_sft_stage1.2-ss1-expert_formatted_text
|
1343c0ad6de8bba0e926a12bc839cdfed9336d2d
| 4.304343 | 0 | 3.759 | false | false | false | true | 0.270496 | 0.148729 | 14.872871 | 0.306895 | 3.730806 | 0.012085 | 1.208459 | 0.261745 | 1.565996 | 0.347396 | 2.824479 | 0.114611 | 1.623449 | false | false |
2025-03-06
|
2025-03-06
| 0 |
ontocord/wide_3b_sft_stage1.2-ss1-expert_formatted_text
|
|
ontocord_wide_3b_sft_stage1.2-ss1-expert_how-to_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stage1.2-ss1-expert_how-to](https://huggingface.co/ontocord/wide_3b_sft_stage1.2-ss1-expert_how-to) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.2-ss1-expert_how-to-details)
|
ontocord/wide_3b_sft_stage1.2-ss1-expert_how-to
|
2a9729fe267aa6a070236ebde081d11b02d4b42b
| 4.21527 |
apache-2.0
| 0 | 3.759 | true | false | false | true | 0.535195 | 0.124548 | 12.454842 | 0.30474 | 3.814087 | 0.01435 | 1.435045 | 0.259228 | 1.230425 | 0.365813 | 4.659896 | 0.115276 | 1.697326 | false | false |
2025-03-06
|
2025-03-06
| 0 |
ontocord/wide_3b_sft_stage1.2-ss1-expert_how-to
|
ontocord_wide_3b_sft_stage1.2-ss1-expert_math_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stage1.2-ss1-expert_math](https://huggingface.co/ontocord/wide_3b_sft_stage1.2-ss1-expert_math) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.2-ss1-expert_math-details)
|
ontocord/wide_3b_sft_stage1.2-ss1-expert_math
|
92eb1ef051529df71a66f1c7841781dcf9cbd4e7
| 5.136739 | 0 | 3.759 | false | false | false | false | 0.545443 | 0.191519 | 19.15185 | 0.305958 | 3.166491 | 0.027946 | 2.794562 | 0.259228 | 1.230425 | 0.370031 | 3.453906 | 0.109209 | 1.023197 | false | false |
2025-03-06
|
2025-03-06
| 0 |
ontocord/wide_3b_sft_stage1.2-ss1-expert_math
|
|
ontocord_wide_3b_sft_stage1.2-ss1-expert_news_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stage1.2-ss1-expert_news](https://huggingface.co/ontocord/wide_3b_sft_stage1.2-ss1-expert_news) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.2-ss1-expert_news-details)
|
ontocord/wide_3b_sft_stage1.2-ss1-expert_news
|
303215f83d50a86121de57540d1285f592bc37ff
| 4.449975 | 0 | 3.759 | false | false | false | false | 0.263964 | 0.165814 | 16.581448 | 0.292588 | 1.943798 | 0.016616 | 1.661631 | 0.267617 | 2.348993 | 0.362094 | 2.928385 | 0.11112 | 1.235594 | false | false |
2025-03-05
|
2025-03-05
| 0 |
ontocord/wide_3b_sft_stage1.2-ss1-expert_news
|
|
ontocord_wide_3b_sft_stage1.2-ss1-expert_software_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_3b_sft_stage1.2-ss1-expert_software](https://huggingface.co/ontocord/wide_3b_sft_stage1.2-ss1-expert_software) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_3b_sft_stage1.2-ss1-expert_software-details)
|
ontocord/wide_3b_sft_stage1.2-ss1-expert_software
|
d3e034d69b18ca2ed506ff262c63ec8e1cf000bc
| 4.290233 | 0 | 3.759 | false | false | false | false | 0.277443 | 0.173383 | 17.338329 | 0.297996 | 2.499488 | 0.015861 | 1.586103 | 0.258389 | 1.118568 | 0.356854 | 1.640104 | 0.114029 | 1.558806 | false | false |
2025-03-05
|
2025-03-05
| 0 |
ontocord/wide_3b_sft_stage1.2-ss1-expert_software
|
|
ontocord_wide_6.6b_sft_stag1.2-lyrical_law_news_software_howto_formattedtext_math_wiki-merge-stacked_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
Qwen2ForCausalLM
|
[ontocord/wide_6.6b_sft_stag1.2-lyrical_law_news_software_howto_formattedtext_math_wiki-merge-stacked](https://huggingface.co/ontocord/wide_6.6b_sft_stag1.2-lyrical_law_news_software_howto_formattedtext_math_wiki-merge-stacked) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ontocord__wide_6.6b_sft_stag1.2-lyrical_law_news_software_howto_formattedtext_math_wiki-merge-stacked-details)
|
ontocord/wide_6.6b_sft_stag1.2-lyrical_law_news_software_howto_formattedtext_math_wiki-merge-stacked
|
a567e6144b581cdca5917b5d5d9acf9b6023e1a3
| 3.97781 | 0 | 7.888 | false | false | false | false | 0.582179 | 0.124399 | 12.439882 | 0.302645 | 3.014695 | 0.01435 | 1.435045 | 0.26594 | 2.12528 | 0.368635 | 3.579427 | 0.111453 | 1.272533 | false | false |
2025-03-07
|
2025-03-07
| 0 |
ontocord/wide_6.6b_sft_stag1.2-lyrical_law_news_software_howto_formattedtext_math_wiki-merge-stacked
|
|
oobabooga_CodeBooga-34B-v0.1_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
[oobabooga/CodeBooga-34B-v0.1](https://huggingface.co/oobabooga/CodeBooga-34B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/oobabooga__CodeBooga-34B-v0.1-details)
|
oobabooga/CodeBooga-34B-v0.1
|
8a4e1e16ac46333cbd0c17d733d3d70a956071a6
| 15.661706 |
llama2
| 145 | 33.744 | true | false | false | true | 4.174007 | 0.525018 | 52.501806 | 0.342744 | 8.562466 | 0.039275 | 3.927492 | 0.256711 | 0.894855 | 0.431021 | 12.977604 | 0.235954 | 15.106014 | false | false |
2023-10-19
|
2024-07-29
| 0 |
oobabooga/CodeBooga-34B-v0.1
|
oopere_Llama-FinSent-S_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
[oopere/Llama-FinSent-S](https://huggingface.co/oopere/Llama-FinSent-S) [📑](https://huggingface.co/datasets/open-llm-leaderboard/oopere__Llama-FinSent-S-details)
|
oopere/Llama-FinSent-S
|
0740011dfde2d1a23150dc214e7d74d65512b557
| 5.811806 |
llama3.2
| 5 | 0.914 | true | false | false | false | 0.367407 | 0.211877 | 21.187671 | 0.315621 | 4.156015 | 0.018127 | 1.812689 | 0.256711 | 0.894855 | 0.38324 | 5.371615 | 0.113032 | 1.447991 | false | false |
2025-02-02
|
2025-02-08
| 1 |
oopere/Llama-FinSent-S (Merge)
|
oopere_Llama-FinSent-S_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
[oopere/Llama-FinSent-S](https://huggingface.co/oopere/Llama-FinSent-S) [📑](https://huggingface.co/datasets/open-llm-leaderboard/oopere__Llama-FinSent-S-details)
|
oopere/Llama-FinSent-S
|
9d9a76e8910865573e7c25c8d9d250355f2ece86
| 5.866404 |
llama3.2
| 5 | 0.914 | true | false | false | false | 0.367972 | 0.216398 | 21.639805 | 0.316925 | 4.307331 | 0.01284 | 1.283988 | 0.258389 | 1.118568 | 0.383177 | 5.363802 | 0.113364 | 1.484929 | false | false |
2025-02-02
|
2025-02-19
| 1 |
oopere/Llama-FinSent-S (Merge)
|
oopere_pruned10-llama-3.2-3B_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
[oopere/pruned10-llama-3.2-3B](https://huggingface.co/oopere/pruned10-llama-3.2-3B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/oopere__pruned10-llama-3.2-3B-details)
|
oopere/pruned10-llama-3.2-3B
|
5958def83347d0a8f8b95d27e7cdff37329b988c
| 6.919943 |
llama3.2
| 0 | 3.001 | true | false | false | false | 1.321068 | 0.17763 | 17.76298 | 0.334042 | 7.759477 | 0.019637 | 1.963746 | 0.266779 | 2.237136 | 0.372167 | 4.6875 | 0.163979 | 7.108821 | false | false |
2024-12-22
|
2024-12-22
| 1 |
oopere/pruned10-llama-3.2-3B (Merge)
|
oopere_pruned20-llama-1b_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
[oopere/pruned20-llama-1b](https://huggingface.co/oopere/pruned20-llama-1b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/oopere__pruned20-llama-1b-details)
|
oopere/pruned20-llama-1b
|
3351c9a062055ce6c16dd2c9f0c229fb5dd7396b
| 4.98952 |
llama3.2
| 0 | 1.075 | true | false | false | false | 0.802956 | 0.199362 | 19.936214 | 0.303136 | 3.185394 | 0.010574 | 1.057402 | 0.25 | 0 | 0.363146 | 4.393229 | 0.112284 | 1.364879 | false | false |
2024-11-16
|
2024-11-16
| 1 |
oopere/pruned20-llama-1b (Merge)
|
oopere_pruned20-llama-3.2-3b_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
[oopere/pruned20-llama-3.2-3b](https://huggingface.co/oopere/pruned20-llama-3.2-3b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/oopere__pruned20-llama-3.2-3b-details)
|
oopere/pruned20-llama-3.2-3b
|
e92642870b0ad66e589889305608f422ee9be975
| 5.65656 |
llama3.2
| 0 | 2.79 | true | false | false | false | 1.212079 | 0.178879 | 17.887871 | 0.324785 | 6.332745 | 0.015861 | 1.586103 | 0.26594 | 2.12528 | 0.341844 | 2.897135 | 0.127992 | 3.110225 | false | false |
2024-12-12
|
2024-12-12
| 1 |
oopere/pruned20-llama-3.2-3b (Merge)
|
oopere_pruned40-llama-1b_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
[oopere/pruned40-llama-1b](https://huggingface.co/oopere/pruned40-llama-1b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/oopere__pruned40-llama-1b-details)
|
oopere/pruned40-llama-1b
|
3de470d9c61cb57cea821e93b43fb250aa14b975
| 6.608357 |
llama3.2
| 1 | 0.914 | true | false | false | false | 0.753243 | 0.228438 | 22.843832 | 0.296916 | 2.655309 | 0.007553 | 0.755287 | 0.243289 | 0 | 0.428667 | 12.483333 | 0.108211 | 0.912382 | false | false |
2024-11-16
|
2024-11-26
| 1 |
oopere/pruned40-llama-1b (Merge)
|
oopere_pruned40-llama-3.2-1B_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
[oopere/pruned40-llama-3.2-1B](https://huggingface.co/oopere/pruned40-llama-3.2-1B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/oopere__pruned40-llama-3.2-1B-details)
|
oopere/pruned40-llama-3.2-1B
|
fb1abfc3dedee4f37fdcd465881ffe9fd8d87060
| 6.877694 |
llama3.2
| 1 | 0.914 | true | false | false | false | 0.378142 | 0.22664 | 22.663976 | 0.298249 | 2.701273 | 0.008308 | 0.830816 | 0.254195 | 0.559284 | 0.43524 | 13.238281 | 0.111453 | 1.272533 | false | false |
2024-11-16
|
2025-02-19
| 1 |
oopere/pruned40-llama-3.2-1B (Merge)
|
oopere_pruned40-llama-3.2-3b_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
[oopere/pruned40-llama-3.2-3b](https://huggingface.co/oopere/pruned40-llama-3.2-3b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/oopere__pruned40-llama-3.2-3b-details)
|
oopere/pruned40-llama-3.2-3b
|
ceb2073cda2f21afa10efcbae74583fc9b319d54
| 5.371285 |
llama3.2
| 0 | 2.367 | true | false | false | false | 1.195335 | 0.218296 | 21.829634 | 0.316712 | 4.740102 | 0.01284 | 1.283988 | 0.229866 | 0 | 0.353938 | 2.408854 | 0.117686 | 1.96513 | false | false |
2024-12-12
|
2024-12-12
| 1 |
oopere/pruned40-llama-3.2-3b (Merge)
|
oopere_pruned60-llama-1b_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
[oopere/pruned60-llama-1b](https://huggingface.co/oopere/pruned60-llama-1b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/oopere__pruned60-llama-1b-details)
|
oopere/pruned60-llama-1b
|
86b157256928b50ee07cc3cf5b3884b70062f2fe
| 5.467567 |
llama3.2
| 1 | 0.753 | true | false | false | false | 0.764976 | 0.18285 | 18.285039 | 0.301619 | 2.942526 | 0.002266 | 0.226586 | 0.249161 | 0 | 0.408792 | 9.432292 | 0.117271 | 1.918957 | false | false |
2024-11-16
|
2024-11-25
| 1 |
oopere/pruned60-llama-1b (Merge)
|
oopere_pruned60-llama-3.2-3b_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
LlamaForCausalLM
|
[oopere/pruned60-llama-3.2-3b](https://huggingface.co/oopere/pruned60-llama-3.2-3b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/oopere__pruned60-llama-3.2-3b-details)
|
oopere/pruned60-llama-3.2-3b
|
c8c061d55288274a59205fa740b51a951ca93335
| 5.128681 |
llama3.2
| 0 | 1.944 | true | false | false | false | 1.241768 | 0.182476 | 18.247583 | 0.316626 | 3.988402 | 0.003776 | 0.377644 | 0.270134 | 2.684564 | 0.363333 | 4.016667 | 0.113115 | 1.457225 | false | false |
2024-12-12
|
2024-12-13
| 1 |
oopere/pruned60-llama-3.2-3b (Merge)
|
open-atlas_Atlas-Flash-1.5B-Preview_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/open-atlas/Atlas-Flash-1.5B-Preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">open-atlas/Atlas-Flash-1.5B-Preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/open-atlas__Atlas-Flash-1.5B-Preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
open-atlas/Atlas-Flash-1.5B-Preview
|
160bf22e66b286a8ae7887a86fb21d7c49f7473e
| 11.111375 |
mit
| 3 | 1.777 | true | false | false | true | 1.210488 | 0.326957 | 32.695692 | 0.321546 | 5.654381 | 0.221299 | 22.129909 | 0.252517 | 0.33557 | 0.348792 | 1.698958 | 0.137384 | 4.153738 | false | false |
2025-01-26
|
2025-02-01
| 1 |
open-atlas/Atlas-Flash-1.5B-Preview (Merge)
|
open-atlas_Atlas-Flash-7B-Preview_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/open-atlas/Atlas-Flash-7B-Preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">open-atlas/Atlas-Flash-7B-Preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/open-atlas__Atlas-Flash-7B-Preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
open-atlas/Atlas-Flash-7B-Preview
|
d3d9a1e00c9c95e961ec8ec5f8e64e00b2cdb3a9
| 17.496191 |
mit
| 4 | 7.616 | true | false | false | true | 1.344133 | 0.390754 | 39.075431 | 0.354199 | 9.394854 | 0.257553 | 25.755287 | 0.288591 | 5.145414 | 0.383583 | 5.78125 | 0.278424 | 19.824911 | false | false |
2025-01-26
|
2025-02-01
| 1 |
open-atlas/Atlas-Flash-7B-Preview (Merge)
|
open-neo_Kyro-n1-3B_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/open-neo/Kyro-n1-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">open-neo/Kyro-n1-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/open-neo__Kyro-n1-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
open-neo/Kyro-n1-3B
|
dc34677fa2a29372519e8e5fb339efd865d7ee76
| 23.492574 |
other
| 12 | 3.086 | true | false | false | true | 0.774931 | 0.459497 | 45.949747 | 0.468538 | 25.78922 | 0.285498 | 28.549849 | 0.281879 | 4.250559 | 0.408792 | 9.498958 | 0.342254 | 26.91711 | false | false |
2025-02-13
|
2025-03-13
| 1 |
open-neo/Kyro-n1-3B (Merge)
|
open-neo_Kyro-n1-7B_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/open-neo/Kyro-n1-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">open-neo/Kyro-n1-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/open-neo__Kyro-n1-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
open-neo/Kyro-n1-7B
|
86c48c5dc7fbeb7ce3ebf605608fc69985d2b0ee
| 28.918698 |
mit
| 6 | 7.616 | true | false | false | true | 0.737034 | 0.557267 | 55.726694 | 0.538656 | 34.401528 | 0.389728 | 38.97281 | 0.260906 | 1.454139 | 0.388417 | 5.91875 | 0.433344 | 37.038268 | false | false |
2025-02-15
|
2025-03-13
| 1 |
open-neo/Kyro-n1-7B (Merge)
|
open-thoughts_OpenThinker-7B_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/open-thoughts/OpenThinker-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">open-thoughts/OpenThinker-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/open-thoughts__OpenThinker-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
open-thoughts/OpenThinker-7B
|
5a931fd3fa8618acda2da8eaec4a3f10ee009739
| 26.578519 |
apache-2.0
| 126 | 7.616 | true | false | false | true | 0.685119 | 0.40889 | 40.888952 | 0.534273 | 34.508818 | 0.425982 | 42.598187 | 0.256711 | 0.894855 | 0.382 | 5.416667 | 0.416473 | 35.163638 | false | false |
2025-01-28
|
2025-02-14
| 2 |
Qwen/Qwen2.5-7B
|
openai-community_gpt2_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
GPT2LMHeadModel
|
<a target="_blank" href="https://huggingface.co/openai-community/gpt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openai-community/gpt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openai-community__gpt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
openai-community/gpt2
|
607a30d783dfa663caf39e06633721c8d4cfcd7e
| 6.510807 |
mit
| 2,628 | 0.137 | true | false | false | false | 0.085941 | 0.179253 | 17.925327 | 0.303571 | 2.674981 | 0.002266 | 0.226586 | 0.258389 | 1.118568 | 0.447052 | 15.348177 | 0.115941 | 1.771203 | false | true |
2022-03-02
|
2024-06-12
| 0 |
openai-community/gpt2
|
openai-community_gpt2_float16
|
float16
|
🟢 pretrained
|
🟢
|
Original
|
GPT2LMHeadModel
|
<a target="_blank" href="https://huggingface.co/openai-community/gpt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openai-community/gpt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openai-community__gpt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
openai-community/gpt2
|
607a30d783dfa663caf39e06633721c8d4cfcd7e
| 6.334235 |
mit
| 2,628 | 0.137 | true | false | false | false | 0.234774 | 0.177954 | 17.795449 | 0.301658 | 2.815911 | 0.005287 | 0.528701 | 0.258389 | 1.118568 | 0.439021 | 13.910938 | 0.116523 | 1.835845 | false | true |
2022-03-02
|
2024-08-12
| 0 |
openai-community/gpt2
|
openai-community_gpt2-large_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
GPT2LMHeadModel
|
<a target="_blank" href="https://huggingface.co/openai-community/gpt2-large" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openai-community/gpt2-large</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openai-community__gpt2-large-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
openai-community/gpt2-large
|
32b71b12589c2f8d625668d2335a01cac3249519
| 5.567707 |
mit
| 301 | 0.812 | true | false | false | false | 0.360924 | 0.204782 | 20.47822 | 0.306884 | 3.253791 | 0.012085 | 1.208459 | 0.259228 | 1.230425 | 0.378865 | 5.658073 | 0.114195 | 1.577275 | false | true |
2022-03-02
|
2024-06-12
| 0 |
openai-community/gpt2-large
|
openai-community_gpt2-medium_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
GPT2LMHeadModel
|
<a target="_blank" href="https://huggingface.co/openai-community/gpt2-medium" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openai-community/gpt2-medium</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openai-community__gpt2-medium-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
openai-community/gpt2-medium
|
6dcaa7a952f72f9298047fd5137cd6e4f05f41da
| 5.90234 |
mit
| 171 | 0.38 | true | false | false | false | 0.242124 | 0.220844 | 22.084403 | 0.305028 | 2.719972 | 0.007553 | 0.755287 | 0.262584 | 1.677852 | 0.388448 | 6.15599 | 0.118185 | 2.020538 | false | true |
2022-03-02
|
2024-06-12
| 0 |
openai-community/gpt2-medium
|
openai-community_gpt2-xl_bfloat16
|
bfloat16
|
🟢 pretrained
|
🟢
|
Original
|
GPT2LMHeadModel
|
<a target="_blank" href="https://huggingface.co/openai-community/gpt2-xl" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openai-community/gpt2-xl</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openai-community__gpt2-xl-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
openai-community/gpt2-xl
|
15ea56dee5df4983c59b2538573817e1667135e2
| 5.093481 |
mit
| 332 | 1.608 | true | false | false | false | 0.430627 | 0.203858 | 20.385799 | 0.300858 | 2.580961 | 0.009819 | 0.981873 | 0.258389 | 1.118568 | 0.370958 | 4.036458 | 0.113115 | 1.457225 | false | true |
2022-03-02
|
2024-06-12
| 0 |
openai-community/gpt2-xl
|
openbmb_MiniCPM-S-1B-sft-llama-format_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/openbmb/MiniCPM-S-1B-sft-llama-format" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openbmb/MiniCPM-S-1B-sft-llama-format</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openbmb__MiniCPM-S-1B-sft-llama-format-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
openbmb/MiniCPM-S-1B-sft-llama-format
|
7de07f8895c168a7ee01f624f50c44f6966c9735
| 8.996066 |
apache-2.0
| 4 | 1 | true | false | false | true | 1.080074 | 0.332877 | 33.287677 | 0.304931 | 3.898455 | 0.030967 | 3.096677 | 0.270973 | 2.796421 | 0.331677 | 1.359635 | 0.185838 | 9.53753 | false | false |
2024-06-14
|
2024-11-19
| 0 |
openbmb/MiniCPM-S-1B-sft-llama-format
|
openchat_openchat-3.5-0106_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/openchat/openchat-3.5-0106" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openchat/openchat-3.5-0106</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openchat__openchat-3.5-0106-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
openchat/openchat-3.5-0106
|
ff058fda49726ecf4ea53dc1635f917cdb8ba36b
| 22.709255 |
apache-2.0
| 354 | 7.242 | true | false | false | true | 2.962836 | 0.596659 | 59.665909 | 0.461698 | 24.038711 | 0.076284 | 7.628399 | 0.307886 | 7.718121 | 0.425437 | 11.746354 | 0.329122 | 25.458038 | false | true |
2024-01-07
|
2024-06-27
| 1 |
mistralai/Mistral-7B-v0.1
|
openchat_openchat-3.5-1210_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/openchat/openchat-3.5-1210" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openchat/openchat-3.5-1210</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openchat__openchat-3.5-1210-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
openchat/openchat-3.5-1210
|
801f5459b7577241500785f11c2b026912badd6e
| 22.72785 |
apache-2.0
| 272 | 7.242 | true | false | false | true | 1.032902 | 0.603678 | 60.367824 | 0.453536 | 23.236297 | 0.07855 | 7.854985 | 0.301174 | 6.823266 | 0.441438 | 14.279688 | 0.314245 | 23.805038 | false | true |
2023-12-12
|
2024-06-12
| 1 |
mistralai/Mistral-7B-v0.1
|
openchat_openchat-3.6-8b-20240522_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/openchat/openchat-3.6-8b-20240522" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openchat/openchat-3.6-8b-20240522</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openchat__openchat-3.6-8b-20240522-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
openchat/openchat-3.6-8b-20240522
|
2264eb98558978f708e88ae52afb78e43b832801
| 23.107316 |
llama3
| 152 | 8.03 | true | false | false | true | 4.349912 | 0.534336 | 53.433556 | 0.533841 | 33.232937 | 0.099698 | 9.969789 | 0.317953 | 9.060403 | 0.399854 | 8.181771 | 0.322889 | 24.76544 | false | true |
2024-05-07
|
2024-06-26
| 1 |
meta-llama/Meta-Llama-3-8B
|
openchat_openchat_3.5_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/openchat/openchat_3.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openchat/openchat_3.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openchat__openchat_3.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
openchat/openchat_3.5
|
0fc98e324280bc4bf5d2c30ecf7b97b84fb8a19b
| 21.635827 |
apache-2.0
| 1,120 | 7 | true | false | false | true | 1.002421 | 0.593112 | 59.311183 | 0.442632 | 21.582167 | 0.072508 | 7.250755 | 0.298658 | 6.487696 | 0.422865 | 11.258073 | 0.315326 | 23.925089 | false | true |
2023-10-30
|
2024-06-12
| 0 |
openchat/openchat_3.5
|
openchat_openchat_v3.2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/openchat/openchat_v3.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openchat/openchat_v3.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openchat__openchat_v3.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
openchat/openchat_v3.2
|
acc7ce92558681e749678648189812f15c1465fe
| 13.833146 |
llama2
| 42 | 13 | true | false | false | false | 10.60491 | 0.298056 | 29.805583 | 0.433056 | 20.323003 | 0.01284 | 1.283988 | 0.270134 | 2.684564 | 0.433625 | 13.103125 | 0.242188 | 15.798611 | false | true |
2023-07-30
|
2024-06-12
| 0 |
openchat/openchat_v3.2
|
openchat_openchat_v3.2_super_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/openchat/openchat_v3.2_super" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openchat/openchat_v3.2_super</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openchat__openchat_v3.2_super-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
openchat/openchat_v3.2_super
|
9479cc37d43234a57a33628637d1aca0293d745a
| 12.923575 |
llama2
| 35 | 13 | true | false | false | false | 10.055387 | 0.286191 | 28.619064 | 0.422121 | 19.15354 | 0.021148 | 2.114804 | 0.264262 | 1.901566 | 0.416135 | 9.916927 | 0.24252 | 15.83555 | false | true |
2023-09-04
|
2024-06-12
| 0 |
openchat/openchat_v3.2_super
|
orai-nlp_Llama-eus-8B_bfloat16
|
bfloat16
|
🟩 continuously pretrained
|
🟩
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/orai-nlp/Llama-eus-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">orai-nlp/Llama-eus-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/orai-nlp__Llama-eus-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
orai-nlp/Llama-eus-8B
|
75b5645d222047b517a7a9190922ea1b5382c71f
| 13.943754 | 9 | 8.03 | false | false | false | false | 1.738515 | 0.216123 | 21.612322 | 0.441825 | 20.961371 | 0.046828 | 4.682779 | 0.28943 | 5.257271 | 0.391885 | 8.285677 | 0.305768 | 22.863106 | false | false |
2024-09-04
|
2024-09-30
| 1 |
meta-llama/Meta-Llama-3.1-8B
|
oxyapi_oxy-1-small_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/oxyapi/oxy-1-small" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">oxyapi/oxy-1-small</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/oxyapi__oxy-1-small-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
oxyapi/oxy-1-small
|
0d100cf65c8574b025b499dd787d8bcbcf678418
| 36.100829 |
apache-2.0
| 80 | 14.77 | true | false | false | true | 2.773817 | 0.624461 | 62.446087 | 0.588459 | 41.175447 | 0.360272 | 36.02719 | 0.371644 | 16.219239 | 0.448667 | 16.283333 | 0.500083 | 44.453679 | false | false |
2024-12-01
|
2024-12-02
| 1 |
oxyapi/oxy-1-small (Merge)
|
ozone-ai_0x-lite_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/ozone-ai/0x-lite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ozone-ai/0x-lite</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ozone-ai__0x-lite-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
ozone-ai/0x-lite
|
0b8888d3aa74b127e8e33c27306c05c7f0956bd3
| 40.484603 |
apache-2.0
| 61 | 14.77 | true | false | false | true | 3.452688 | 0.773987 | 77.398746 | 0.634058 | 47.528473 | 0.504532 | 50.453172 | 0.319631 | 9.284116 | 0.422063 | 11.757813 | 0.518368 | 46.485298 | false | false |
2025-01-25
|
2025-01-28
| 1 |
ozone-ai/0x-lite (Merge)
|
ozone-research_Chirp-01_float16
|
float16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/ozone-research/Chirp-01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ozone-research/Chirp-01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ozone-research__Chirp-01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
ozone-research/Chirp-01
|
eae888f412b7088e8d621b1da2d588944236a14b
| 28.252603 |
other
| 13 | 3.086 | true | false | false | true | 0.715781 | 0.634752 | 63.475246 | 0.464956 | 25.03833 | 0.346677 | 34.667674 | 0.271812 | 2.908277 | 0.448729 | 15.557812 | 0.350814 | 27.868277 | false | false |
2025-02-23
|
2025-02-23
| 1 |
ozone-research/Chirp-01 (Merge)
|
paloalma_ECE-TW3-JRGL-V1_float16
|
float16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/paloalma/ECE-TW3-JRGL-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">paloalma/ECE-TW3-JRGL-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/paloalma__ECE-TW3-JRGL-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
paloalma/ECE-TW3-JRGL-V1
|
2f08c7ab9db03b1b9f455c7beee6a41e99aa910e
| 30.236001 |
apache-2.0
| 1 | 68.977 | true | false | false | false | 12.383388 | 0.553495 | 55.349473 | 0.628367 | 46.697139 | 0.13142 | 13.141994 | 0.347315 | 12.975391 | 0.462083 | 17.460417 | 0.422124 | 35.791593 | true | false |
2024-04-03
|
2024-08-04
| 0 |
paloalma/ECE-TW3-JRGL-V1
|
paloalma_ECE-TW3-JRGL-V2_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/paloalma/ECE-TW3-JRGL-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">paloalma/ECE-TW3-JRGL-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/paloalma__ECE-TW3-JRGL-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
paloalma/ECE-TW3-JRGL-V2
|
f2c15045f1a7a7a34540ab18abcee8a566a74ca6
| 25.792715 |
apache-2.0
| 0 | 72.288 | true | false | false | false | 25.092499 | 0.225489 | 22.548948 | 0.603099 | 43.173268 | 0.185045 | 18.504532 | 0.331376 | 10.850112 | 0.479323 | 19.815365 | 0.458777 | 39.864066 | true | false |
2024-04-04
|
2024-09-19
| 0 |
paloalma/ECE-TW3-JRGL-V2
|
paloalma_ECE-TW3-JRGL-V5_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/paloalma/ECE-TW3-JRGL-V5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">paloalma/ECE-TW3-JRGL-V5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/paloalma__ECE-TW3-JRGL-V5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
paloalma/ECE-TW3-JRGL-V5
|
4061fa10de22945790cad825f7f4dec96d55b204
| 29.492047 |
apache-2.0
| 0 | 72.289 | true | false | false | false | 34.561842 | 0.455251 | 45.525096 | 0.602471 | 43.462514 | 0.183535 | 18.353474 | 0.341443 | 12.192394 | 0.462052 | 16.889844 | 0.464761 | 40.52896 | true | false |
2024-04-11
|
2024-08-30
| 0 |
paloalma/ECE-TW3-JRGL-V5
|
paloalma_Le_Triomphant-ECE-TW3_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/paloalma/Le_Triomphant-ECE-TW3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">paloalma/Le_Triomphant-ECE-TW3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/paloalma__Le_Triomphant-ECE-TW3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
paloalma/Le_Triomphant-ECE-TW3
|
f72399253bb3e65c0f55e50461488c098f658a49
| 31.996294 |
apache-2.0
| 4 | 72.289 | true | false | false | false | 20.836782 | 0.540206 | 54.020554 | 0.611206 | 44.963294 | 0.194864 | 19.486405 | 0.348993 | 13.199105 | 0.4725 | 18.495833 | 0.476313 | 41.812574 | true | false |
2024-04-01
|
2024-07-25
| 0 |
paloalma/Le_Triomphant-ECE-TW3
|
paloalma_TW3-JRGL-v2_bfloat16
|
bfloat16
|
🤝 base merges and moerges
|
🤝
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/paloalma/TW3-JRGL-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">paloalma/TW3-JRGL-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/paloalma__TW3-JRGL-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
paloalma/TW3-JRGL-v2
|
aca3f0ba2bfb90038a9e2cd5b486821d4c181b46
| 32.462539 |
apache-2.0
| 0 | 72.289 | true | false | false | false | 31.846914 | 0.531613 | 53.161279 | 0.613753 | 45.61111 | 0.179003 | 17.900302 | 0.35906 | 14.541387 | 0.485833 | 20.695833 | 0.485788 | 42.865322 | true | false |
2024-04-01
|
2024-08-29
| 0 |
paloalma/TW3-JRGL-v2
|
pankajmathur_Al_Dente_v1_8b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/Al_Dente_v1_8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/Al_Dente_v1_8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__Al_Dente_v1_8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/Al_Dente_v1_8b
|
149d70e04085ecd90510a60f916efc55da1294e7
| 17.300059 |
llama3
| 1 | 8.03 | true | false | false | false | 1.817064 | 0.369372 | 36.937215 | 0.483474 | 27.247898 | 0.040785 | 4.07855 | 0.299497 | 6.599553 | 0.398708 | 8.271875 | 0.285987 | 20.665263 | false | false |
2024-06-02
|
2024-06-26
| 0 |
pankajmathur/Al_Dente_v1_8b
|
pankajmathur_model_007_13b_v2_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/model_007_13b_v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/model_007_13b_v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__model_007_13b_v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/model_007_13b_v2
|
2c6ddf25cdb134f22e2543121b5a36b41342a9e2
| 16.007404 |
llama2
| 4 | 13 | true | false | false | false | 4.36356 | 0.305649 | 30.564901 | 0.470229 | 25.45442 | 0.021148 | 2.114804 | 0.283557 | 4.474273 | 0.461094 | 17.203385 | 0.246094 | 16.232639 | false | false |
2023-08-12
|
2024-06-26
| 0 |
pankajmathur/model_007_13b_v2
|
pankajmathur_orca_mini_3b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/orca_mini_3b
|
31e1a7bc3f7ea2f247b432d60036d975b8d590e9
| 3.125275 |
cc-by-nc-sa-4.0
| 161 | 3.426 | true | false | false | false | 1.049551 | 0.074214 | 7.42142 | 0.319607 | 4.685985 | 0.008308 | 0.830816 | 0.245805 | 0 | 0.334927 | 4.199219 | 0.114528 | 1.614214 | false | false |
2023-06-22
|
2024-06-26
| 0 |
pankajmathur/orca_mini_3b
|
pankajmathur_orca_mini_7b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/orca_mini_7b
|
fec86e316b7b98d7be6cf74e98fb927092077abb
| 3.405696 |
cc-by-nc-sa-4.0
| 18 | 7 | true | false | false | false | 0.521441 | 0.041216 | 4.12162 | 0.333223 | 7.81893 | 0.01284 | 1.283988 | 0.254195 | 0.559284 | 0.36975 | 3.91875 | 0.124584 | 2.731605 | false | false |
2023-06-23
|
2024-06-26
| 0 |
pankajmathur/orca_mini_7b
|
pankajmathur_orca_mini_phi-4_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_phi-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_phi-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_phi-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/orca_mini_phi-4
|
d060fa835868ce422521daf7054dbc64ad48aee3
| 40.676282 |
mit
| 8 | 14.66 | true | false | false | true | 1.70077 | 0.778059 | 77.805888 | 0.685633 | 54.63137 | 0.295317 | 29.531722 | 0.374161 | 16.55481 | 0.470302 | 18.254427 | 0.525515 | 47.279477 | false | false |
2025-01-21
|
2025-01-22
| 1 |
pankajmathur/orca_mini_phi-4 (Merge)
|
pankajmathur_orca_mini_v2_7b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v2_7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v2_7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v2_7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/orca_mini_v2_7b
|
66d3f32a4a6bca0a2a261f1bdb54d2582028f75f
| 5.502369 |
cc-by-nc-sa-4.0
| 37 | 7 | true | false | false | false | 1.185023 | 0.135789 | 13.57886 | 0.353634 | 10.199953 | 0.011329 | 1.132931 | 0.249161 | 0 | 0.359333 | 2.083333 | 0.154172 | 6.019134 | false | false |
2023-07-03
|
2024-06-26
| 0 |
pankajmathur/orca_mini_v2_7b
|
pankajmathur_orca_mini_v3_13b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v3_13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v3_13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v3_13b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/orca_mini_v3_13b
|
7d6e567d24ce2f228beaf54e89c17b0e750bfe99
| 15.041297 |
other
| 31 | 13 | true | false | false | false | 2.194359 | 0.289663 | 28.966254 | 0.471097 | 25.549482 | 0.021148 | 2.114804 | 0.265101 | 2.013423 | 0.459792 | 17.107292 | 0.230469 | 14.496528 | false | false |
2023-08-09
|
2024-06-26
| 0 |
pankajmathur/orca_mini_v3_13b
|
pankajmathur_orca_mini_v3_70b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v3_70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v3_70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v3_70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/orca_mini_v3_70b
|
e8e856dfb5c737d1906b50f9e65fd3a4f8d77422
| 25.298159 |
other
| 23 | 70 | true | false | false | false | 12.813074 | 0.40147 | 40.147032 | 0.594931 | 42.975787 | 0.03852 | 3.851964 | 0.317953 | 9.060403 | 0.507854 | 25.115104 | 0.375748 | 30.638667 | false | false |
2023-08-10
|
2024-06-26
| 0 |
pankajmathur/orca_mini_v3_70b
|
pankajmathur_orca_mini_v3_7b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v3_7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v3_7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v3_7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/orca_mini_v3_7b
|
6252eb7ca29da8d951ae7d2bca948bf84e04a2b9
| 13.644021 |
other
| 40 | 7 | true | false | false | false | 1.2799 | 0.282094 | 28.209373 | 0.409533 | 17.843956 | 0.010574 | 1.057402 | 0.246644 | 0 | 0.49824 | 22.713281 | 0.208361 | 12.040115 | false | false |
2023-08-07
|
2024-06-26
| 0 |
pankajmathur/orca_mini_v3_7b
|
pankajmathur_orca_mini_v5_8b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v5_8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v5_8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v5_8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/orca_mini_v5_8b
|
f57c84d4cc0b3b74549458c0d38e868bd7fffad1
| 20.498301 |
llama3
| 2 | 8.03 | true | false | false | false | 1.757942 | 0.480605 | 48.06048 | 0.506424 | 29.345795 | 0.098943 | 9.89426 | 0.286913 | 4.9217 | 0.40001 | 7.701302 | 0.307596 | 23.066268 | false | false |
2024-05-26
|
2024-06-26
| 0 |
pankajmathur/orca_mini_v5_8b
|
pankajmathur_orca_mini_v5_8b_dpo_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v5_8b_dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v5_8b_dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v5_8b_dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/orca_mini_v5_8b_dpo
|
fdc0d0aaa85a58f1abaf2c24ce0ddca10c08f0f1
| 20.334207 |
llama3
| 3 | 8 | true | false | false | false | 1.633379 | 0.489647 | 48.964747 | 0.50746 | 29.605373 | 0.097432 | 9.743202 | 0.274329 | 3.243848 | 0.389375 | 6.938542 | 0.311586 | 23.50953 | false | false |
2024-05-30
|
2024-06-26
| 0 |
pankajmathur/orca_mini_v5_8b_dpo
|
pankajmathur_orca_mini_v5_8b_orpo_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v5_8b_orpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v5_8b_orpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v5_8b_orpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/orca_mini_v5_8b_orpo
|
4cdc018043ef439f15bd8a09c4f09c6bc528dfc7
| 12.99373 |
llama3
| 1 | 8 | true | false | false | false | 1.9437 | 0.082432 | 8.243239 | 0.496374 | 27.877628 | 0.066465 | 6.646526 | 0.284396 | 4.58613 | 0.413125 | 8.973958 | 0.294714 | 21.6349 | false | false |
2024-05-31
|
2024-06-26
| 0 |
pankajmathur/orca_mini_v5_8b_orpo
|
pankajmathur_orca_mini_v6_8b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v6_8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v6_8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v6_8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/orca_mini_v6_8b
|
e95dc8e4c6b6ca5957b657cc2d905683142eaf3e
| 1.476339 |
llama3
| 2 | 8.03 | true | false | false | true | 2.432736 | 0.011116 | 1.111606 | 0.30287 | 3.21981 | 0.003776 | 0.377644 | 0.238255 | 0 | 0.355458 | 2.765625 | 0.11245 | 1.383348 | false | false |
2024-06-02
|
2024-06-26
| 0 |
pankajmathur/orca_mini_v6_8b
|
pankajmathur_orca_mini_v6_8b_dpo_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v6_8b_dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v6_8b_dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v6_8b_dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/orca_mini_v6_8b_dpo
|
ebb11b63839d38e8c03c7ecac012e047fcb2346e
| 20.392492 |
llama3
| 2 | 8 | true | false | false | false | 1.539647 | 0.388256 | 38.825649 | 0.520281 | 32.478826 | 0.061178 | 6.117825 | 0.301174 | 6.823266 | 0.409031 | 9.26224 | 0.359624 | 28.847148 | false | false |
2024-06-21
|
2024-06-26
| 0 |
pankajmathur/orca_mini_v6_8b_dpo
|
pankajmathur_orca_mini_v7_72b_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v7_72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v7_72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v7_72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/orca_mini_v7_72b
|
447f11912cfa496e32e188a55214043a05760d3a
| 36.215291 |
apache-2.0
| 11 | 72.706 | true | false | false | true | 28.103414 | 0.592962 | 59.296223 | 0.68423 | 55.055523 | 0.093656 | 9.365559 | 0.385067 | 18.008949 | 0.507042 | 24.213542 | 0.562168 | 51.35195 | false | false |
2024-06-26
|
2025-01-02
| 0 |
pankajmathur/orca_mini_v7_72b
|
pankajmathur_orca_mini_v7_7b_bfloat16
|
bfloat16
|
🔶 fine-tuned on domain-specific datasets
|
🔶
|
Original
|
Qwen2ForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v7_7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v7_7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v7_7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/orca_mini_v7_7b
|
f5e84ff6ea25fb4585908ea45d1520bac416d803
| 23.986504 |
apache-2.0
| 2 | 7.616 | true | false | false | false | 1.850219 | 0.438765 | 43.87647 | 0.527491 | 33.950434 | 0.120846 | 12.084592 | 0.296141 | 6.152125 | 0.435979 | 12.664063 | 0.416722 | 35.191342 | false | false |
2024-06-20
|
2024-06-26
| 0 |
pankajmathur/orca_mini_v7_7b
|
pankajmathur_orca_mini_v8_1_70b_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
LlamaForCausalLM
|
<a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v8_1_70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v8_1_70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v8_1_70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
pankajmathur/orca_mini_v8_1_70b
|
84663295413819491b08cd3b7e50d04a5eb0bb1a
| 43.191232 |
llama3.3
| 5 | 70.554 | true | false | false | true | 54.448937 | 0.857143 | 85.714349 | 0.678131 | 53.519727 | 0.352719 | 35.271903 | 0.432886 | 24.384787 | 0.443708 | 15.996875 | 0.498338 | 44.259752 | false | false |
2024-12-12
|
2024-12-21
| 1 |
pankajmathur/orca_mini_v8_1_70b (Merge)
|
Subsets and Splits

Saved queries on the open-llm-leaderboard/contents dataset (the shared filter-and-rank pattern is sketched in Python after this list):

Top 100 Official Models < 70: identifies the top 100 high-scoring, officially provided models with fewer than 70 billion parameters, a useful overview for comparing performance metrics.
Top 100 Official Models < 20: identifies top-performing models with fewer than 20 billion parameters, giving insight into the efficiency and precision of smaller models.
Top 500 Official Models by Score: ranks models by a combined IFEval and MMLU-PRO score, filtered by official provider status and parameter count.
Top 200 Official Models by Score: surfaces high-performing models with fewer than 70 billion parameters alongside their evaluation scores and characteristics, useful for model selection and optimization.
SQL Console for open-llm-leaderboard/contents: identifies top-performing models with fewer than 70 billion parameters, combining two evaluation metrics to reveal the best-balanced options.
Top 10 Official Leaderboard Models: lists the top 10 officially provided models with under 13 billion parameters, ordered by their average metric.
SQL Console for open-llm-leaderboard/contents: filters and ranks models in the 6-8 billion parameter range with the LlamaForCausalLM architecture by their average performance metric.
SQL Console for open-llm-leaderboard/contents: retrieves officially provided chat models, a filtered view of the dataset.
SQL Console for open-llm-leaderboard/contents: retrieves entries marked as "Official Providers"; basic filtering with limited analytical value.
Top 10 Official Training Data: retrieves a small sample of records from the 'train' split where the "Official Providers" flag is true; basic filtering with limited analytical value.
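All of these saved queries follow the same shape: filter the contents table on provider status, parameter count, or architecture, then rank by one or more score columns. Below is a minimal sketch of that pattern in Python, not the leaderboard's own tooling; it assumes the datasets and pandas packages and column names such as "Official Providers", "#Params (B)", "Average ⬆️", "IFEval", and "MMLU-PRO", which may need adjusting to the actual schema.

```python
# Minimal sketch (assumed, not official leaderboard tooling) reproducing the
# "Top 100 Official Models < 70" query with the Hugging Face datasets library.
# Column names below are assumed to match the open-llm-leaderboard/contents schema.
from datasets import load_dataset

# Load the contents table that the SQL Console entries above run against.
contents = load_dataset("open-llm-leaderboard/contents", split="train")
df = contents.to_pandas()

# Officially provided models with fewer than 70B parameters,
# ranked by the aggregate "Average ⬆️" column (boolean/numeric dtypes assumed).
mask = (df["Official Providers"] == True) & (df["#Params (B)"] < 70)
top100 = (
    df[mask]
    .sort_values("Average ⬆️", ascending=False)
    .head(100)[["fullname", "#Params (B)", "Average ⬆️", "IFEval", "MMLU-PRO"]]
)
print(top100.to_string(index=False))
```

The other queries vary only the thresholds (20 or 13 billion parameters, a 6-8 billion range, a specific architecture) or the ranking key (for example a combined IFEval and MMLU-PRO score), so each can be reproduced by adjusting the mask and the sort_values call.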