eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
zelk12_Test01012025155054_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/Test01012025155054](https://huggingface.co/zelk12/Test01012025155054) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Test01012025155054-details) | zelk12/Test01012025155054 | c607186b0b079975e3305e0223e0a55f0cbc19e5 | 3.591417 | | 0 | 3.817 | false | false | false | true | 1.400948 | 0.155523 | 15.55229 | 0.28295 | 1.280547 | 0 | 0 | 0.241611 | 0 | 0.367021 | 3.710937 | 0.109043 | 1.004728 | false | false | 2025-01-01 | 2025-01-01 | 1 | zelk12/Test01012025155054 (Merge) |
zelk12_Test01012025155054t0.5_gemma-2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/Test01012025155054t0.5_gemma-2](https://huggingface.co/zelk12/Test01012025155054t0.5_gemma-2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Test01012025155054t0.5_gemma-2-details) | zelk12/Test01012025155054t0.5_gemma-2 | 14fcae0d420d303df84bd9b9c8744a6f0fa147fb | 3.591417 | | 0 | 3.817 | false | false | false | true | 1.395928 | 0.155523 | 15.55229 | 0.28295 | 1.280547 | 0 | 0 | 0.241611 | 0 | 0.367021 | 3.710937 | 0.109043 | 1.004728 | false | false | 2025-01-01 | 2025-01-01 | 1 | zelk12/Test01012025155054t0.5_gemma-2 (Merge) |
zelk12_gemma-2-S2MTM-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/gemma-2-S2MTM-9B](https://huggingface.co/zelk12/gemma-2-S2MTM-9B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__gemma-2-S2MTM-9B-details) | zelk12/gemma-2-S2MTM-9B | fd6860743943114eeca6fc2e800e27c87873bcc5 | 33.89283 | gemma | 0 | 10.159 | true | false | false | true | 3.530205 | 0.782256 | 78.225553 | 0.606084 | 43.115728 | 0.204683 | 20.468278 | 0.345638 | 12.751678 | 0.421844 | 12.163802 | 0.429688 | 36.631944 | true | false | 2024-12-11 | 2024-12-11 | 1 | zelk12/gemma-2-S2MTM-9B (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1](https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-details) | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1 | b4208ddf6c741884c16c77b9433d9ead8f216354 | 33.919919 | | 2 | 10.159 | false | false | false | true | 6.886383 | 0.764895 | 76.489492 | 0.607451 | 43.706516 | 0.228097 | 22.809668 | 0.349832 | 13.310962 | 0.413625 | 10.303125 | 0.432098 | 36.899749 | false | false | 2024-10-03 | 2024-10-03 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1 (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25](https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25-details) | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 | e652c9e07265526851dad994f4640aa265b9ab56 | 34.282119 | | 1 | 10.159 | false | false | false | true | 6.389981 | 0.770665 | 77.066517 | 0.607543 | 43.85035 | 0.214502 | 21.450151 | 0.343121 | 12.416107 | 0.43226 | 13.132552 | 0.439993 | 37.777039 | false | false | 2024-10-04 | 2024-10-04 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75](https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75-details) | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75 | eb0e589291630ba20328db650f74af949d217a97 | 31.782789 | | 0 | 10.159 | false | false | false | true | 7.502906 | 0.720806 | 72.080635 | 0.59952 | 42.487153 | 0.201662 | 20.166163 | 0.349832 | 13.310962 | 0.395115 | 7.75599 | 0.414063 | 34.895833 | false | false | 2024-10-04 | 2024-10-04 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75 (Merge) |
zelk12_recoilme-gemma-2-Ataraxy-9B-v0.2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2](https://huggingface.co/zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.2-details) | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2 | 76f56b25bf6d8704282f8c77bfda28ca384883bc | 33.626064 | | 1 | 10.159 | false | false | false | true | 6.827351 | 0.759999 | 75.999902 | 0.606626 | 43.633588 | 0.22281 | 22.280967 | 0.348154 | 13.087248 | 0.410958 | 9.836458 | 0.432264 | 36.918218 | false | false | 2024-10-07 | 2024-10-11 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2 (Merge) |
zelk12_recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1](https://huggingface.co/zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1-details) | zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 | 1e3e623e9f0b386bfd967c629dd39c87daef5bed | 33.904825 | | 1 | 10.159 | false | false | false | true | 9.69897 | 0.761523 | 76.152276 | 0.609878 | 43.941258 | 0.20997 | 20.996979 | 0.341443 | 12.192394 | 0.431021 | 13.310937 | 0.431516 | 36.835106 | false | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 (Merge) |
zelk12_recoilme-gemma-2-Ifable-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ifable-9B-v0.1](https://huggingface.co/zelk12/recoilme-gemma-2-Ifable-9B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ifable-9B-v0.1-details) | zelk12/recoilme-gemma-2-Ifable-9B-v0.1 | 8af6620b39c9a36239879b6b2bd88f66e9e9d930 | 34.406991 | | 0 | 10.159 | false | false | false | true | 9.808856 | 0.794396 | 79.439554 | 0.60644 | 43.39057 | 0.220544 | 22.054381 | 0.35151 | 13.534676 | 0.420229 | 11.095313 | 0.432347 | 36.927453 | false | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-Ifable-9B-v0.1 (Merge) |
zelk12_recoilme-gemma-2-psy10k-mental_healt-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1](https://huggingface.co/zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-psy10k-mental_healt-9B-v0.1-details) | zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 | ced039b03be6f65ac0f713efcee76c6534e65639 | 32.586531 | | 1 | 10.159 | false | false | false | true | 6.264441 | 0.744537 | 74.453672 | 0.597759 | 42.132683 | 0.188822 | 18.882175 | 0.34396 | 12.527964 | 0.429469 | 12.183594 | 0.418052 | 35.339096 | false | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 (Merge) |
zetasepic_Qwen2.5-32B-Instruct-abliterated-v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [zetasepic/Qwen2.5-32B-Instruct-abliterated-v2](https://huggingface.co/zetasepic/Qwen2.5-32B-Instruct-abliterated-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zetasepic__Qwen2.5-32B-Instruct-abliterated-v2-details) | zetasepic/Qwen2.5-32B-Instruct-abliterated-v2 | 5894fbf0a900e682dfc0ed794db337093bd8d26b | 46.888673 | apache-2.0 | 9 | 32.764 | true | false | false | true | 13.489578 | 0.833413 | 83.341312 | 0.693402 | 56.533818 | 0.595166 | 59.516616 | 0.36745 | 15.659955 | 0.435427 | 14.928385 | 0.562168 | 51.35195 | false | false | 2024-10-11 | 2024-12-07 | 2 | Qwen/Qwen2.5-32B |
zetasepic_Qwen2.5-72B-Instruct-abliterated_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [zetasepic/Qwen2.5-72B-Instruct-abliterated](https://huggingface.co/zetasepic/Qwen2.5-72B-Instruct-abliterated) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zetasepic__Qwen2.5-72B-Instruct-abliterated-details) | zetasepic/Qwen2.5-72B-Instruct-abliterated | af94b3c05c9857dbac73afb1cbce00e4833ec9ef | 46.337953 | other | 27 | 72.706 | true | false | false | false | 37.618363 | 0.715261 | 71.526106 | 0.715226 | 59.912976 | 0.524169 | 52.416918 | 0.406879 | 20.917226 | 0.471917 | 19.122917 | 0.587184 | 54.131575 | false | false | 2024-10-01 | 2024-11-08 | 2 | Qwen/Qwen2.5-72B |
zhengr_MixTAO-7Bx2-MoE-v8.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | [zhengr/MixTAO-7Bx2-MoE-v8.1](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-v8.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zhengr__MixTAO-7Bx2-MoE-v8.1-details) | zhengr/MixTAO-7Bx2-MoE-v8.1 | 828e963abf2db0f5af9ed0d4034e538fc1cf5f40 | 17.067606 | apache-2.0 | 55 | 12.879 | true | true | false | true | 1.85478 | 0.418781 | 41.878106 | 0.420194 | 19.176907 | 0.060423 | 6.042296 | 0.298658 | 6.487696 | 0.397625 | 8.303125 | 0.284658 | 20.517509 | false | false | 2024-02-26 | 2024-06-27 | 0 | zhengr/MixTAO-7Bx2-MoE-v8.1 |
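For reference, here is a minimal sketch of loading and querying this table programmatically. The repo id `open-llm-leaderboard/contents` is an assumption about where the leaderboard publishes this table, and the `normalize` helper is an inferred reading of the paired `Raw`/scaled columns: each raw accuracy appears rescaled so the task's random-guess baseline maps to 0 and a perfect score to 100 (the GPQA rows above are consistent with a 0.25 four-way-choice baseline; tasks such as BBH and MUSR seem to use per-subtask baselines, so a single global formula will not reproduce every column).

```python
# Sketch: load the leaderboard table and inspect it with pandas.
# Assumption: the table lives in the `open-llm-leaderboard/contents` dataset.
from datasets import load_dataset

df = load_dataset("open-llm-leaderboard/contents", split="train").to_pandas()

# Drop flagged models and rank by the headline average score.
top = (
    df[~df["Flagged"]]
    .sort_values("Average ⬆️", ascending=False)
    .loc[:, ["fullname", "Average ⬆️", "#Params (B)", "Hub License"]]
    .head(10)
)
print(top)


def normalize(raw: float, baseline: float) -> float:
    """Rescale a raw accuracy so the task's random-guess baseline maps to 0
    and a perfect score maps to 100, clamping negatives to 0 (inferred, not
    an official formula)."""
    return max(0.0, (raw - baseline) / (1.0 - baseline)) * 100.0


# GPQA is four-way multiple choice, so its random baseline is 0.25.
# zelk12/gemma-2-S2MTM-9B above: raw 0.345638 -> ~12.75, matching its GPQA column.
print(normalize(0.345638, 0.25))  # ≈ 12.7517
```

The clamp at 0 would also explain the exact zeros in the weakest rows: a GPQA raw score of 0.241611 sits below the 0.25 guessing baseline and maps to 0.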