Column dtypes (dataset viewer summary): eval_name string (12 to 111 chars); Precision string (3 classes); Type string (7 classes); T string (7 classes); Weight type string (2 classes); Architecture string (64 classes); Model string (355 to 689 chars); fullname string (4 to 102 chars); Model sha string (0 to 40 chars); Average ⬆️ float64 (0.74 to 52.1); Hub License string (27 classes); Hub ❤️ int64 (0 to 6.09k); #Params (B) float64 (-1 to 141); Available on the hub, MoE, Flagged, Chat Template, Merged, Official Providers bool (2 classes each); CO₂ cost (kg) float64 (0.04 to 187); IFEval Raw float64 (0 to 0.9); IFEval float64 (0 to 90); BBH Raw float64 (0.22 to 0.83); BBH float64 (0.25 to 76.7); MATH Lvl 5 Raw float64 (0 to 0.71); MATH Lvl 5 float64 (0 to 71.5); GPQA Raw float64 (0.21 to 0.47); GPQA float64 (0 to 29.4); MUSR Raw float64 (0.29 to 0.6); MUSR float64 (0 to 38.7); MMLU-PRO Raw float64 (0.1 to 0.73); MMLU-PRO float64 (0 to 70); Upload To Hub Date string (525 classes); Submission Date string (263 classes); Generation int64 (0 to 10); Base Model string (4 to 102 chars).

eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
pankajmathur_orca_mini_v9_0_3B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [pankajmathur/orca_mini_v9_0_3B-Instruct](https://huggingface.co/pankajmathur/orca_mini_v9_0_3B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v9_0_3B-Instruct-details) | pankajmathur/orca_mini_v9_0_3B-Instruct | 37710875f7841e72c99cd5494cf450bb5bd6c680 | 20.660838 | llama3.2 | 5 | 3.213 | true | false | false | true | 1.202363 | 0.575377 | 57.537667 | 0.441295 | 21.368153 | 0.146526 | 14.652568 | 0.301174 | 6.823266 | 0.365906 | 5.771615 | 0.260306 | 17.811761 | false | false | 2024-12-27 | 2024-12-29 | 1 | pankajmathur/orca_mini_v9_0_3B-Instruct (Merge) |
pankajmathur_orca_mini_v9_1_1B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [pankajmathur/orca_mini_v9_1_1B-Instruct](https://huggingface.co/pankajmathur/orca_mini_v9_1_1B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v9_1_1B-Instruct-details) | pankajmathur/orca_mini_v9_1_1B-Instruct | 2c4cc6dacbff82ec76845fcc770322318742e794 | 9.063246 | llama3.2 | 3 | 1.236 | true | false | false | true | 0.713576 | 0.362927 | 36.292703 | 0.320512 | 6.406449 | 0.046073 | 4.607251 | 0.256711 | 0.894855 | 0.338063 | 2.024479 | 0.137384 | 4.153738 | false | false | 2024-12-27 | 2024-12-29 | 1 | pankajmathur/orca_mini_v9_1_1B-Instruct (Merge) |
pankajmathur_orca_mini_v9_2_14B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [pankajmathur/orca_mini_v9_2_14B](https://huggingface.co/pankajmathur/orca_mini_v9_2_14B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v9_2_14B-details) | pankajmathur/orca_mini_v9_2_14B | fc8e88751753f1757dc84d5ce0ad2384450645a2 | 40.676282 | mit | 8 | 14.66 | true | false | false | true | 1.881417 | 0.778059 | 77.805888 | 0.685633 | 54.63137 | 0.295317 | 29.531722 | 0.374161 | 16.55481 | 0.470302 | 18.254427 | 0.525515 | 47.279477 | false | false | 2025-01-21 | 2025-01-21 | 1 | pankajmathur/orca_mini_v9_2_14B (Merge) |
pankajmathur_orca_mini_v9_2_70b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [pankajmathur/orca_mini_v9_2_70b](https://huggingface.co/pankajmathur/orca_mini_v9_2_70b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v9_2_70b-details) | pankajmathur/orca_mini_v9_2_70b | 19f021126bc52484fd60fa5daeff59219592e534 | 40.724777 | llama3.3 | 4 | 70.554 | true | false | false | true | 48.05524 | 0.838259 | 83.825915 | 0.674487 | 53.0331 | 0.293807 | 29.380665 | 0.373322 | 16.442953 | 0.47099 | 19.207031 | 0.482131 | 42.458998 | false | false | 2024-12-30 | 2025-01-02 | 1 | pankajmathur/orca_mini_v9_2_70b (Merge) |
pankajmathur_orca_mini_v9_4_70B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [pankajmathur/orca_mini_v9_4_70B](https://huggingface.co/pankajmathur/orca_mini_v9_4_70B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v9_4_70B-details) | pankajmathur/orca_mini_v9_4_70B | 6538f7ad108c90d4aeb317a90eadaf489d687319 | 39.325507 | llama3.3 | 2 | 70.554 | true | false | false | true | 60.690196 | 0.801465 | 80.146456 | 0.64189 | 48.692617 | 0.326284 | 32.628399 | 0.365772 | 15.436242 | 0.464729 | 19.757813 | 0.453624 | 39.291519 | false | false | 2025-01-17 | 2025-01-17 | 1 | pankajmathur/orca_mini_v9_4_70B (Merge) |
pankajmathur_orca_mini_v9_5_1B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [pankajmathur/orca_mini_v9_5_1B-Instruct](https://huggingface.co/pankajmathur/orca_mini_v9_5_1B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v9_5_1B-Instruct-details) | pankajmathur/orca_mini_v9_5_1B-Instruct | eaf758bef610953480309044303c8c15985ac24d | 10.693502 | llama3.2 | 4 | 1.236 | true | false | false | true | 1.091075 | 0.463794 | 46.379384 | 0.3337 | 6.698817 | 0.030211 | 3.021148 | 0.270134 | 2.684564 | 0.318156 | 1.269531 | 0.136968 | 4.107565 | false | false | 2025-01-02 | 2025-01-02 | 1 | pankajmathur/orca_mini_v9_5_1B-Instruct (Merge) |
pankajmathur_orca_mini_v9_5_1B-Instruct_preview_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [pankajmathur/orca_mini_v9_5_1B-Instruct_preview](https://huggingface.co/pankajmathur/orca_mini_v9_5_1B-Instruct_preview) [📑](https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v9_5_1B-Instruct_preview-details) | pankajmathur/orca_mini_v9_5_1B-Instruct_preview | 7f4581135998269b83f79624a2435cc314f5f45b | 9.541823 | llama3.2 | 2 | 1.236 | true | false | false | true | 0.715596 | 0.393577 | 39.357682 | 0.327695 | 5.582692 | 0.03852 | 3.851964 | 0.263423 | 1.789709 | 0.339458 | 3.032292 | 0.132729 | 3.636599 | false | false | 2024-12-30 | 2024-12-30 | 1 | pankajmathur/orca_mini_v9_5_1B-Instruct_preview (Merge) |
pankajmathur_orca_mini_v9_5_3B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [pankajmathur/orca_mini_v9_5_3B-Instruct](https://huggingface.co/pankajmathur/orca_mini_v9_5_3B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v9_5_3B-Instruct-details) | pankajmathur/orca_mini_v9_5_3B-Instruct | 9d68ed7de708f52e8fa3b173fb7315a941d45b9c | 24.152681 | llama3.2 | 6 | 3.213 | true | false | false | true | 1.112053 | 0.720707 | 72.070661 | 0.449638 | 21.517904 | 0.132175 | 13.217523 | 0.286913 | 4.9217 | 0.42699 | 12.273698 | 0.288231 | 20.914598 | false | false | 2025-01-01 | 2025-01-01 | 1 | pankajmathur/orca_mini_v9_5_3B-Instruct (Merge) |
pankajmathur_orca_mini_v9_6_1B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [pankajmathur/orca_mini_v9_6_1B-Instruct](https://huggingface.co/pankajmathur/orca_mini_v9_6_1B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v9_6_1B-Instruct-details) | pankajmathur/orca_mini_v9_6_1B-Instruct | 6219f36f9cb41a659ca721e74b70364dda0a9a8a | 15.323671 | llama3.2 | 6 | 1.236 | true | false | false | true | 0.761007 | 0.608574 | 60.857414 | 0.356135 | 9.659037 | 0.077039 | 7.703927 | 0.268456 | 2.46085 | 0.339552 | 2.277344 | 0.180851 | 8.983452 | false | false | 2025-01-06 | 2025-01-06 | 1 | pankajmathur/orca_mini_v9_6_1B-Instruct (Merge) |
pankajmathur_orca_mini_v9_6_3B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [pankajmathur/orca_mini_v9_6_3B-Instruct](https://huggingface.co/pankajmathur/orca_mini_v9_6_3B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v9_6_3B-Instruct-details) | pankajmathur/orca_mini_v9_6_3B-Instruct | 2cd07c6364b883d29036b9c8fe1816221b693d71 | 24.086826 | llama3.2 | 4 | 3.213 | true | false | false | true | 1.662569 | 0.731648 | 73.164758 | 0.456833 | 22.86989 | 0.132931 | 13.293051 | 0.293624 | 5.816555 | 0.406771 | 8.813021 | 0.285073 | 20.563682 | false | false | 2025-01-03 | 2025-01-03 | 1 | pankajmathur/orca_mini_v9_6_3B-Instruct (Merge) |
pankajmathur_orca_mini_v9_7_1B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [pankajmathur/orca_mini_v9_7_1B-Instruct](https://huggingface.co/pankajmathur/orca_mini_v9_7_1B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v9_7_1B-Instruct-details) | pankajmathur/orca_mini_v9_7_1B-Instruct | b9e52b91802bd4ae941c0d328e9fa7818e0ce504 | 12.485692 | llama3.2 | 4 | 1.236 | true | false | false | true | 0.851215 | 0.561014 | 56.101367 | 0.318153 | 5.052028 | 0.044562 | 4.456193 | 0.272651 | 3.020134 | 0.352698 | 2.453906 | 0.134475 | 3.830526 | false | false | 2025-01-04 | 2025-01-05 | 1 | pankajmathur/orca_mini_v9_7_1B-Instruct (Merge) |
pankajmathur_orca_mini_v9_7_3B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [pankajmathur/orca_mini_v9_7_3B-Instruct](https://huggingface.co/pankajmathur/orca_mini_v9_7_3B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v9_7_3B-Instruct-details) | pankajmathur/orca_mini_v9_7_3B-Instruct | 46298c9816c60d18d8b1217a540b75a0a8cf9aab | 13.034703 | llama3.2 | 4 | 3.213 | true | false | false | true | 1.071098 | 0.561838 | 56.183815 | 0.329713 | 6.301039 | 0.061934 | 6.193353 | 0.261745 | 1.565996 | 0.361875 | 3.801042 | 0.137467 | 4.162973 | false | false | 2025-01-05 | 2025-01-05 | 1 | pankajmathur/orca_mini_v9_7_3B-Instruct (Merge) |
paulml_ECE-ILAB-Q1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [paulml/ECE-ILAB-Q1](https://huggingface.co/paulml/ECE-ILAB-Q1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/paulml__ECE-ILAB-Q1-details) | paulml/ECE-ILAB-Q1 | 393bea0ee85e4c752acd5fd77ce07f577fc13bd9 | 42.503072 | other | 0 | 72.706 | true | false | false | false | 22.830284 | 0.786452 | 78.645217 | 0.671776 | 53.702228 | 0.35574 | 35.574018 | 0.386745 | 18.232662 | 0.461375 | 18.805208 | 0.550532 | 50.059102 | true | false | 2024-06-06 | 2024-09-16 | 0 | paulml/ECE-ILAB-Q1 |
pints-ai_1.5-Pints-16K-v0.1_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | [pints-ai/1.5-Pints-16K-v0.1](https://huggingface.co/pints-ai/1.5-Pints-16K-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/pints-ai__1.5-Pints-16K-v0.1-details) | pints-ai/1.5-Pints-16K-v0.1 | 7862a52f250be68fad593f3a4030f00d658ede56 | 4.250928 | mit | 14 | 1.566 | true | false | false | true | 0.559877 | 0.163591 | 16.359149 | 0.313308 | 3.658292 | 0.01435 | 1.435045 | 0.235738 | 0 | 0.357875 | 2.734375 | 0.111868 | 1.318706 | false | false | 2024-08-07 | 2024-09-09 | 0 | pints-ai/1.5-Pints-16K-v0.1 |
pints-ai_1.5-Pints-2K-v0.1_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | [pints-ai/1.5-Pints-2K-v0.1](https://huggingface.co/pints-ai/1.5-Pints-2K-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/pints-ai__1.5-Pints-2K-v0.1-details) | pints-ai/1.5-Pints-2K-v0.1 | 2e865c18669161ebbf5e9ad79ae0502ee0153df0 | 4.04444 | mit | 16 | 1.566 | true | false | false | true | 0.582833 | 0.176156 | 17.615593 | 0.298019 | 2.37447 | 0.01284 | 1.283988 | 0.248322 | 0 | 0.350187 | 1.840104 | 0.110372 | 1.152482 | false | false | 2024-08-07 | 2024-09-09 | 0 | pints-ai/1.5-Pints-2K-v0.1 |
piotr25691_thea-3b-25r_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [piotr25691/thea-3b-25r](https://huggingface.co/piotr25691/thea-3b-25r) [📑](https://huggingface.co/datasets/open-llm-leaderboard/piotr25691__thea-3b-25r-details) | piotr25691/thea-3b-25r | 4661fb3c8b18bdf2059f703c4f69caea24057151 | 23.996071 | llama3.2 | 1 | 3.213 | true | false | false | true | 1.381015 | 0.73442 | 73.442023 | 0.448441 | 22.546711 | 0.178248 | 17.824773 | 0.267617 | 2.348993 | 0.331458 | 3.565625 | 0.318235 | 24.248301 | false | false | 2024-10-11 | 2024-10-12 | 1 | chuanli11/Llama-3.2-3B-Instruct-uncensored |
piotr25691_thea-c-3b-25r_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [piotr25691/thea-c-3b-25r](https://huggingface.co/piotr25691/thea-c-3b-25r) [📑](https://huggingface.co/datasets/open-llm-leaderboard/piotr25691__thea-c-3b-25r-details) | piotr25691/thea-c-3b-25r | 93a2333a84feda26f020bc8fa92f870462dacd89 | 23.254796 | llama3.2 | 1 | 3.213 | true | false | false | true | 1.324912 | 0.74019 | 74.019047 | 0.453241 | 22.76785 | 0.152568 | 15.256798 | 0.265101 | 2.013423 | 0.33149 | 1.269531 | 0.317819 | 24.202128 | false | false | 2024-10-14 | 2024-10-17 | 1 | meta-llama/Llama-3.2-3B-Instruct |
piotr25691_thea-rp-3b-25r_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [piotr25691/thea-rp-3b-25r](https://huggingface.co/piotr25691/thea-rp-3b-25r) [📑](https://huggingface.co/datasets/open-llm-leaderboard/piotr25691__thea-rp-3b-25r-details) | piotr25691/thea-rp-3b-25r | ed4c338e07356f1657cf4d08b768ff866bbf0a68 | 21.845382 | llama3.2 | 1 | 3.213 | true | false | false | true | 1.316907 | 0.657784 | 65.778357 | 0.439029 | 20.007381 | 0.132175 | 13.217523 | 0.274329 | 3.243848 | 0.381875 | 5.934375 | 0.306017 | 22.89081 | false | false | 2024-10-13 | 2024-10-16 | 2 | SicariusSicariiStuff/Impish_LLAMA_3B (Merge) |
postbot_gpt2-medium-emailgen_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | GPT2LMHeadModel | [postbot/gpt2-medium-emailgen](https://huggingface.co/postbot/gpt2-medium-emailgen) [📑](https://huggingface.co/datasets/open-llm-leaderboard/postbot__gpt2-medium-emailgen-details) | postbot/gpt2-medium-emailgen | a0299eb6760126e3bd04d2f10cd166c4563f82d2 | 4.743048 | apache-2.0 | 6 | 0.38 | true | false | false | false | 0.156373 | 0.149203 | 14.9203 | 0.313043 | 3.6737 | 0 | 0 | 0.260067 | 1.342282 | 0.391115 | 6.889323 | 0.114694 | 1.632683 | false | false | 2022-09-29 | 2024-11-17 | 0 | postbot/gpt2-medium-emailgen |
prince-canuma_Ministral-8B-Instruct-2410-HF_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [prince-canuma/Ministral-8B-Instruct-2410-HF](https://huggingface.co/prince-canuma/Ministral-8B-Instruct-2410-HF) [📑](https://huggingface.co/datasets/open-llm-leaderboard/prince-canuma__Ministral-8B-Instruct-2410-HF-details) | prince-canuma/Ministral-8B-Instruct-2410-HF | e0a14d7a6a8a1d1e5bef1a77a42e86e8bcae0ee7 | 23.744748 | other | 10 | 8.02 | true | false | false | true | 2.033869 | 0.591164 | 59.116367 | 0.458561 | 23.778465 | 0.191843 | 19.18429 | 0.28104 | 4.138702 | 0.41375 | 10.71875 | 0.329787 | 25.531915 | false | false | 2024-10-16 | 2024-10-17 | 1 | prince-canuma/Ministral-8B-Instruct-2410-HF (Merge) |
princeton-nlp_Llama-3-8B-ProLong-512k-Base_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-8B-ProLong-512k-Base](https://huggingface.co/princeton-nlp/Llama-3-8B-ProLong-512k-Base) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-8B-ProLong-512k-Base-details) | princeton-nlp/Llama-3-8B-ProLong-512k-Base | 51a333f7c99f5052377154b76909dfe63ff7ab83 | 21.679045 | llama3 | 9 | 8.03 | true | false | false | true | 1.757328 | 0.532212 | 53.221231 | 0.503321 | 29.847246 | 0.068731 | 6.873112 | 0.261745 | 1.565996 | 0.422271 | 12.683854 | 0.332945 | 25.882831 | false | false | 2024-08-22 | 2024-10-16 | 1 | princeton-nlp/Llama-3-8B-ProLong-512k-Base (Merge) |
princeton-nlp_Llama-3-8B-ProLong-512k-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-8B-ProLong-512k-Instruct](https://huggingface.co/princeton-nlp/Llama-3-8B-ProLong-512k-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-8B-ProLong-512k-Instruct-details) | princeton-nlp/Llama-3-8B-ProLong-512k-Instruct | eae0626e8597575215276c2b248720f731bc50b8 | 21.942344 | llama3 | 20 | 8.03 | true | false | false | true | 2.344706 | 0.550822 | 55.082182 | 0.502831 | 29.151153 | 0.05287 | 5.287009 | 0.286074 | 4.809843 | 0.426646 | 12.530729 | 0.323138 | 24.793144 | false | false | 2024-08-22 | 2024-11-16 | 1 | princeton-nlp/Llama-3-8B-ProLong-512k-Instruct (Merge) |
princeton-nlp_Llama-3-8B-ProLong-512k-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-8B-ProLong-512k-Instruct](https://huggingface.co/princeton-nlp/Llama-3-8B-ProLong-512k-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-8B-ProLong-512k-Instruct-details) | princeton-nlp/Llama-3-8B-ProLong-512k-Instruct | bf92e493b7b0ef1db0242bfa97f1d8f92be02e9c | 19.242002 | llama3 | 20 | 8.03 | true | false | false | false | 1.448747 | 0.397773 | 39.777346 | 0.498303 | 28.669219 | 0.058157 | 5.81571 | 0.28104 | 4.138702 | 0.425 | 12.091667 | 0.324634 | 24.959368 | false | false | 2024-08-22 | 2024-11-16 | 1 | princeton-nlp/Llama-3-8B-ProLong-512k-Instruct (Merge) |
princeton-nlp_Llama-3-8B-ProLong-64k-Base_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-8B-ProLong-64k-Base](https://huggingface.co/princeton-nlp/Llama-3-8B-ProLong-64k-Base) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-8B-ProLong-64k-Base-details) | princeton-nlp/Llama-3-8B-ProLong-64k-Base | 97994d6918f80162a893e22d5e7bba586551f941 | 21.652198 | llama3 | 5 | 8.03 | true | false | false | true | 2.714775 | 0.520072 | 52.00723 | 0.492713 | 28.687899 | 0.064955 | 6.495468 | 0.265101 | 2.013423 | 0.434052 | 14.623177 | 0.334774 | 26.085993 | false | false | 2024-07-22 | 2024-10-16 | 1 | princeton-nlp/Llama-3-8B-ProLong-64k-Base (Merge) |
princeton-nlp_Llama-3-8B-ProLong-64k-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-8B-ProLong-64k-Instruct](https://huggingface.co/princeton-nlp/Llama-3-8B-ProLong-64k-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-8B-ProLong-64k-Instruct-details) | princeton-nlp/Llama-3-8B-ProLong-64k-Instruct | fe55aed18544c5744239e473bb0d3aa0151776d3 | 23.020992 | llama3 | 13 | 8.03 | true | false | false | true | 2.41788 | 0.556317 | 55.631724 | 0.508304 | 30.089572 | 0.064955 | 6.495468 | 0.295302 | 6.040268 | 0.439698 | 14.595573 | 0.32746 | 25.273345 | false | false | 2024-07-21 | 2024-10-16 | 1 | princeton-nlp/Llama-3-8B-ProLong-64k-Instruct (Merge) |
princeton-nlp_Llama-3-Base-8B-SFT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Base-8B-SFT](https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-details) | princeton-nlp/Llama-3-Base-8B-SFT | b622b7d814aa03aa722328bf88feaf1ad480b7fb | 15.964206 | | 2 | 8.03 | true | false | false | true | 2.62091 | 0.279596 | 27.959592 | 0.464304 | 24.345967 | 0.04003 | 4.003021 | 0.297819 | 6.375839 | 0.411792 | 9.840625 | 0.309342 | 23.260195 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT |
princeton-nlp_Llama-3-Base-8B-SFT-CPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Base-8B-SFT-CPO](https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-CPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-CPO-details) | princeton-nlp/Llama-3-Base-8B-SFT-CPO | 536ce7e7beb35175c48538fe46e7e9e100f228c9 | 15.953789 | | 0 | 8.03 | true | false | false | true | 1.935692 | 0.370346 | 37.034624 | 0.459488 | 25.474649 | 0.054381 | 5.438066 | 0.274329 | 3.243848 | 0.360854 | 2.573438 | 0.297623 | 21.958112 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-CPO |
princeton-nlp_Llama-3-Base-8B-SFT-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Base-8B-SFT-DPO](https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-DPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-DPO-details) | princeton-nlp/Llama-3-Base-8B-SFT-DPO | 3f5ec47c9beffb37cfbdcd837e76a336a9b1e651 | 18.376219 | | 0 | 8.03 | true | false | false | true | 1.85268 | 0.411113 | 41.111251 | 0.466585 | 26.001874 | 0.041541 | 4.154079 | 0.310403 | 8.053691 | 0.38674 | 7.842448 | 0.307846 | 23.093972 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-DPO |
princeton-nlp_Llama-3-Base-8B-SFT-IPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Base-8B-SFT-IPO](https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-IPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-IPO-details) | princeton-nlp/Llama-3-Base-8B-SFT-IPO | 85055cc4b9c707e0bd1239d20d1f62927a7a54c3 | 18.722473 | | 0 | 8.03 | true | false | false | true | 1.864382 | 0.448656 | 44.865623 | 0.469007 | 25.705433 | 0.039275 | 3.927492 | 0.297819 | 6.375839 | 0.391948 | 7.960156 | 0.311503 | 23.500296 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-IPO |
princeton-nlp_Llama-3-Base-8B-SFT-KTO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Base-8B-SFT-KTO](https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-KTO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-KTO-details) | princeton-nlp/Llama-3-Base-8B-SFT-KTO | 49a8c2e5ccc7a28ed7bbedf093e352015fc1eb9b | 18.644616 | | 0 | 8.03 | true | false | false | true | 1.723704 | 0.452253 | 45.225335 | 0.469285 | 25.55523 | 0.05287 | 5.287009 | 0.305369 | 7.38255 | 0.384198 | 5.591406 | 0.305436 | 22.826167 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-KTO |
princeton-nlp_Llama-3-Base-8B-SFT-ORPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Base-8B-SFT-ORPO](https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-ORPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-ORPO-details) | princeton-nlp/Llama-3-Base-8B-SFT-ORPO | 54d58402e0168faff6503e41621ad6c8274a310a | 19.268326 | | 0 | 8.03 | true | false | false | true | 1.813126 | 0.451654 | 45.165383 | 0.473406 | 26.485894 | 0.046828 | 4.682779 | 0.313758 | 8.501119 | 0.370677 | 7.634635 | 0.308261 | 23.140145 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-ORPO |
princeton-nlp_Llama-3-Base-8B-SFT-RDPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Base-8B-SFT-RDPO](https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-RDPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-RDPO-details) | princeton-nlp/Llama-3-Base-8B-SFT-RDPO | b41a964c2135ba34dcc6fa7edf76b6b9ea656949 | 19.142302 | | 0 | 8.03 | true | false | false | true | 1.804871 | 0.448007 | 44.800684 | 0.466201 | 25.526521 | 0.057402 | 5.740181 | 0.306208 | 7.494407 | 0.40274 | 8.909115 | 0.301446 | 22.382905 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-RDPO |
princeton-nlp_Llama-3-Base-8B-SFT-RRHF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Base-8B-SFT-RRHF](https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-RRHF) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-RRHF-details) | princeton-nlp/Llama-3-Base-8B-SFT-RRHF | aea8c04b3940cebd1f8296a2c76914f0ce70c276 | 16.282724 | | 0 | 8.03 | true | false | false | true | 1.902937 | 0.335725 | 33.572477 | 0.452036 | 23.659142 | 0.045317 | 4.531722 | 0.305369 | 7.38255 | 0.372229 | 7.561979 | 0.288896 | 20.988475 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-RRHF |
princeton-nlp_Llama-3-Base-8B-SFT-SLiC-HF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Base-8B-SFT-SLiC-HF](https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-SLiC-HF) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-SLiC-HF-details) | princeton-nlp/Llama-3-Base-8B-SFT-SLiC-HF | 325092c1eddffc3ca7157be1ff9958128e5753ef | 19.743113 | | 0 | 8.03 | true | false | false | true | 1.920942 | 0.489048 | 48.904795 | 0.470408 | 26.373963 | 0.050604 | 5.060423 | 0.286913 | 4.9217 | 0.409094 | 10.270052 | 0.30635 | 22.927748 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-SLiC-HF |
princeton-nlp_Llama-3-Base-8B-SFT-SimPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Base-8B-SFT-SimPO](https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-SimPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-SimPO-details) | princeton-nlp/Llama-3-Base-8B-SFT-SimPO | 0a6e518b13b67abe8433bce3f7beee9beb74a794 | 19.858509 | | 1 | 8.03 | false | false | false | true | 1.72313 | 0.46854 | 46.854014 | 0.474125 | 26.39595 | 0.055136 | 5.513595 | 0.288591 | 5.145414 | 0.412688 | 11.852604 | 0.310505 | 23.38948 | false | false | 2024-05-24 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Base-8B-SFT-SimPO |
princeton-nlp_Llama-3-Instruct-8B-CPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Instruct-8B-CPO](https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-CPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-CPO-details) | princeton-nlp/Llama-3-Instruct-8B-CPO | d4645ae4c3b99892f1c59f60a77330be35567835 | 23.999076 | | 0 | 8.03 | true | false | false | true | 1.478833 | 0.729299 | 72.929937 | 0.499879 | 28.604299 | 0.098943 | 9.89426 | 0.260067 | 1.342282 | 0.351396 | 1.757812 | 0.365193 | 29.465869 | false | false | 2024-07-06 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-CPO |
princeton-nlp_Llama-3-Instruct-8B-CPO-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Instruct-8B-CPO-v0.2](https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-CPO-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-CPO-v0.2-details) | princeton-nlp/Llama-3-Instruct-8B-CPO-v0.2 | 5ed83728712693437bd547f4cd32923ac4e1172d | 24.883955 | | 0 | 8.03 | true | false | false | true | 1.545771 | 0.750582 | 75.058179 | 0.502667 | 29.086407 | 0.108006 | 10.800604 | 0.260906 | 1.454139 | 0.361906 | 2.838281 | 0.370595 | 30.06612 | false | false | 2024-07-06 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-CPO-v0.2 |
princeton-nlp_Llama-3-Instruct-8B-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Instruct-8B-DPO](https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-DPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-DPO-details) | princeton-nlp/Llama-3-Instruct-8B-DPO | 0afbf4c012ec7507f61c554999151b95a3651db3 | 23.49824 | | 0 | 8.03 | true | false | false | true | 1.129708 | 0.675744 | 67.574369 | 0.49913 | 28.507392 | 0.084592 | 8.459215 | 0.271812 | 2.908277 | 0.373813 | 3.926563 | 0.366523 | 29.613623 | false | false | 2024-05-17 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-DPO |
princeton-nlp_Llama-3-Instruct-8B-DPO-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Instruct-8B-DPO-v0.2](https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-DPO-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-DPO-v0.2-details) | princeton-nlp/Llama-3-Instruct-8B-DPO-v0.2 | d06275e02abbeaf29d911a3c0cf22922dcca6b0b | 25.208963 | | 0 | 8.03 | true | false | false | true | 1.20576 | 0.720806 | 72.080635 | 0.50562 | 28.939587 | 0.089879 | 8.987915 | 0.286913 | 4.9217 | 0.384448 | 5.55599 | 0.376912 | 30.767952 | false | false | 2024-07-06 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-DPO-v0.2 |
princeton-nlp_Llama-3-Instruct-8B-KTO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Instruct-8B-KTO](https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-KTO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-KTO-details) | princeton-nlp/Llama-3-Instruct-8B-KTO | e697908201cbab01e0ca54088bb8cd2fd99b4574 | 23.419047 | | 0 | 8.03 | true | false | false | true | 1.205034 | 0.68641 | 68.640984 | 0.49819 | 28.649658 | 0.072508 | 7.250755 | 0.276007 | 3.467562 | 0.369844 | 3.630469 | 0.359874 | 28.874852 | false | false | 2024-05-17 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-KTO |
princeton-nlp_Llama-3-Instruct-8B-KTO-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Instruct-8B-KTO-v0.2](https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-KTO-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-KTO-v0.2-details) | princeton-nlp/Llama-3-Instruct-8B-KTO-v0.2 | 477d33ea62ed57a0429517170612aa1df21c78d6 | 24.65939 | | 0 | 8.03 | true | false | false | true | 1.261073 | 0.729025 | 72.902454 | 0.507977 | 29.648406 | 0.099698 | 9.969789 | 0.260067 | 1.342282 | 0.37775 | 4.452083 | 0.366772 | 29.641327 | false | false | 2024-07-06 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-KTO-v0.2 |
princeton-nlp_Llama-3-Instruct-8B-ORPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Instruct-8B-ORPO](https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-ORPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-ORPO-details) | princeton-nlp/Llama-3-Instruct-8B-ORPO | 4bb3ffcf9ede48cb01a10bf3223eb41b59aa3fef | 23.622592 | | 0 | 8.03 | true | false | false | true | 1.247808 | 0.712813 | 71.281311 | 0.500121 | 28.839356 | 0.07855 | 7.854985 | 0.258389 | 1.118568 | 0.350188 | 3.240104 | 0.364611 | 29.401226 | false | false | 2024-05-17 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-ORPO |
princeton-nlp_Llama-3-Instruct-8B-ORPO-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Instruct-8B-ORPO-v0.2](https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-ORPO-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-ORPO-v0.2-details) | princeton-nlp/Llama-3-Instruct-8B-ORPO-v0.2 | 3ea5c542a3d8d61f6afb6cdbef5972a501ddf759 | 25.966145 | | 1 | 8.03 | true | false | false | true | 1.188465 | 0.763321 | 76.332132 | 0.507835 | 29.604837 | 0.101964 | 10.196375 | 0.283557 | 4.474273 | 0.377969 | 4.846094 | 0.373088 | 30.343159 | false | false | 2024-07-06 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-ORPO-v0.2 |
princeton-nlp_Llama-3-Instruct-8B-RDPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Instruct-8B-RDPO](https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-RDPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-RDPO-details) | princeton-nlp/Llama-3-Instruct-8B-RDPO | 9497ca226a68981f42df2e5b3a4a1a2ea702a942 | 23.603754 | | 0 | 8.03 | true | false | false | true | 1.1325 | 0.666002 | 66.600176 | 0.503363 | 29.032479 | 0.084592 | 8.459215 | 0.282718 | 4.362416 | 0.375208 | 4.201042 | 0.360705 | 28.967199 | false | false | 2024-05-17 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-RDPO |
princeton-nlp_Llama-3-Instruct-8B-RDPO-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Instruct-8B-RDPO-v0.2](https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-RDPO-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-RDPO-v0.2-details) | princeton-nlp/Llama-3-Instruct-8B-RDPO-v0.2 | 4e5bc9779cba3a2f615379d3f8ef1bbb3ea487f7 | 25.032225 | | 1 | 8.03 | true | false | false | true | 1.115896 | 0.707692 | 70.769226 | 0.504922 | 28.854277 | 0.086858 | 8.685801 | 0.292785 | 5.704698 | 0.380448 | 5.35599 | 0.37741 | 30.82336 | false | false | 2024-07-06 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-RDPO-v0.2 |
princeton-nlp_Llama-3-Instruct-8B-RRHF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Instruct-8B-RRHF](https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-RRHF) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-RRHF-details) | princeton-nlp/Llama-3-Instruct-8B-RRHF | 73561d9b0fd42b94250246f8d794251fe9f9d2e9 | 24.084494 | | 0 | 8.03 | true | false | false | true | 1.278431 | 0.727451 | 72.745094 | 0.491055 | 27.216485 | 0.096677 | 9.667674 | 0.280201 | 4.026846 | 0.347552 | 1.477344 | 0.364362 | 29.373522 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Instruct-8B-RRHF |
princeton-nlp_Llama-3-Instruct-8B-RRHF-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2](https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-RRHF-v0.2-details) | princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2 | 81191fbb214d17f0a4fec247da5d648f4cb61ef1 | 23.753751 | | 0 | 8.03 | true | false | false | true | 1.011747 | 0.712488 | 71.248842 | 0.49839 | 28.498724 | 0.087613 | 8.761329 | 0.260067 | 1.342282 | 0.373781 | 5.089323 | 0.348238 | 27.582004 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2 |
princeton-nlp_Llama-3-Instruct-8B-SLiC-HF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Instruct-8B-SLiC-HF](https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-SLiC-HF) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-SLiC-HF-details) | princeton-nlp/Llama-3-Instruct-8B-SLiC-HF | 7e9001f6f4fe940c363bb7ea1814d33c79b21737 | 25.308144 | | 0 | 8.03 | true | false | false | true | 1.450385 | 0.739966 | 73.996551 | 0.502942 | 29.211612 | 0.097432 | 9.743202 | 0.286074 | 4.809843 | 0.372292 | 5.369792 | 0.358461 | 28.717863 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Instruct-8B-SLiC-HF |
princeton-nlp_Llama-3-Instruct-8B-SLiC-HF-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2](https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-SLiC-HF-v0.2-details) | princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2 | 1821cc42189d8dab9e157c31b223dc60fc037c2d | 23.728355 | | 0 | 8.03 | true | false | false | true | 1.042479 | 0.710965 | 71.096468 | 0.49839 | 28.498724 | 0.087613 | 8.761329 | 0.260067 | 1.342282 | 0.373781 | 5.089323 | 0.348238 | 27.582004 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2 |
princeton-nlp_Llama-3-Instruct-8B-SimPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Instruct-8B-SimPO](https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-SimPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-SimPO-details) | princeton-nlp/Llama-3-Instruct-8B-SimPO | f700cb6afb4509b10dea43ab72bb0e260e166be4 | 23.664165 | | 58 | 8.03 | true | false | false | true | 1.066691 | 0.65039 | 65.038985 | 0.484468 | 26.709133 | 0.086103 | 8.610272 | 0.293624 | 5.816555 | 0.394833 | 8.154167 | 0.348903 | 27.655881 | false | false | 2024-05-17 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-SimPO |
princeton-nlp_Llama-3-Instruct-8B-SimPO-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2](https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-SimPO-v0.2-details) | princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2 | 9ac0fbee445e7755e50520e9881d67588b4b854c | 24.75154 | | 6 | 8.03 | true | false | false | true | 1.159963 | 0.680865 | 68.086455 | 0.503834 | 29.214022 | 0.074018 | 7.401813 | 0.301174 | 6.823266 | 0.398802 | 7.85026 | 0.362201 | 29.133422 | false | false | 2024-07-06 | 2024-09-28 | 0 | princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2 |
princeton-nlp_Mistral-7B-Base-SFT-CPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [princeton-nlp/Mistral-7B-Base-SFT-CPO](https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-CPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-CPO-details) | princeton-nlp/Mistral-7B-Base-SFT-CPO | 7f67394668b94a9ddfb64daff8976b48b135d96c | 17.39897 | | 1 | 7.242 | true | false | false | true | 1.619538 | 0.465493 | 46.549267 | 0.438215 | 21.857696 | 0.027946 | 2.794562 | 0.291946 | 5.592841 | 0.407083 | 9.252083 | 0.265126 | 18.34737 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-CPO |
princeton-nlp_Mistral-7B-Base-SFT-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [princeton-nlp/Mistral-7B-Base-SFT-DPO](https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-DPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-DPO-details) | princeton-nlp/Mistral-7B-Base-SFT-DPO | 17134fd80cfbf3980353967a30dc6f450f18f78f | 16.311854 | | 0 | 7.242 | true | false | false | true | 1.335239 | 0.440338 | 44.03383 | 0.435011 | 20.79098 | 0.021148 | 2.114804 | 0.272651 | 3.020134 | 0.412229 | 9.628646 | 0.264545 | 18.282728 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-DPO |
princeton-nlp_Mistral-7B-Base-SFT-IPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [princeton-nlp/Mistral-7B-Base-SFT-IPO](https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-IPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-IPO-details) | princeton-nlp/Mistral-7B-Base-SFT-IPO | eea781724e4d2ab8bdda7c13526f042de4cfae41 | 17.273368 | | 0 | 7.242 | true | false | false | true | 1.334669 | 0.482953 | 48.295301 | 0.445802 | 23.703491 | 0.028701 | 2.870091 | 0.280201 | 4.026846 | 0.377625 | 4.836458 | 0.279172 | 19.908023 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-IPO |
princeton-nlp_Mistral-7B-Base-SFT-KTO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [princeton-nlp/Mistral-7B-Base-SFT-KTO](https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-KTO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-KTO-details) | princeton-nlp/Mistral-7B-Base-SFT-KTO | 02148bb9241b0f4bb0c75e93893eed005abe25e8 | 19.012992 | | 0 | 7.242 | true | false | false | true | 1.332033 | 0.478482 | 47.848154 | 0.447643 | 23.107642 | 0.039275 | 3.927492 | 0.290268 | 5.369128 | 0.436781 | 13.03099 | 0.287151 | 20.794548 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-KTO |
princeton-nlp_Mistral-7B-Base-SFT-RDPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [princeton-nlp/Mistral-7B-Base-SFT-RDPO](https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-RDPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-RDPO-details) | princeton-nlp/Mistral-7B-Base-SFT-RDPO | 2a63a6d9e1978c99444e440371268f7c2b7e0375 | 16.490934 | | 0 | 7.242 | true | false | false | true | 1.32501 | 0.460647 | 46.064664 | 0.443953 | 22.98201 | 0.021903 | 2.190332 | 0.277685 | 3.691275 | 0.357938 | 4.275521 | 0.277676 | 19.7418 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-RDPO |
princeton-nlp_Mistral-7B-Base-SFT-RRHF_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-RRHF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-RRHF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-RRHF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
|
princeton-nlp/Mistral-7B-Base-SFT-RRHF
|
0d5861072e9d01f420451bf6a5b108bc8d3a76bc
| 16.182025 | 0 | 7.242 | true | false | false | true | 1.338002 | 0.440663 | 44.0663 | 0.428059 | 19.598831 | 0.024924 | 2.492447 | 0.290268 | 5.369128 | 0.418677 | 10.034635 | 0.239777 | 15.530807 | false | false |
2024-07-06
|
2024-10-07
| 0 |
princeton-nlp/Mistral-7B-Base-SFT-RRHF
|
|
princeton-nlp_Mistral-7B-Base-SFT-SLiC-HF_bfloat16
|
bfloat16
|
💬 chat models (RLHF, DPO, IFT, ...)
|
💬
|
Original
|
MistralForCausalLM
|
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-SLiC-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF | 65d2cc49ad05258da3d982b39682c7f672f5e4ab
| 19.005886 | 0 | 7.242 | true | false | false | true | 1.336884 | 0.512728 | 51.272845 | 0.44224 | 22.304723 | 0.035498 | 3.549849 | 0.291946 | 5.592841 | 0.426083 | 11.527083 | 0.278092 | 19.787973 | false | false |
2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF |
princeton-nlp_Mistral-7B-Base-SFT-SimPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM |
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| princeton-nlp/Mistral-7B-Base-SFT-SimPO | 9d9e8b8de4f673d45bc826efc4a1444f9d480222
| 17.032015 | 0 | 7.242 | true | false | false | true | 1.271413 | 0.470064 | 47.006387 | 0.439805 | 22.332886 | 0.01435 | 1.435045 | 0.283557 | 4.474273 | 0.397063 | 8.032813 | 0.270196 | 18.910683 | false | false |
2024-05-17 | 2024-09-21 | 0 | princeton-nlp/Mistral-7B-Base-SFT-SimPO |
princeton-nlp_Mistral-7B-Instruct-CPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM |
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-CPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-CPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-CPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| princeton-nlp/Mistral-7B-Instruct-CPO | 32492f8e5588f06005689ac944c2ea39c394c28e
| 15.540359 | 0 | 7.242 | true | false | false | true | 1.291845 | 0.420305 | 42.030479 | 0.406922 | 17.248538 | 0.020393 | 2.039275 | 0.26594 | 2.12528 | 0.417844 | 10.897135 | 0.270113 | 18.901448 | false | false |
2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-CPO |
princeton-nlp_Mistral-7B-Instruct-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM |
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| princeton-nlp/Mistral-7B-Instruct-DPO | 5e96cff70d8db87cf17c616429c17c8dc9352543
| 16.562196 | 0 | 7.242 | true | false | false | true | 1.210533 | 0.517624 | 51.762435 | 0.406036 | 16.875389 | 0.030967 | 3.096677 | 0.268456 | 2.46085 | 0.383333 | 5.75 | 0.27485 | 19.427822 | false | false |
2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-DPO |
princeton-nlp_Mistral-7B-Instruct-IPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM |
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-IPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-IPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-IPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| princeton-nlp/Mistral-7B-Instruct-IPO | 32ad99c6e7231bbe8ebd9d24b28e084c60848558
| 17.719684 | 0 | 7.242 | true | false | false | true | 1.251495 | 0.49292 | 49.29199 | 0.432218 | 20.09411 | 0.020393 | 2.039275 | 0.27349 | 3.131991 | 0.432417 | 12.785417 | 0.270778 | 18.975325 | false | false |
2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-IPO |
princeton-nlp_Mistral-7B-Instruct-KTO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM |
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| princeton-nlp/Mistral-7B-Instruct-KTO | 834422e5b9b9eee6aac2f8d4822b925a6574d628
| 16.702592 | 0 | 7.242 | true | false | false | true | 1.206756 | 0.490797 | 49.079664 | 0.413959 | 17.812648 | 0.026435 | 2.643505 | 0.27349 | 3.131991 | 0.395271 | 7.408854 | 0.28125 | 20.138889 | false | false |
2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-KTO |
princeton-nlp_Mistral-7B-Instruct-ORPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM |
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| princeton-nlp/Mistral-7B-Instruct-ORPO | 69c0481f4100629a49ae73f760ddbb61d8e98e48
| 16.088293 | 0 | 7.242 | true | false | false | true | 1.248593 | 0.471962 | 47.196217 | 0.410406 | 18.038373 | 0.029456 | 2.945619 | 0.274329 | 3.243848 | 0.39124 | 6.638281 | 0.266207 | 18.46742 | false | false |
2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-ORPO |
princeton-nlp_Mistral-7B-Instruct-RDPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM |
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-RDPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-RDPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-RDPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| princeton-nlp/Mistral-7B-Instruct-RDPO | 23ec6ab4f996134eb15c19322dabb34d7332d7cd
| 16.433079 | 0 | 7.242 | true | false | false | true | 1.221231 | 0.488723 | 48.872325 | 0.405015 | 17.048388 | 0.024924 | 2.492447 | 0.280201 | 4.026846 | 0.387333 | 6.416667 | 0.277676 | 19.7418 | false | false |
2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-RDPO |
princeton-nlp_Mistral-7B-Instruct-RRHF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM |
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-RRHF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-RRHF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-RRHF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| princeton-nlp/Mistral-7B-Instruct-RRHF | 493d3ceb571232fe3b2f55c0bf78692760f4fc7e
| 16.892024 | 0 | 7.242 | true | false | false | true | 1.175503 | 0.496017 | 49.601723 | 0.418977 | 19.206552 | 0.027946 | 2.794562 | 0.276007 | 3.467562 | 0.397875 | 7.934375 | 0.265126 | 18.34737 | false | false |
2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-RRHF |
princeton-nlp_Mistral-7B-Instruct-SLiC-HF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM |
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-SLiC-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-SLiC-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-SLiC-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| princeton-nlp/Mistral-7B-Instruct-SLiC-HF | 3d08c8b7c3e73beb2a3264848f17246b74c3d162
| 16.389144 | 0 | 7.242 | true | false | false | true | 1.244906 | 0.511529 | 51.152941 | 0.404001 | 16.653429 | 0.017372 | 1.73716 | 0.272651 | 3.020134 | 0.391302 | 6.71276 | 0.271526 | 19.058437 | false | false |
2024-07-06 | 2024-10-16 | 0 | princeton-nlp/Mistral-7B-Instruct-SLiC-HF |
princeton-nlp_Mistral-7B-Instruct-SimPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM |
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| princeton-nlp/Mistral-7B-Instruct-SimPO | 03191ee1e60d21a698d11a515703a037073724f8
| 17.607316 | 2 | 7.242 | false | false | false | true | 1.141125 | 0.46869 | 46.868974 | 0.450723 | 22.382277 | 0.028701 | 2.870091 | 0.278523 | 3.803132 | 0.409781 | 9.75599 | 0.279671 | 19.963431 | false | false |
2024-05-24 | 2024-09-21 | 0 | princeton-nlp/Mistral-7B-Instruct-SimPO |
princeton-nlp_Sheared-LLaMA-1.3B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM |
<a target="_blank" href="https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Sheared-LLaMA-1.3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Sheared-LLaMA-1.3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| princeton-nlp/Sheared-LLaMA-1.3B | a4b76938edbf571ea7d7d9904861cbdca08809b4 | 5.580926 | apache-2.0
| 94 | 1.3 | true | false | false | false | 0.7092 | 0.21977 | 21.977021 | 0.319705 | 4.74463 | 0.01284 | 1.283988 | 0.239933 | 0 | 0.371302 | 3.579427 | 0.117104 | 1.900488 | false | false |
2023-10-10 | 2024-07-29 | 0 | princeton-nlp/Sheared-LLaMA-1.3B |
princeton-nlp_Sheared-LLaMA-2.7B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM |
<a target="_blank" href="https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Sheared-LLaMA-2.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Sheared-LLaMA-2.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| princeton-nlp/Sheared-LLaMA-2.7B | 2f157a0306b75d37694ae05f6a4067220254d540 | 6.43792 | apache-2.0
| 60 | 2.7 | true | false | false | false | 0.9401 | 0.241652 | 24.165215 | 0.325869 | 5.655521 | 0.01284 | 1.283988 | 0.275168 | 3.355705 | 0.356729 | 2.091146 | 0.118684 | 2.075946 | false | false |
2023-10-10 | 2024-07-29 | 0 | princeton-nlp/Sheared-LLaMA-2.7B |
princeton-nlp_gemma-2-9b-it-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM |
<a target="_blank" href="https://huggingface.co/princeton-nlp/gemma-2-9b-it-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/gemma-2-9b-it-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__gemma-2-9b-it-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| princeton-nlp/gemma-2-9b-it-DPO | f646c99fc3aa7afc7b22c3c7115fd03a40fc1d22
| 20.818727 | 9 | 9.242 | false | false | false | true | 5.781254 | 0.276872 | 27.687203 | 0.594144 | 41.593654 | 0.083082 | 8.308157 | 0.33557 | 11.409396 | 0.382031 | 5.653906 | 0.37234 | 30.260047 | false | false |
2024-07-16 | 2024-09-19 | 2 | google/gemma-2-9b |
princeton-nlp_gemma-2-9b-it-SimPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM |
<a target="_blank" href="https://huggingface.co/princeton-nlp/gemma-2-9b-it-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/gemma-2-9b-it-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__gemma-2-9b-it-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| princeton-nlp/gemma-2-9b-it-SimPO | 8c87091f412e3aa6f74f66bd86c57fb81cbc3fde | 22.344935 | mit
| 159 | 9.242 | true | false | false | true | 5.538007 | 0.320686 | 32.068578 | 0.583918 | 40.09343 | 0.070997 | 7.099698 | 0.33557 | 11.409396 | 0.412323 | 10.340365 | 0.397523 | 33.058141 | false | false |
2024-07-16 | 2024-08-10 | 2 | google/gemma-2-9b |
prithivMLmods_Bellatrix-1.5B-xElite_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Bellatrix-1.5B-xElite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Bellatrix-1.5B-xElite</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Bellatrix-1.5B-xElite-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Bellatrix-1.5B-xElite | 4ec39cef1bf7701abb30dda694b4918c517d1c0d | 12.22887 | apache-2.0
| 3 | 1.777 | true | false | false | false | 1.199328 | 0.196414 | 19.64144 | 0.35012 | 9.486709 | 0.287009 | 28.700906 | 0.278523 | 3.803132 | 0.361906 | 4.438281 | 0.165725 | 7.302748 | false | false |
2025-01-25 | 2025-01-27 | 1 | prithivMLmods/Bellatrix-1.5B-xElite (Merge) |
prithivMLmods_Bellatrix-Tiny-1.5B-R1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Bellatrix-Tiny-1.5B-R1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Bellatrix-Tiny-1.5B-R1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Bellatrix-Tiny-1.5B-R1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Bellatrix-Tiny-1.5B-R1 | db777568b86dc8aebb654b9167497912e004843e | 14.322565 | apache-2.0
| 2 | 1.544 | true | false | false | false | 1.168961 | 0.335225 | 33.522498 | 0.402217 | 15.85758 | 0.060423 | 6.042296 | 0.298658 | 6.487696 | 0.368292 | 4.569792 | 0.2751 | 19.455526 | false | false |
2025-01-31 | 2025-02-02 | 1 | prithivMLmods/Bellatrix-Tiny-1.5B-R1 (Merge) |
prithivMLmods_Bellatrix-Tiny-1B-v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Bellatrix-Tiny-1B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Bellatrix-Tiny-1B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Bellatrix-Tiny-1B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Bellatrix-Tiny-1B-v2 | d82282c0853688ed16e3b9e121a09d063c566cc5 | 6.033864 | llama3.2
| 2 | 1.236 | true | false | false | false | 0.773732 | 0.150952 | 15.09517 | 0.326768 | 6.032562 | 0.028701 | 2.870091 | 0.272651 | 3.020134 | 0.343021 | 3.710937 | 0.149269 | 5.474291 | false | false |
2025-01-26 | 2025-01-27 | 1 | prithivMLmods/Bellatrix-Tiny-1B-v2 (Merge) |
prithivMLmods_Blaze-14B-xElite_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Blaze-14B-xElite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Blaze-14B-xElite</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Blaze-14B-xElite-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Blaze-14B-xElite | 1795ffecee7322e697edfd0f900c7155ae2878b9 | 29.122992 | llama3.1
| 1 | 14.66 | true | false | false | false | 1.881711 | 0.03632 | 3.63203 | 0.662782 | 51.573264 | 0.369335 | 36.933535 | 0.394295 | 19.239374 | 0.46249 | 17.677865 | 0.511137 | 45.681885 | false | false |
2025-01-28 | 2025-01-28 | 0 | prithivMLmods/Blaze-14B-xElite |
prithivMLmods_COCO-7B-Instruct-1M_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/COCO-7B-Instruct-1M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/COCO-7B-Instruct-1M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__COCO-7B-Instruct-1M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/COCO-7B-Instruct-1M | a8ccc848bd1db0f05172a4e1c2197a0d3b4f25c5 | 28.952308 | apache-2.0
| 4 | 7.616 | true | false | false | false | 1.338104 | 0.47431 | 47.431039 | 0.540996 | 34.677883 | 0.349698 | 34.969789 | 0.307886 | 7.718121 | 0.43824 | 13.513281 | 0.418634 | 35.403738 | false | false |
2025-01-25 | 2025-01-27 | 1 | prithivMLmods/COCO-7B-Instruct-1M (Merge) |
prithivMLmods_Calcium-Opus-14B-Elite_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Elite</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Elite-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Calcium-Opus-14B-Elite | a8661f82079677c777595e4259dbaf5a72c8f134 | 40.077353 | apache-2.0
| 4 | 14.766 | true | false | false | false | 4.024797 | 0.605152 | 60.515211 | 0.631736 | 46.934158 | 0.478852 | 47.885196 | 0.374161 | 16.55481 | 0.485958 | 20.778125 | 0.53017 | 47.796616 | false | false |
2025-01-23 | 2025-01-23 | 1 | prithivMLmods/Calcium-Opus-14B-Elite (Merge) |
prithivMLmods_Calcium-Opus-14B-Elite_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Elite</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Elite-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Calcium-Opus-14B-Elite | a8661f82079677c777595e4259dbaf5a72c8f134 | 38.249365 | apache-2.0
| 4 | 14.766 | true | false | false | false | 2.022333 | 0.606351 | 60.635115 | 0.62959 | 46.532809 | 0.370846 | 37.084592 | 0.373322 | 16.442953 | 0.487323 | 20.948698 | 0.530668 | 47.852024 | false | false |
2025-01-23 | 2025-01-23 | 1 | prithivMLmods/Calcium-Opus-14B-Elite (Merge) |
prithivMLmods_Calcium-Opus-14B-Elite-1M_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite-1M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Elite-1M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Elite-1M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Calcium-Opus-14B-Elite-1M | 07f093df0a87d5d13e4325aa54eb62de9322721c | 37.615179 | apache-2.0
| 4 | 14.77 | true | false | false | false | 3.893609 | 0.561288 | 56.128849 | 0.63294 | 46.935523 | 0.445619 | 44.561934 | 0.352349 | 13.646532 | 0.467604 | 18.283854 | 0.515209 | 46.134382 | false | false |
2025-01-25 | 2025-01-27 | 1 | prithivMLmods/Calcium-Opus-14B-Elite-1M (Merge) |
prithivMLmods_Calcium-Opus-14B-Elite-Stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite-Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Elite-Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Elite-Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Calcium-Opus-14B-Elite-Stock | e3b7fa2d20fa3e7a92bb7a99ad05219c9a86a95d
| 39.739834 | 3 | 14.766 | false | false | false | false | 3.974459 | 0.614295 | 61.429452 | 0.632877 | 46.897899 | 0.466767 | 46.676737 | 0.368289 | 15.771812 | 0.48075 | 20.060417 | 0.528424 | 47.602689 | false | false |
2025-01-25 | 2025-01-25 | 1 | prithivMLmods/Calcium-Opus-14B-Elite-Stock (Merge) |
prithivMLmods_Calcium-Opus-14B-Elite2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Elite2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Elite2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Calcium-Opus-14B-Elite2 | 0d948a368ff62658c06f90219849d8a6be29b78e | 40.249809 | apache-2.0
| 2 | 14.766 | true | false | false | false | 4.025446 | 0.617617 | 61.761681 | 0.631826 | 46.80615 | 0.469033 | 46.903323 | 0.369966 | 15.995526 | 0.493958 | 22.244792 | 0.530086 | 47.787382 | false | false |
2025-01-24 | 2025-01-25 | 1 | prithivMLmods/Calcium-Opus-14B-Elite2 (Merge) |
prithivMLmods_Calcium-Opus-14B-Elite2-R1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite2-R1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Elite2-R1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Elite2-R1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Calcium-Opus-14B-Elite2-R1 | 8d57bcd85bdfe2cb41f0e84ceb7beabcdc1e63fb | 38.563732 | apache-2.0
| 7 | 14.766 | true | false | false | false | 3.67803 | 0.632579 | 63.257933 | 0.636236 | 47.337096 | 0.333837 | 33.383686 | 0.39094 | 18.791946 | 0.48999 | 21.415365 | 0.524767 | 47.196365 | false | false |
2025-02-01 | 2025-02-02 | 1 | prithivMLmods/Calcium-Opus-14B-Elite2-R1 (Merge) |
prithivMLmods_Calcium-Opus-14B-Elite3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Elite3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Elite3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Calcium-Opus-14B-Elite3 | 6be2c8ea522ff941fa1ed5bec18949ac4c3b5651 | 38.803353 | apache-2.0
| 2 | 14.766 | true | false | false | false | 4.02463 | 0.542829 | 54.282858 | 0.63504 | 47.0746 | 0.470544 | 47.054381 | 0.370805 | 16.107383 | 0.479479 | 20.134896 | 0.533494 | 48.166002 | false | false |
2025-01-25 | 2025-01-25 | 1 | prithivMLmods/Calcium-Opus-14B-Elite3 (Merge) |
prithivMLmods_Calcium-Opus-14B-Elite4_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Elite4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Elite4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Elite4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Calcium-Opus-14B-Elite4 | 59525af6aae57e700ff9cd6ce9c6b3257f422f4c | 36.743869 | apache-2.0
| 3 | 14.766 | true | false | false | false | 3.916601 | 0.611197 | 61.119718 | 0.619526 | 45.208475 | 0.362538 | 36.253776 | 0.355705 | 14.09396 | 0.468719 | 17.689844 | 0.514877 | 46.097444 | false | false |
2025-01-25 | 2025-01-25 | 1 | prithivMLmods/Calcium-Opus-14B-Elite4 (Merge) |
prithivMLmods_Calcium-Opus-14B-Merge_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-14B-Merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-14B-Merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-14B-Merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Calcium-Opus-14B-Merge | ceb41ff76990a24d2f4ff29f1c342fcd7322948a
| 38.011161 | 2 | 14.766 | false | false | false | false | 4.138515 | 0.494943 | 49.494342 | 0.631929 | 46.766668 | 0.463746 | 46.374622 | 0.370805 | 16.107383 | 0.486083 | 20.927083 | 0.535572 | 48.396868 | false | false |
2025-01-24 | 2025-01-24 | 1 | prithivMLmods/Calcium-Opus-14B-Merge (Merge) |
prithivMLmods_Calcium-Opus-20B-v1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Calcium-Opus-20B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Calcium-Opus-20B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Calcium-Opus-20B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Calcium-Opus-20B-v1 | 28395429552eb6f22cd3dc8b54cd03e47c6132c9 | 31.041734 | apache-2.0
| 2 | 19.173 | true | false | false | false | 5.47254 | 0.309272 | 30.927162 | 0.599033 | 41.805576 | 0.361782 | 36.178248 | 0.353188 | 13.758389 | 0.494333 | 22.091667 | 0.473404 | 41.489362 | false | false |
2025-01-19 | 2025-01-23 | 1 | prithivMLmods/Calcium-Opus-20B-v1 (Merge) |
prithivMLmods_Codepy-Deepthink-3B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Codepy-Deepthink-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Codepy-Deepthink-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Codepy-Deepthink-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Codepy-Deepthink-3B | 73551f0560645b098ff8293e70ff633bfc72c125 | 17.430765 | creativeml-openrail-m
| 3 | 3.213 | true | false | false | false | 1.211007 | 0.43272 | 43.271963 | 0.425945 | 18.640888 | 0.115559 | 11.555891 | 0.279362 | 3.914989 | 0.331021 | 3.977604 | 0.309009 | 23.223257 | false | false |
2024-12-26 | 2025-01-12 | 1 | prithivMLmods/Codepy-Deepthink-3B (Merge) |
prithivMLmods_Coma-II-14B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Coma-II-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Coma-II-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Coma-II-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Coma-II-14B | 8ff81f7007503d74d8a4d7d076c1aaf70a9e8487 | 39.451469 | apache-2.0
| 4 | 14.766 | true | false | false | false | 3.700583 | 0.416833 | 41.683289 | 0.632071 | 46.891471 | 0.55136 | 55.135952 | 0.400168 | 20.022371 | 0.535104 | 28.088021 | 0.503989 | 44.887707 | false | false |
2025-03-02 | 2025-03-03 | 1 | prithivMLmods/Coma-II-14B (Merge) |
prithivMLmods_Condor-Opus-14B-Exp_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Condor-Opus-14B-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Condor-Opus-14B-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Condor-Opus-14B-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Condor-Opus-14B-Exp | 1da0a33b3d6937f6e494c2d856bd85f4ba19c12b | 37.6172 | apache-2.0
| 2 | 14.77 | true | false | false | false | 1.712769 | 0.404318 | 40.431832 | 0.615422 | 44.077092 | 0.522659 | 52.265861 | 0.391779 | 18.903803 | 0.519385 | 25.423177 | 0.501413 | 44.601433 | false | false |
2025-03-02 | 2025-03-03 | 1 | prithivMLmods/Condor-Opus-14B-Exp (Merge) |
prithivMLmods_Cygnus-II-14B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Cygnus-II-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Cygnus-II-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Cygnus-II-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Cygnus-II-14B | abf660630df7ef04e9e8b4ff74260752bf9501f5 | 40.529488 | apache-2.0
| 2 | 14.766 | true | false | false | false | 2.001976 | 0.618441 | 61.844129 | 0.666057 | 52.140382 | 0.439577 | 43.957704 | 0.387584 | 18.344519 | 0.468844 | 18.105469 | 0.539063 | 48.784722 | false | false |
2025-03-02 | 2025-03-03 | 1 | prithivMLmods/Cygnus-II-14B (Merge) |
prithivMLmods_Deepthink-Llama-3-8B-Preview_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Deepthink-Llama-3-8B-Preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Deepthink-Llama-3-8B-Preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Deepthink-Llama-3-8B-Preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Deepthink-Llama-3-8B-Preview | 9037f9ba590696402412233fabdb0a1d7eb7a714 | 20.957476 | llama3
| 5 | 8.03 | true | false | false | false | 0.7296 | 0.295533 | 29.553252 | 0.466451 | 24.80088 | 0.354985 | 35.498489 | 0.316275 | 8.836689 | 0.370708 | 7.738542 | 0.273853 | 19.317007 | false | false |
2025-02-18 | 2025-03-12 | 1 | prithivMLmods/Deepthink-Llama-3-8B-Preview (Merge) |
prithivMLmods_Deepthink-Reasoning-14B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Deepthink-Reasoning-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Deepthink-Reasoning-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Deepthink-Reasoning-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Deepthink-Reasoning-14B | 08fd00d4ac2bf07766c8bab7e73d17028487d23a | 37.765949 | apache-2.0
| 4 | 14.77 | true | false | false | false | 3.900782 | 0.542354 | 54.235429 | 0.633405 | 47.306257 | 0.422961 | 42.296073 | 0.366611 | 15.548098 | 0.473156 | 19.477865 | 0.529588 | 47.731974 | false | false |
2025-01-20 | 2025-01-22 | 1 | prithivMLmods/Deepthink-Reasoning-14B (Merge) |
prithivMLmods_Deepthink-Reasoning-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Deepthink-Reasoning-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Deepthink-Reasoning-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Deepthink-Reasoning-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Deepthink-Reasoning-7B | 0ccaa3825ded55cf8cfa18f7db53d91848e3733b | 29.122241 | creativeml-openrail-m
| 10 | 7.616 | true | false | false | false | 1.253995 | 0.484002 | 48.400245 | 0.550507 | 35.623731 | 0.334592 | 33.459215 | 0.299497 | 6.599553 | 0.443229 | 13.436979 | 0.434924 | 37.213726 | false | false |
2024-12-28 | 2025-01-09 | 1 | prithivMLmods/Deepthink-Reasoning-7B (Merge) |
prithivMLmods_Dinobot-Opus-14B-Exp_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Dinobot-Opus-14B-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Dinobot-Opus-14B-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Dinobot-Opus-14B-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Dinobot-Opus-14B-Exp | f51cbd56106c3caa88882ab45252a2edec321d40 | 41.765081 | apache-2.0
| 3 | 14.77 | true | false | false | true | 1.704632 | 0.823996 | 82.399589 | 0.637009 | 48.19595 | 0.531722 | 53.172205 | 0.324664 | 9.955257 | 0.426031 | 12.653906 | 0.497922 | 44.213579 | false | false |
2025-02-12 | 2025-02-15 | 1 | prithivMLmods/Dinobot-Opus-14B-Exp (Merge) |
prithivMLmods_Elita-0.1-Distilled-R1-abliterated_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Elita-0.1-Distilled-R1-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Elita-0.1-Distilled-R1-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Elita-0.1-Distilled-R1-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Elita-0.1-Distilled-R1-abliterated | 253ce07bed5dcd928325172cd5c0cb4f7e98e8e6 | 17.399217 | apache-2.0
| 3 | 7.616 | true | false | false | true | 0.773659 | 0.354235 | 35.423454 | 0.382779 | 13.606417 | 0.306647 | 30.664653 | 0.26594 | 2.12528 | 0.365969 | 3.046094 | 0.275765 | 19.529403 | false | false |
2025-02-09 | 2025-02-09 | 1 | prithivMLmods/Elita-0.1-Distilled-R1-abliterated (Merge) |
prithivMLmods_Elita-1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Elita-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Elita-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Elita-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Elita-1 | a304ec55887200703ebb1d0188e7b0fb0b8173de | 36.54537 | apache-2.0
| 2 | 14.766 | true | false | false | false | 3.843754 | 0.490647 | 49.064704 | 0.652041 | 49.928735 | 0.3429 | 34.29003 | 0.375839 | 16.778523 | 0.483417 | 20.527083 | 0.538148 | 48.683141 | false | false |
2025-02-05 | 2025-02-07 | 1 | prithivMLmods/Elita-1 (Merge) |
prithivMLmods_Epimetheus-14B-Axo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Epimetheus-14B-Axo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Epimetheus-14B-Axo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Epimetheus-14B-Axo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Epimetheus-14B-Axo | 206a58dd4edc1c57011840380014f723100b9620 | 39.08056 | apache-2.0
| 2 | 14.766 | true | false | false | false | 1.994603 | 0.554644 | 55.46439 | 0.661334 | 51.455447 | 0.410121 | 41.012085 | 0.392617 | 19.01566 | 0.481958 | 19.711458 | 0.530419 | 47.82432 | false | false |
2025-03-03 | 2025-03-04 | 1 | prithivMLmods/Epimetheus-14B-Axo (Merge) |
prithivMLmods_Equuleus-Opus-14B-Exp_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Equuleus-Opus-14B-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Equuleus-Opus-14B-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Equuleus-Opus-14B-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Equuleus-Opus-14B-Exp | 5c2cacb51ef84468d5fc1de3d786f79f592d0b7c | 42.199751 | apache-2.0
| 2 | 14.766 | true | false | false | false | 1.902015 | 0.700074 | 70.007358 | 0.643377 | 48.616701 | 0.458459 | 45.845921 | 0.386745 | 18.232662 | 0.495167 | 21.895833 | 0.5374 | 48.60003 | false | false |
2025-02-26 | 2025-02-27 | 1 | prithivMLmods/Equuleus-Opus-14B-Exp (Merge) |
prithivMLmods_Eridanus-Opus-14B-r999_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM |
<a target="_blank" href="https://huggingface.co/prithivMLmods/Eridanus-Opus-14B-r999" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Eridanus-Opus-14B-r999</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Eridanus-Opus-14B-r999-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
| prithivMLmods/Eridanus-Opus-14B-r999 | ad0375d28983a476829bc07f61e70dbebbfe6263 | 40.111313 | apache-2.0
| 3 | 14.77 | true | false | false | false | 1.957942 | 0.638575 | 63.857454 | 0.658392 | 51.038335 | 0.385952 | 38.595166 | 0.394295 | 19.239374 | 0.476875 | 19.476042 | 0.536154 | 48.46151 | false | false |
2025-02-28 | 2025-03-01 | 1 | prithivMLmods/Eridanus-Opus-14B-r999 (Merge) |
Subsets and Splits
- Top 100 Official Models <70: identifies the top 100 high-scoring, officially provided models with fewer than 70 billion parameters, giving a quick overview for comparing performance metrics (see the sketch after this list).
- Top 100 Official Models < 2: identifies top-performing models with fewer than 20 billion parameters, highlighting efficiency in smaller models.
- Top 500 Official Models by Score: ranks models by a combined IFEval and MMLU-PRO score, filtered by official providers and parameter count.
- Top 200 Official Models by Score: surfaces high-performing models with fewer than 70 billion parameters, listing their evaluation scores and characteristics for model selection.
- SQL Console for open-llm-leaderboard/contents: identifies top-performing models with fewer than 70 billion parameters, combining two evaluation metrics to reveal the best-balanced options.
- Top 10 Official Leaderboard Models: identifies the top 10 models from official providers with under 13 billion parameters, ordered by their average metric.
- SQL Console for open-llm-leaderboard/contents: filters and ranks models in the 6-8 billion parameter range with the LlamaForCausalLM architecture by their average performance metric.
- SQL Console for open-llm-leaderboard/contents: retrieves entries for chat models that are officially provided, offering a filtered view of the dataset.
- SQL Console for open-llm-leaderboard/contents: retrieves entries marked as "Official Providers", offering basic filtering.
- Top 10 Official Training Data: retrieves a small sample of records from the 'train' split where the "Official Providers" flag is true.
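The first query in this list can also be reproduced outside the hosted SQL console. Below is a minimal Python sketch, assuming the Hugging Face datasets library is installed and that the dump exposes the column labels shown on the leaderboard page ("Official Providers", "#Params (B)", "Average ⬆️", "fullname"); treat the exact labels as assumptions and adjust them if the schema differs.

from datasets import load_dataset

# Load the leaderboard dump (one row per evaluated model, "train" split).
rows = load_dataset("open-llm-leaderboard/contents", split="train")

# Keep officially provided models with fewer than 70B parameters.
# Column names are assumptions taken from the leaderboard page.
official_small = [
    r for r in rows
    if r["Official Providers"] and r["#Params (B)"] < 70
]

# Rank by the leaderboard's average score and keep the top 100.
top_100 = sorted(official_small, key=lambda r: r["Average ⬆️"], reverse=True)[:100]

# Preview the leaders: name, size, average score.
for r in top_100[:5]:
    print(r["fullname"], r["#Params (B)"], r["Average ⬆️"])

Sorting in plain Python keeps the sketch dependency-free; for larger slices the same filter and ordering can be pushed into the SQL console queries described above instead.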