Column schema (36 columns; string columns report either a length range or a class count, numeric columns a min–max range):

| Column | dtype | Range / classes |
|---|---|---|
| eval_name | string | length 12–111 |
| Precision | string | 3 classes |
| Type | string | 6 classes |
| T | string | 6 classes |
| Weight type | string | 3 classes |
| Architecture | string | 59 classes |
| Model | string | length 355–689 |
| fullname | string | length 4–102 |
| Model sha | string | length 0–40 |
| Average ⬆️ | float64 | 1.03–52 |
| Hub License | string | 25 classes |
| Hub ❤️ | int64 | 0–5.96k |
| #Params (B) | float64 | -1–141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.03–107 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.27–0.75 |
| BBH | float64 | 0.81–63.5 |
| MATH Lvl 5 Raw | float64 | 0–0.51 |
| MATH Lvl 5 | float64 | 0–50.7 |
| GPQA Raw | float64 | 0.22–0.44 |
| GPQA | float64 | 0–24.9 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.5 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 457 classes |
| Submission Date | string | 200 classes |
| Generation | int64 | 0–10 |
| Base Model | string | length 4–102 |
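A minimal sketch of how records with this schema can be queried once loaded into plain Python dicts. The two sample rows are transcribed (with a subset of the 36 columns) from the data that follows; the filtering logic mirrors the leaderboard's own ranking by the Average ⬆️ column, but this is an illustrative assumption, not code from the leaderboard itself.

```python
# Two records transcribed from the rows below, keyed by the schema's column names.
rows = [
    {"fullname": "openai-community/gpt2-xl", "Precision": "bfloat16",
     "Type": "🟢 pretrained", "#Params (B)": 1.608,
     "Average ⬆️": 4.980188, "IFEval": 20.385799, "Chat Template": False},
    {"fullname": "openchat/openchat-3.5-0106", "Precision": "bfloat16",
     "Type": "🔶 fine-tuned on domain-specific datasets", "#Params (B)": 7.242,
     "Average ⬆️": 22.658683, "IFEval": 59.513535, "Chat Template": True},
]

# Keep models under 10B parameters, ranked by the leaderboard's Average ⬆️ score.
small = sorted(
    (r for r in rows if r["#Params (B)"] < 10),
    key=lambda r: r["Average ⬆️"],
    reverse=True,
)
print(small[0]["fullname"])  # → openchat/openchat-3.5-0106
```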
One row per evaluated model. Two source columns are folded out as exact duplicates: T repeats the emoji of Type, and Model wraps fullname in an HTML link to `https://huggingface.co/<fullname>` plus a 📑 link to the matching `open-llm-leaderboard/<org>__<model>-details` dataset. The orai-nlp/Llama-eus-8B record has no Hub License value in the source; its cell is left empty.

| eval_name | Precision | Type | Weight type | Architecture | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| openai-community_gpt2-xl_bfloat16 | bfloat16 | 🟢 pretrained | Original | GPT2LMHeadModel | openai-community/gpt2-xl | 15ea56dee5df4983c59b2538573817e1667135e2 | 4.980188 | mit | 321 | 1.608 | true | false | false | false | 0.215314 | 0.203858 | 20.385799 | 0.300858 | 2.580961 | 0.003021 | 0.302115 | 0.258389 | 1.118568 | 0.370958 | 4.036458 | 0.113115 | 1.457225 | false | true | 2022-03-02 | 2024-06-12 | 0 | openai-community/gpt2-xl |
| openbmb_MiniCPM-S-1B-sft-llama-format_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | openbmb/MiniCPM-S-1B-sft-llama-format | 7de07f8895c168a7ee01f624f50c44f6966c9735 | 8.870185 | apache-2.0 | 4 | 1 | true | false | false | true | 0.540037 | 0.332877 | 33.287677 | 0.304931 | 3.898455 | 0.023414 | 2.34139 | 0.270973 | 2.796421 | 0.331677 | 1.359635 | 0.185838 | 9.53753 | false | false | 2024-06-14 | 2024-11-19 | 0 | openbmb/MiniCPM-S-1B-sft-llama-format |
| openchat_openchat-3.5-0106_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | openchat/openchat-3.5-0106 | ff058fda49726ecf4ea53dc1635f917cdb8ba36b | 22.658683 | apache-2.0 | 348 | 7.242 | true | false | false | true | 2.354959 | 0.595135 | 59.513535 | 0.461698 | 24.038711 | 0.074773 | 7.477341 | 0.307886 | 7.718121 | 0.425437 | 11.746354 | 0.329122 | 25.458038 | false | true | 2024-01-07 | 2024-06-27 | 1 | mistralai/Mistral-7B-v0.1 |
| openchat_openchat-3.5-1210_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | MistralForCausalLM | openchat/openchat-3.5-1210 | 801f5459b7577241500785f11c2b026912badd6e | 22.690085 | apache-2.0 | 276 | 7.242 | true | false | false | true | 0.516451 | 0.603678 | 60.367824 | 0.453536 | 23.236297 | 0.076284 | 7.628399 | 0.301174 | 6.823266 | 0.441438 | 14.279688 | 0.314245 | 23.805038 | false | true | 2023-12-12 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
| openchat_openchat-3.6-8b-20240522_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | openchat/openchat-3.6-8b-20240522 | 2264eb98558978f708e88ae52afb78e43b832801 | 22.830377 | llama3 | 151 | 8.03 | true | false | false | true | 3.267332 | 0.534336 | 53.433556 | 0.533841 | 33.232937 | 0.083082 | 8.308157 | 0.317953 | 9.060403 | 0.399854 | 8.181771 | 0.322889 | 24.76544 | false | true | 2024-05-07 | 2024-06-26 | 1 | meta-llama/Meta-Llama-3-8B |
| openchat_openchat_3.5_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | MistralForCausalLM | openchat/openchat_3.5 | 0fc98e324280bc4bf5d2c30ecf7b97b84fb8a19b | 21.648415 | apache-2.0 | 1,119 | 7 | true | false | false | true | 0.501211 | 0.593112 | 59.311183 | 0.442632 | 21.582167 | 0.073263 | 7.326284 | 0.298658 | 6.487696 | 0.422865 | 11.258073 | 0.315326 | 23.925089 | false | true | 2023-10-30 | 2024-06-12 | 0 | openchat/openchat_3.5 |
| openchat_openchat_v3.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | openchat/openchat_v3.2 | acc7ce92558681e749678648189812f15c1465fe | 13.845734 | llama2 | 42 | 13 | true | false | false | false | 5.302455 | 0.298056 | 29.805583 | 0.433056 | 20.323003 | 0.013595 | 1.359517 | 0.270134 | 2.684564 | 0.433625 | 13.103125 | 0.242188 | 15.798611 | false | true | 2023-07-30 | 2024-06-12 | 0 | openchat/openchat_v3.2 |
| openchat_openchat_v3.2_super_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | openchat/openchat_v3.2_super | 9479cc37d43234a57a33628637d1aca0293d745a | 12.848046 | llama2 | 36 | 13 | true | false | false | false | 5.027694 | 0.286191 | 28.619064 | 0.422121 | 19.15354 | 0.016616 | 1.661631 | 0.264262 | 1.901566 | 0.416135 | 9.916927 | 0.24252 | 15.83555 | false | true | 2023-09-04 | 2024-06-12 | 0 | openchat/openchat_v3.2_super |
| orai-nlp_Llama-eus-8B_bfloat16 | bfloat16 | 🟩 continuously pretrained | Original | LlamaForCausalLM | orai-nlp/Llama-eus-8B | 75b5645d222047b517a7a9190922ea1b5382c71f | 13.90599 | | 7 | 8.03 | false | false | false | false | 0.869258 | 0.216123 | 21.612322 | 0.441825 | 20.961371 | 0.044562 | 4.456193 | 0.28943 | 5.257271 | 0.391885 | 8.285677 | 0.305768 | 22.863106 | false | false | 2024-09-04 | 2024-09-30 | 1 | meta-llama/Meta-Llama-3.1-8B |
| oxyapi_oxy-1-small_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | oxyapi/oxy-1-small | 0d100cf65c8574b025b499dd787d8bcbcf678418 | 33.142622 | apache-2.0 | 71 | 14.77 | true | false | false | true | 1.386909 | 0.624461 | 62.446087 | 0.588459 | 41.175447 | 0.182779 | 18.277946 | 0.371644 | 16.219239 | 0.448667 | 16.283333 | 0.500083 | 44.453679 | false | false | 2024-12-01 | 2024-12-02 | 1 | oxyapi/oxy-1-small (Merge) |
| paloalma_ECE-TW3-JRGL-V1_float16 | float16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | paloalma/ECE-TW3-JRGL-V1 | 2f08c7ab9db03b1b9f455c7beee6a41e99aa910e | 30.223413 | apache-2.0 | 1 | 68.977 | true | false | false | false | 6.191694 | 0.553495 | 55.349473 | 0.628367 | 46.697139 | 0.130665 | 13.066465 | 0.347315 | 12.975391 | 0.462083 | 17.460417 | 0.422124 | 35.791593 | true | false | 2024-04-03 | 2024-08-04 | 0 | paloalma/ECE-TW3-JRGL-V1 |
| paloalma_ECE-TW3-JRGL-V2_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | paloalma/ECE-TW3-JRGL-V2 | f2c15045f1a7a7a34540ab18abcee8a566a74ca6 | 25.679422 | apache-2.0 | 0 | 72.288 | true | false | false | false | 12.546249 | 0.225489 | 22.548948 | 0.603099 | 43.173268 | 0.178248 | 17.824773 | 0.331376 | 10.850112 | 0.479323 | 19.815365 | 0.458777 | 39.864066 | true | false | 2024-04-04 | 2024-09-19 | 0 | paloalma/ECE-TW3-JRGL-V2 |
| paloalma_ECE-TW3-JRGL-V5_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | paloalma/ECE-TW3-JRGL-V5 | 4061fa10de22945790cad825f7f4dec96d55b204 | 29.454282 | apache-2.0 | 0 | 72.289 | true | false | false | false | 23.031124 | 0.455251 | 45.525096 | 0.602471 | 43.462514 | 0.181269 | 18.126888 | 0.341443 | 12.192394 | 0.462052 | 16.889844 | 0.464761 | 40.52896 | true | false | 2024-04-11 | 2024-08-30 | 0 | paloalma/ECE-TW3-JRGL-V5 |
| paloalma_Le_Triomphant-ECE-TW3_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | paloalma/Le_Triomphant-ECE-TW3 | f72399253bb3e65c0f55e50461488c098f658a49 | 31.933354 | apache-2.0 | 4 | 72.289 | true | false | false | false | 10.418391 | 0.540206 | 54.020554 | 0.611206 | 44.963294 | 0.191088 | 19.108761 | 0.348993 | 13.199105 | 0.4725 | 18.495833 | 0.476313 | 41.812574 | true | false | 2024-04-01 | 2024-07-25 | 0 | paloalma/Le_Triomphant-ECE-TW3 |
| paloalma_TW3-JRGL-v2_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | paloalma/TW3-JRGL-v2 | aca3f0ba2bfb90038a9e2cd5b486821d4c181b46 | 32.399598 | apache-2.0 | 0 | 72.289 | true | false | false | false | 20.896294 | 0.531613 | 53.161279 | 0.613753 | 45.61111 | 0.175227 | 17.522659 | 0.35906 | 14.541387 | 0.485833 | 20.695833 | 0.485788 | 42.865322 | true | false | 2024-04-01 | 2024-08-29 | 0 | paloalma/TW3-JRGL-v2 |
| pankajmathur_Al_Dente_v1_8b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pankajmathur/Al_Dente_v1_8b | 149d70e04085ecd90510a60f916efc55da1294e7 | 17.237118 | llama3 | 1 | 8.03 | true | false | false | false | 0.908532 | 0.369372 | 36.937215 | 0.483474 | 27.247898 | 0.037009 | 3.700906 | 0.299497 | 6.599553 | 0.398708 | 8.271875 | 0.285987 | 20.665263 | false | false | 2024-06-02 | 2024-06-26 | 0 | pankajmathur/Al_Dente_v1_8b |
| pankajmathur_model_007_13b_v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pankajmathur/model_007_13b_v2 | 2c6ddf25cdb134f22e2543121b5a36b41342a9e2 | 15.868934 | llama2 | 4 | 13 | true | false | false | false | 2.18178 | 0.305649 | 30.564901 | 0.470229 | 25.45442 | 0.01284 | 1.283988 | 0.283557 | 4.474273 | 0.461094 | 17.203385 | 0.246094 | 16.232639 | false | false | 2023-08-12 | 2024-06-26 | 0 | pankajmathur/model_007_13b_v2 |
| pankajmathur_orca_mini_3b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pankajmathur/orca_mini_3b | 31e1a7bc3f7ea2f247b432d60036d975b8d590e9 | 3.074923 | cc-by-nc-sa-4.0 | 159 | 3.426 | true | false | false | false | 0.524776 | 0.074214 | 7.42142 | 0.319607 | 4.685985 | 0.005287 | 0.528701 | 0.245805 | 0 | 0.334927 | 4.199219 | 0.114528 | 1.614214 | false | false | 2023-06-22 | 2024-06-26 | 0 | pankajmathur/orca_mini_3b |
| pankajmathur_orca_mini_v2_7b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pankajmathur/orca_mini_v2_7b | 66d3f32a4a6bca0a2a261f1bdb54d2582028f75f | 5.502369 | cc-by-nc-sa-4.0 | 37 | 7 | true | false | false | false | 0.592511 | 0.135789 | 13.57886 | 0.353634 | 10.199953 | 0.011329 | 1.132931 | 0.249161 | 0 | 0.359333 | 2.083333 | 0.154172 | 6.019134 | false | false | 2023-07-03 | 2024-06-26 | 0 | pankajmathur/orca_mini_v2_7b |
| pankajmathur_orca_mini_v3_13b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pankajmathur/orca_mini_v3_13b | 7d6e567d24ce2f228beaf54e89c17b0e750bfe99 | 15.016121 | other | 31 | 13 | true | false | false | false | 1.097179 | 0.289663 | 28.966254 | 0.471097 | 25.549482 | 0.019637 | 1.963746 | 0.265101 | 2.013423 | 0.459792 | 17.107292 | 0.230469 | 14.496528 | false | false | 2023-08-09 | 2024-06-26 | 0 | pankajmathur/orca_mini_v3_13b |
| pankajmathur_orca_mini_v3_70b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pankajmathur/orca_mini_v3_70b | e8e856dfb5c737d1906b50f9e65fd3a4f8d77422 | 25.298159 | other | 23 | 70 | true | false | false | false | 6.406537 | 0.40147 | 40.147032 | 0.594931 | 42.975787 | 0.03852 | 3.851964 | 0.317953 | 9.060403 | 0.507854 | 25.115104 | 0.375748 | 30.638667 | false | false | 2023-08-10 | 2024-06-26 | 0 | pankajmathur/orca_mini_v3_70b |
| pankajmathur_orca_mini_v3_7b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pankajmathur/orca_mini_v3_7b | 6252eb7ca29da8d951ae7d2bca948bf84e04a2b9 | 13.51814 | other | 40 | 7 | true | false | false | false | 0.63995 | 0.282094 | 28.209373 | 0.409533 | 17.843956 | 0.003021 | 0.302115 | 0.246644 | 0 | 0.49824 | 22.713281 | 0.208361 | 12.040115 | false | false | 2023-08-07 | 2024-06-26 | 0 | pankajmathur/orca_mini_v3_7b |
| pankajmathur_orca_mini_v5_8b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pankajmathur/orca_mini_v5_8b | f57c84d4cc0b3b74549458c0d38e868bd7fffad1 | 20.246538 | llama3 | 2 | 8.03 | true | false | false | false | 0.878971 | 0.480605 | 48.06048 | 0.506424 | 29.345795 | 0.083837 | 8.383686 | 0.286913 | 4.9217 | 0.40001 | 7.701302 | 0.307596 | 23.066268 | false | false | 2024-05-26 | 2024-06-26 | 0 | pankajmathur/orca_mini_v5_8b |
| pankajmathur_orca_mini_v5_8b_dpo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pankajmathur/orca_mini_v5_8b_dpo | fdc0d0aaa85a58f1abaf2c24ce0ddca10c08f0f1 | 20.057268 | llama3 | 2 | 8 | true | false | false | false | 0.81669 | 0.489647 | 48.964747 | 0.50746 | 29.605373 | 0.080816 | 8.081571 | 0.274329 | 3.243848 | 0.389375 | 6.938542 | 0.311586 | 23.50953 | false | false | 2024-05-30 | 2024-06-26 | 0 | pankajmathur/orca_mini_v5_8b_dpo |
| pankajmathur_orca_mini_v5_8b_orpo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pankajmathur/orca_mini_v5_8b_orpo | 4cdc018043ef439f15bd8a09c4f09c6bc528dfc7 | 12.968554 | llama3 | 1 | 8 | true | false | false | false | 0.97185 | 0.082432 | 8.243239 | 0.496374 | 27.877628 | 0.064955 | 6.495468 | 0.284396 | 4.58613 | 0.413125 | 8.973958 | 0.294714 | 21.6349 | false | false | 2024-05-31 | 2024-06-26 | 0 | pankajmathur/orca_mini_v5_8b_orpo |
| pankajmathur_orca_mini_v6_8b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pankajmathur/orca_mini_v6_8b | e95dc8e4c6b6ca5957b657cc2d905683142eaf3e | 1.413398 | llama3 | 1 | 8.03 | true | false | false | true | 1.216368 | 0.011116 | 1.111606 | 0.30287 | 3.21981 | 0 | 0 | 0.238255 | 0 | 0.355458 | 2.765625 | 0.11245 | 1.383348 | false | false | 2024-06-02 | 2024-06-26 | 0 | pankajmathur/orca_mini_v6_8b |
| pankajmathur_orca_mini_v6_8b_dpo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pankajmathur/orca_mini_v6_8b_dpo | ebb11b63839d38e8c03c7ecac012e047fcb2346e | 20.392492 | llama3 | 1 | 8 | true | false | false | false | 0.769824 | 0.388256 | 38.825649 | 0.520281 | 32.478826 | 0.061178 | 6.117825 | 0.301174 | 6.823266 | 0.409031 | 9.26224 | 0.359624 | 28.847148 | false | false | 2024-06-21 | 2024-06-26 | 0 | pankajmathur/orca_mini_v6_8b_dpo |
| pankajmathur_orca_mini_v7_72b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | Qwen2ForCausalLM | pankajmathur/orca_mini_v7_72b | 447f11912cfa496e32e188a55214043a05760d3a | 39.387496 | apache-2.0 | 11 | 72.706 | true | false | false | true | 14.051707 | 0.592962 | 59.296223 | 0.68423 | 55.055523 | 0.283988 | 28.398792 | 0.385067 | 18.008949 | 0.507042 | 24.213542 | 0.562168 | 51.35195 | false | false | 2024-06-26 | 2025-01-02 | 0 | pankajmathur/orca_mini_v7_72b |
| pankajmathur_orca_mini_v7_7b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | pankajmathur/orca_mini_v7_7b | f5e84ff6ea25fb4585908ea45d1520bac416d803 | 22.425578 | apache-2.0 | 2 | 7.616 | true | false | false | false | 0.925109 | 0.438765 | 43.87647 | 0.527491 | 33.950434 | 0.02719 | 2.719033 | 0.296141 | 6.152125 | 0.435979 | 12.664063 | 0.416722 | 35.191342 | false | false | 2024-06-20 | 2024-06-26 | 0 | pankajmathur/orca_mini_v7_7b |
| pankajmathur_orca_mini_v9_0_3B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | pankajmathur/orca_mini_v9_0_3B-Instruct | 37710875f7841e72c99cd5494cf450bb5bd6c680 | 20.434252 | llama3.2 | 5 | 3.213 | true | false | false | true | 0.601181 | 0.575377 | 57.537667 | 0.441295 | 21.368153 | 0.132931 | 13.293051 | 0.301174 | 6.823266 | 0.365906 | 5.771615 | 0.260306 | 17.811761 | false | false | 2024-12-27 | 2024-12-29 | 1 | pankajmathur/orca_mini_v9_0_3B-Instruct (Merge) |
| pankajmathur_orca_mini_v9_1_1B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | pankajmathur/orca_mini_v9_1_1B-Instruct | 2c4cc6dacbff82ec76845fcc770322318742e794 | 8.597485 | llama3.2 | 3 | 1.236 | true | false | false | true | 0.356788 | 0.362927 | 36.292703 | 0.320512 | 6.406449 | 0.018127 | 1.812689 | 0.256711 | 0.894855 | 0.338063 | 2.024479 | 0.137384 | 4.153738 | false | false | 2024-12-27 | 2024-12-29 | 1 | pankajmathur/orca_mini_v9_1_1B-Instruct (Merge) |
| pankajmathur_orca_mini_v9_5_1B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | pankajmathur/orca_mini_v9_5_1B-Instruct | eaf758bef610953480309044303c8c15985ac24d | 10.681133 | llama3.2 | 4 | 1.236 | true | false | false | true | 0.348127 | 0.465318 | 46.531758 | 0.3337 | 6.698817 | 0.027946 | 2.794562 | 0.270134 | 2.684564 | 0.318156 | 1.269531 | 0.136968 | 4.107565 | false | false | 2025-01-02 | 2025-01-02 | 1 | pankajmathur/orca_mini_v9_5_1B-Instruct (Merge) |
| pankajmathur_orca_mini_v9_5_1B-Instruct_preview_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | pankajmathur/orca_mini_v9_5_1B-Instruct_preview | 7f4581135998269b83f79624a2435cc314f5f45b | 9.415942 | llama3.2 | 2 | 1.236 | true | false | false | true | 0.357798 | 0.393577 | 39.357682 | 0.327695 | 5.582692 | 0.030967 | 3.096677 | 0.263423 | 1.789709 | 0.339458 | 3.032292 | 0.132729 | 3.636599 | false | false | 2024-12-30 | 2024-12-30 | 1 | pankajmathur/orca_mini_v9_5_1B-Instruct_preview (Merge) |
| pankajmathur_orca_mini_v9_5_3B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | pankajmathur/orca_mini_v9_5_3B-Instruct | 9d68ed7de708f52e8fa3b173fb7315a941d45b9c | 23.787625 | llama3.2 | 6 | 3.213 | true | false | false | true | 0.556026 | 0.720707 | 72.070661 | 0.449638 | 21.517904 | 0.110272 | 11.02719 | 0.286913 | 4.9217 | 0.42699 | 12.273698 | 0.288231 | 20.914598 | false | false | 2025-01-01 | 2025-01-01 | 1 | pankajmathur/orca_mini_v9_5_3B-Instruct (Merge) |
| pankajmathur_orca_mini_v9_6_1B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | pankajmathur/orca_mini_v9_6_1B-Instruct | 6219f36f9cb41a659ca721e74b70364dda0a9a8a | 14.429914 | llama3.2 | 3 | 1.236 | true | false | false | true | 0.380503 | 0.608574 | 60.857414 | 0.356135 | 9.659037 | 0.023414 | 2.34139 | 0.268456 | 2.46085 | 0.339552 | 2.277344 | 0.180851 | 8.983452 | false | false | 2025-01-06 | 2025-01-06 | 1 | pankajmathur/orca_mini_v9_6_1B-Instruct (Merge) |
| pankajmathur_orca_mini_v9_7_1B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | pankajmathur/orca_mini_v9_7_1B-Instruct | b9e52b91802bd4ae941c0d328e9fa7818e0ce504 | 11.96958 | llama3.2 | 4 | 1.236 | true | false | false | true | 0.425607 | 0.561014 | 56.101367 | 0.318153 | 5.052028 | 0.013595 | 1.359517 | 0.272651 | 3.020134 | 0.352698 | 2.453906 | 0.134475 | 3.830526 | false | false | 2025-01-04 | 2025-01-05 | 1 | pankajmathur/orca_mini_v9_7_1B-Instruct (Merge) |
| pankajmathur_orca_mini_v9_7_3B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | pankajmathur/orca_mini_v9_7_3B-Instruct | 46298c9816c60d18d8b1217a540b75a0a8cf9aab | 12.543766 | llama3.2 | 3 | 3.213 | true | false | false | true | 0.535549 | 0.561838 | 56.183815 | 0.329713 | 6.301039 | 0.032477 | 3.247734 | 0.261745 | 1.565996 | 0.361875 | 3.801042 | 0.137467 | 4.162973 | false | false | 2025-01-05 | 2025-01-05 | 1 | pankajmathur/orca_mini_v9_7_3B-Instruct (Merge) |
| paulml_ECE-ILAB-Q1_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | paulml/ECE-ILAB-Q1 | 393bea0ee85e4c752acd5fd77ce07f577fc13bd9 | 41.307201 | other | 0 | 72.706 | true | false | false | false | 11.415142 | 0.786452 | 78.645217 | 0.671776 | 53.702228 | 0.283988 | 28.398792 | 0.386745 | 18.232662 | 0.461375 | 18.805208 | 0.550532 | 50.059102 | true | false | 2024-06-06 | 2024-09-16 | 0 | paulml/ECE-ILAB-Q1 |
| pints-ai_1.5-Pints-16K-v0.1_bfloat16 | bfloat16 | 🟢 pretrained | Original | LlamaForCausalLM | pints-ai/1.5-Pints-16K-v0.1 | 7862a52f250be68fad593f3a4030f00d658ede56 | 4.150223 | mit | 14 | 1.566 | true | false | false | true | 0.279938 | 0.163591 | 16.359149 | 0.313308 | 3.658292 | 0.008308 | 0.830816 | 0.235738 | 0 | 0.357875 | 2.734375 | 0.111868 | 1.318706 | false | false | 2024-08-07 | 2024-09-09 | 0 | pints-ai/1.5-Pints-16K-v0.1 |
pints-ai_1.5-Pints-2K-v0.1_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/pints-ai/1.5-Pints-2K-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pints-ai/1.5-Pints-2K-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pints-ai__1.5-Pints-2K-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
pints-ai/1.5-Pints-2K-v0.1
2e865c18669161ebbf5e9ad79ae0502ee0153df0
3.830442
mit
16
1.566
true
false
false
true
0.291417
0.176156
17.615593
0.298019
2.37447
0
0
0.248322
0
0.350187
1.840104
0.110372
1.152482
false
false
2024-08-07
2024-09-09
0
pints-ai/1.5-Pints-2K-v0.1
piotr25691_thea-3b-25r_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/piotr25691/thea-3b-25r" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">piotr25691/thea-3b-25r</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/piotr25691__thea-3b-25r-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
piotr25691/thea-3b-25r
4661fb3c8b18bdf2059f703c4f69caea24057151
24.021247
llama3.2
1
3.213
true
false
false
true
0.690508
0.73442
73.442023
0.448441
22.546711
0.179758
17.975831
0.267617
2.348993
0.331458
3.565625
0.318235
24.248301
false
false
2024-10-11
2024-10-12
1
chuanli11/Llama-3.2-3B-Instruct-uncensored
piotr25691_thea-c-3b-25r_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/piotr25691/thea-c-3b-25r" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">piotr25691/thea-c-3b-25r</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/piotr25691__thea-c-3b-25r-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
piotr25691/thea-c-3b-25r
93a2333a84feda26f020bc8fa92f870462dacd89
23.179267
llama3.2
1
3.213
true
false
false
true
0.662456
0.74019
74.019047
0.453241
22.76785
0.148036
14.803625
0.265101
2.013423
0.33149
1.269531
0.317819
24.202128
false
false
2024-10-14
2024-10-17
1
meta-llama/Llama-3.2-3B-Instruct
piotr25691_thea-rp-3b-25r_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/piotr25691/thea-rp-3b-25r" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">piotr25691/thea-rp-3b-25r</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/piotr25691__thea-rp-3b-25r-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
piotr25691/thea-rp-3b-25r
ed4c338e07356f1657cf4d08b768ff866bbf0a68
21.732089
llama3.2
1
3.213
true
false
false
true
0.658453
0.657784
65.778357
0.439029
20.007381
0.125378
12.537764
0.274329
3.243848
0.381875
5.934375
0.306017
22.89081
false
false
2024-10-13
2024-10-16
1
SicariusSicariiStuff/Impish_LLAMA_3B
postbot_gpt2-medium-emailgen_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/postbot/gpt2-medium-emailgen" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">postbot/gpt2-medium-emailgen</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/postbot__gpt2-medium-emailgen-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
postbot/gpt2-medium-emailgen
a0299eb6760126e3bd04d2f10cd166c4563f82d2
4.743048
apache-2.0
6
0.38
true
false
false
false
0.078186
0.149203
14.9203
0.313043
3.6737
0
0
0.260067
1.342282
0.391115
6.889323
0.114694
1.632683
false
false
2022-09-29
2024-11-17
0
postbot/gpt2-medium-emailgen
prince-canuma_Ministral-8B-Instruct-2410-HF_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/prince-canuma/Ministral-8B-Instruct-2410-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prince-canuma/Ministral-8B-Instruct-2410-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prince-canuma__Ministral-8B-Instruct-2410-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prince-canuma/Ministral-8B-Instruct-2410-HF
e0a14d7a6a8a1d1e5bef1a77a42e86e8bcae0ee7
21.680297
other
10
8.02
true
false
false
true
1.016935
0.591164
59.116367
0.458561
23.778465
0.067976
6.797583
0.28104
4.138702
0.41375
10.71875
0.329787
25.531915
false
false
2024-10-16
2024-10-17
1
prince-canuma/Ministral-8B-Instruct-2410-HF (Merge)
princeton-nlp_Llama-3-8B-ProLong-512k-Base_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-8B-ProLong-512k-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-8B-ProLong-512k-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-8B-ProLong-512k-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-8B-ProLong-512k-Base
51a333f7c99f5052377154b76909dfe63ff7ab83
21.679045
llama3
8
8.03
true
false
false
true
0.878664
0.532212
53.221231
0.503321
29.847246
0.068731
6.873112
0.261745
1.565996
0.422271
12.683854
0.332945
25.882831
false
false
2024-08-22
2024-10-16
1
princeton-nlp/Llama-3-8B-ProLong-512k-Base (Merge)
princeton-nlp_Llama-3-8B-ProLong-512k-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-8B-ProLong-512k-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-8B-ProLong-512k-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-8B-ProLong-512k-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-8B-ProLong-512k-Instruct
eae0626e8597575215276c2b248720f731bc50b8
21.942344
llama3
19
8.03
true
false
false
true
2.344706
0.550822
55.082182
0.502831
29.151153
0.05287
5.287009
0.286074
4.809843
0.426646
12.530729
0.323138
24.793144
false
false
2024-08-22
2024-11-16
1
princeton-nlp/Llama-3-8B-ProLong-512k-Instruct (Merge)
princeton-nlp_Llama-3-8B-ProLong-512k-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-8B-ProLong-512k-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-8B-ProLong-512k-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-8B-ProLong-512k-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-8B-ProLong-512k-Instruct
bf92e493b7b0ef1db0242bfa97f1d8f92be02e9c
19.317531
llama3
19
8.03
true
false
false
false
0.724374
0.397773
39.777346
0.498303
28.669219
0.062689
6.268882
0.28104
4.138702
0.425
12.091667
0.324634
24.959368
false
false
2024-08-22
2024-11-16
1
princeton-nlp/Llama-3-8B-ProLong-512k-Instruct (Merge)
princeton-nlp_Llama-3-8B-ProLong-64k-Base_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-8B-ProLong-64k-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-8B-ProLong-64k-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-8B-ProLong-64k-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-8B-ProLong-64k-Base
97994d6918f80162a893e22d5e7bba586551f941
21.601846
llama3
5
8.03
true
false
false
true
1.822461
0.520072
52.00723
0.492713
28.687899
0.061934
6.193353
0.265101
2.013423
0.434052
14.623177
0.334774
26.085993
false
false
2024-07-22
2024-10-16
1
princeton-nlp/Llama-3-8B-ProLong-64k-Base (Merge)
princeton-nlp_Llama-3-8B-ProLong-64k-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-8B-ProLong-64k-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-8B-ProLong-64k-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-8B-ProLong-64k-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-8B-ProLong-64k-Instruct
fe55aed18544c5744239e473bb0d3aa0151776d3
22.970639
llama3
13
8.03
true
false
false
true
1.656687
0.556317
55.631724
0.508304
30.089572
0.061934
6.193353
0.295302
6.040268
0.439698
14.595573
0.32746
25.273345
false
false
2024-07-21
2024-10-16
1
princeton-nlp/Llama-3-8B-ProLong-64k-Instruct (Merge)
princeton-nlp_Llama-3-Base-8B-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT
b622b7d814aa03aa722328bf88feaf1ad480b7fb
15.437201
1
8.03
true
false
false
true
1.788286
0.309646
30.964618
0.454784
24.388576
0.030967
3.096677
0.283557
4.474273
0.394958
8.036458
0.294963
21.662603
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT
princeton-nlp_Llama-3-Base-8B-SFT-CPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-CPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-CPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-CPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-CPO
536ce7e7beb35175c48538fe46e7e9e100f228c9
15.878261
0
8.03
true
false
false
true
0.967846
0.370346
37.034624
0.459488
25.474649
0.049849
4.984894
0.274329
3.243848
0.360854
2.573438
0.297623
21.958112
false
false
2024-07-06
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT-CPO
princeton-nlp_Llama-3-Base-8B-SFT-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-DPO
3f5ec47c9beffb37cfbdcd837e76a336a9b1e651
18.162221
0
8.03
true
false
false
true
0.92634
0.411113
41.111251
0.466585
26.001874
0.028701
2.870091
0.310403
8.053691
0.38674
7.842448
0.307846
23.093972
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT-DPO
princeton-nlp_Llama-3-Base-8B-SFT-IPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-IPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-IPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-IPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-IPO
85055cc4b9c707e0bd1239d20d1f62927a7a54c3
18.281889
0
8.03
true
false
false
true
0.932191
0.448656
44.865623
0.469007
25.705433
0.01284
1.283988
0.297819
6.375839
0.391948
7.960156
0.311503
23.500296
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT-IPO
princeton-nlp_Llama-3-Base-8B-SFT-KTO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-KTO
49a8c2e5ccc7a28ed7bbedf093e352015fc1eb9b
17.964858
0
8.03
true
false
false
true
0.861852
0.452253
45.225335
0.469285
25.55523
0.012085
1.208459
0.305369
7.38255
0.384198
5.591406
0.305436
22.826167
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT-KTO
princeton-nlp_Llama-3-Base-8B-SFT-ORPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-ORPO
54d58402e0168faff6503e41621ad6c8274a310a
19.192797
0
8.03
true
false
false
true
0.906563
0.451654
45.165383
0.473406
26.485894
0.042296
4.229607
0.313758
8.501119
0.370677
7.634635
0.308261
23.140145
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT-ORPO
princeton-nlp_Llama-3-Base-8B-SFT-RDPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-RDPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-RDPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-RDPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-RDPO
b41a964c2135ba34dcc6fa7edf76b6b9ea656949
18.815011
0
8.03
true
false
false
true
0.902435
0.448007
44.800684
0.466201
25.526521
0.037764
3.776435
0.306208
7.494407
0.40274
8.909115
0.301446
22.382905
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT-RDPO
princeton-nlp_Llama-3-Base-8B-SFT-RRHF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-RRHF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-RRHF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-RRHF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-RRHF
aea8c04b3940cebd1f8296a2c76914f0ce70c276
16.081314
0
8.03
true
false
false
true
0.951469
0.335725
33.572477
0.452036
23.659142
0.033233
3.323263
0.305369
7.38255
0.372229
7.561979
0.288896
20.988475
false
false
2024-07-06
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT-RRHF
princeton-nlp_Llama-3-Base-8B-SFT-SLiC-HF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-SLiC-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-SLiC-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-SLiC-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-SLiC-HF
325092c1eddffc3ca7157be1ff9958128e5753ef
19.730525
0
8.03
true
false
false
true
0.960471
0.489048
48.904795
0.470408
26.373963
0.049849
4.984894
0.286913
4.9217
0.409094
10.270052
0.30635
22.927748
false
false
2024-07-06
2024-10-07
0
princeton-nlp/Llama-3-Base-8B-SFT-SLiC-HF
princeton-nlp_Llama-3-Base-8B-SFT-SimPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Base-8B-SFT-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Base-8B-SFT-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Base-8B-SFT-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Base-8B-SFT-SimPO
0a6e518b13b67abe8433bce3f7beee9beb74a794
19.279456
0
8.03
false
false
false
true
0.861565
0.46854
46.854014
0.474125
26.39595
0.020393
2.039275
0.288591
5.145414
0.412688
11.852604
0.310505
23.38948
false
false
2024-05-24
2024-09-28
0
princeton-nlp/Llama-3-Base-8B-SFT-SimPO
princeton-nlp_Llama-3-Instruct-8B-CPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-CPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-CPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-CPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Instruct-8B-CPO
d4645ae4c3b99892f1c59f60a77330be35567835
23.91096
0
8.03
true
false
false
true
0.739416
0.729299
72.929937
0.499879
28.604299
0.093656
9.365559
0.260067
1.342282
0.351396
1.757812
0.365193
29.465869
false
false
2024-07-06
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-CPO
princeton-nlp_Llama-3-Instruct-8B-CPO-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-CPO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-CPO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-CPO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Instruct-8B-CPO-v0.2
5ed83728712693437bd547f4cd32923ac4e1172d
24.821014
0
8.03
true
false
false
true
0.772886
0.750582
75.058179
0.502667
29.086407
0.10423
10.422961
0.260906
1.454139
0.361906
2.838281
0.370595
30.06612
false
false
2024-07-06
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-CPO-v0.2
princeton-nlp_Llama-3-Instruct-8B-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Instruct-8B-DPO
0afbf4c012ec7507f61c554999151b95a3651db3
22.667424
0
8.03
true
false
false
true
0.564854
0.675744
67.574369
0.49913
28.507392
0.034743
3.47432
0.271812
2.908277
0.373813
3.926563
0.366523
29.613623
false
false
2024-05-17
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-DPO
princeton-nlp_Llama-3-Instruct-8B-DPO-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-DPO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-DPO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-DPO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Instruct-8B-DPO-v0.2
d06275e02abbeaf29d911a3c0cf22922dcca6b0b
24.718027
0
8.03
true
false
false
true
0.60288
0.720806
72.080635
0.50562
28.939587
0.060423
6.042296
0.286913
4.9217
0.384448
5.55599
0.376912
30.767952
false
false
2024-07-06
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-DPO-v0.2
princeton-nlp_Llama-3-Instruct-8B-KTO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Instruct-8B-KTO
e697908201cbab01e0ca54088bb8cd2fd99b4574
22.789641
0
8.03
true
false
false
true
0.602517
0.68641
68.640984
0.49819
28.649658
0.034743
3.47432
0.276007
3.467562
0.369844
3.630469
0.359874
28.874852
false
false
2024-05-17
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-KTO
princeton-nlp_Llama-3-Instruct-8B-KTO-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-KTO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-KTO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-KTO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Instruct-8B-KTO-v0.2
477d33ea62ed57a0429517170612aa1df21c78d6
24.344687
0
8.03
true
false
false
true
0.630536
0.729025
72.902454
0.507977
29.648406
0.080816
8.081571
0.260067
1.342282
0.37775
4.452083
0.366772
29.641327
false
false
2024-07-06
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-KTO-v0.2
princeton-nlp_Llama-3-Instruct-8B-ORPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Instruct-8B-ORPO
4bb3ffcf9ede48cb01a10bf3223eb41b59aa3fef
23.534475
0
8.03
true
false
false
true
0.623904
0.712813
71.281311
0.500121
28.839356
0.073263
7.326284
0.258389
1.118568
0.350188
3.240104
0.364611
29.401226
false
false
2024-05-17
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-ORPO
princeton-nlp_Llama-3-Instruct-8B-ORPO-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-ORPO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-ORPO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-ORPO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Instruct-8B-ORPO-v0.2
3ea5c542a3d8d61f6afb6cdbef5972a501ddf759
25.852852
1
8.03
true
false
false
true
0.594232
0.763321
76.332132
0.507835
29.604837
0.095166
9.516616
0.283557
4.474273
0.377969
4.846094
0.373088
30.343159
false
false
2024-07-06
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-ORPO-v0.2
princeton-nlp_Llama-3-Instruct-8B-RDPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-RDPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-RDPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-RDPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Instruct-8B-RDPO
9497ca226a68981f42df2e5b3a4a1a2ea702a942
22.584117
0
8.03
true
false
false
true
0.56625
0.666002
66.600176
0.503363
29.032479
0.023414
2.34139
0.282718
4.362416
0.375208
4.201042
0.360705
28.967199
false
false
2024-05-17
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-RDPO
princeton-nlp_Llama-3-Instruct-8B-RDPO-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-RDPO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-RDPO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-RDPO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Instruct-8B-RDPO-v0.2
4e5bc9779cba3a2f615379d3f8ef1bbb3ea487f7
24.427995
1
8.03
true
false
false
true
0.557948
0.707692
70.769226
0.504922
28.854277
0.050604
5.060423
0.292785
5.704698
0.380448
5.35599
0.37741
30.82336
false
false
2024-07-06
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-RDPO-v0.2
princeton-nlp_Llama-3-Instruct-8B-RRHF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-RRHF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-RRHF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-RRHF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Instruct-8B-RRHF
73561d9b0fd42b94250246f8d794251fe9f9d2e9
24.059318
0
8.03
true
false
false
true
0.639216
0.727451
72.745094
0.491055
27.216485
0.095166
9.516616
0.280201
4.026846
0.347552
1.477344
0.364362
29.373522
false
false
2024-07-06
2024-10-07
0
princeton-nlp/Llama-3-Instruct-8B-RRHF
princeton-nlp_Llama-3-Instruct-8B-RRHF-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-RRHF-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2
81191fbb214d17f0a4fec247da5d648f4cb61ef1
23.753751
0
8.03
true
false
false
true
0.505873
0.712488
71.248842
0.49839
28.498724
0.087613
8.761329
0.260067
1.342282
0.373781
5.089323
0.348238
27.582004
false
false
2024-07-06
2024-10-07
0
princeton-nlp/Llama-3-Instruct-8B-RRHF-v0.2
princeton-nlp_Llama-3-Instruct-8B-SLiC-HF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-SLiC-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-SLiC-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-SLiC-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Instruct-8B-SLiC-HF
7e9001f6f4fe940c363bb7ea1814d33c79b21737
25.056382
0
8.03
true
false
false
true
0.725192
0.739966
73.996551
0.502942
29.211612
0.082326
8.232628
0.286074
4.809843
0.372292
5.369792
0.358461
28.717863
false
false
2024-07-06
2024-10-07
0
princeton-nlp/Llama-3-Instruct-8B-SLiC-HF
princeton-nlp_Llama-3-Instruct-8B-SLiC-HF-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-SLiC-HF-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2
1821cc42189d8dab9e157c31b223dc60fc037c2d
23.728355
0
8.03
true
false
false
true
0.521239
0.710965
71.096468
0.49839
28.498724
0.087613
8.761329
0.260067
1.342282
0.373781
5.089323
0.348238
27.582004
false
false
2024-07-06
2024-10-07
0
princeton-nlp/Llama-3-Instruct-8B-SLiC-HF-v0.2
princeton-nlp_Llama-3-Instruct-8B-SimPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Instruct-8B-SimPO
f700cb6afb4509b10dea43ab72bb0e260e166be4
22.657116
57
8.03
true
false
false
true
0.533346
0.65039
65.038985
0.484468
26.709133
0.02568
2.567976
0.293624
5.816555
0.394833
8.154167
0.348903
27.655881
false
false
2024-05-17
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-SimPO
princeton-nlp_Llama-3-Instruct-8B-SimPO-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Llama-3-Instruct-8B-SimPO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2
9ac0fbee445e7755e50520e9881d67588b4b854c
24.474601
6
8.03
true
false
false
true
0.579982
0.680865
68.086455
0.503834
29.214022
0.057402
5.740181
0.301174
6.823266
0.398802
7.85026
0.362201
29.133422
false
false
2024-07-06
2024-09-28
0
princeton-nlp/Llama-3-Instruct-8B-SimPO-v0.2
princeton-nlp_Mistral-7B-Base-SFT-CPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-CPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-CPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-CPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Base-SFT-CPO
7f67394668b94a9ddfb64daff8976b48b135d96c
17.373794
1
7.242
true
false
false
true
0.809769
0.465493
46.549267
0.438215
21.857696
0.026435
2.643505
0.291946
5.592841
0.407083
9.252083
0.265126
18.34737
false
false
2024-07-06
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-CPO
princeton-nlp_Mistral-7B-Base-SFT-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Base-SFT-DPO
17134fd80cfbf3980353967a30dc6f450f18f78f
16.236325
0
7.242
true
false
false
true
0.66762
0.440338
44.03383
0.435011
20.79098
0.016616
1.661631
0.272651
3.020134
0.412229
9.628646
0.264545
18.282728
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-DPO
princeton-nlp_Mistral-7B-Base-SFT-IPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-IPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-IPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-IPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Base-SFT-IPO
eea781724e4d2ab8bdda7c13526f042de4cfae41
17.210428
0
7.242
true
false
false
true
0.667334
0.482953
48.295301
0.445802
23.703491
0.024924
2.492447
0.280201
4.026846
0.377625
4.836458
0.279172
19.908023
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-IPO
princeton-nlp_Mistral-7B-Base-SFT-KTO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Base-SFT-KTO
02148bb9241b0f4bb0c75e93893eed005abe25e8
18.96264
0
7.242
true
false
false
true
0.666017
0.478482
47.848154
0.447643
23.107642
0.036254
3.625378
0.290268
5.369128
0.436781
13.03099
0.287151
20.794548
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-KTO
princeton-nlp_Mistral-7B-Base-SFT-RDPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-RDPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-RDPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-RDPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Base-SFT-RDPO
2a63a6d9e1978c99444e440371268f7c2b7e0375
16.465757
0
7.242
true
false
false
true
0.662505
0.460647
46.064664
0.443953
22.98201
0.020393
2.039275
0.277685
3.691275
0.357938
4.275521
0.277676
19.7418
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-RDPO
princeton-nlp_Mistral-7B-Base-SFT-RRHF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-RRHF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-RRHF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-RRHF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Base-SFT-RRHF
0d5861072e9d01f420451bf6a5b108bc8d3a76bc
16.194613
0
7.242
true
false
false
true
0.669001
0.440663
44.0663
0.428059
19.598831
0.02568
2.567976
0.290268
5.369128
0.418677
10.034635
0.239777
15.530807
false
false
2024-07-06
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-RRHF
princeton-nlp_Mistral-7B-Base-SFT-SLiC-HF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-SLiC-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF
65d2cc49ad05258da3d982b39682c7f672f5e4ab
18.955533
0
7.242
true
false
false
true
0.668442
0.512728
51.272845
0.44224
22.304723
0.032477
3.247734
0.291946
5.592841
0.426083
11.527083
0.278092
19.787973
false
false
2024-07-06
2024-10-07
0
princeton-nlp/Mistral-7B-Base-SFT-SLiC-HF
princeton-nlp_Mistral-7B-Base-SFT-SimPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Base-SFT-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Base-SFT-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Base-SFT-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Base-SFT-SimPO
9d9e8b8de4f673d45bc826efc4a1444f9d480222
16.893545
0
7.242
true
false
false
true
0.635706
0.470064
47.006387
0.439805
22.332886
0.006042
0.60423
0.283557
4.474273
0.397063
8.032813
0.270196
18.910683
false
false
2024-05-17
2024-09-21
0
princeton-nlp/Mistral-7B-Base-SFT-SimPO
princeton-nlp_Mistral-7B-Instruct-CPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-CPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-CPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-CPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Instruct-CPO
32492f8e5588f06005689ac944c2ea39c394c28e
15.565535
0
7.242
true
false
false
true
0.645922
0.420305
42.030479
0.406922
17.248538
0.021903
2.190332
0.26594
2.12528
0.417844
10.897135
0.270113
18.901448
false
false
2024-07-06
2024-10-07
0
princeton-nlp/Mistral-7B-Instruct-CPO
princeton-nlp_Mistral-7B-Instruct-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Instruct-DPO
5e96cff70d8db87cf17c616429c17c8dc9352543
16.549607
0
7.242
true
false
false
true
0.605267
0.517624
51.762435
0.406036
16.875389
0.030211
3.021148
0.268456
2.46085
0.383333
5.75
0.27485
19.427822
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Instruct-DPO
princeton-nlp_Mistral-7B-Instruct-IPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-IPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-IPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-IPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Instruct-IPO
32ad99c6e7231bbe8ebd9d24b28e084c60848558
17.707096
0
7.242
true
false
false
true
0.625748
0.49292
49.29199
0.432218
20.09411
0.019637
1.963746
0.27349
3.131991
0.432417
12.785417
0.270778
18.975325
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Instruct-IPO
princeton-nlp_Mistral-7B-Instruct-KTO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-KTO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-KTO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-KTO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Instruct-KTO
834422e5b9b9eee6aac2f8d4822b925a6574d628
16.664827
0
7.242
true
false
false
true
0.603378
0.490797
49.079664
0.413959
17.812648
0.024169
2.416918
0.27349
3.131991
0.395271
7.408854
0.28125
20.138889
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Instruct-KTO
princeton-nlp_Mistral-7B-Instruct-ORPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Instruct-ORPO
69c0481f4100629a49ae73f760ddbb61d8e98e48
16.050529
0
7.242
true
false
false
true
0.624297
0.471962
47.196217
0.410406
18.038373
0.02719
2.719033
0.274329
3.243848
0.39124
6.638281
0.266207
18.46742
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Instruct-ORPO
princeton-nlp_Mistral-7B-Instruct-RDPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-RDPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-RDPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-RDPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Instruct-RDPO
23ec6ab4f996134eb15c19322dabb34d7332d7cd
16.420491
0
7.242
true
false
false
true
0.610616
0.488723
48.872325
0.405015
17.048388
0.024169
2.416918
0.280201
4.026846
0.387333
6.416667
0.277676
19.7418
false
false
2024-05-17
2024-10-07
0
princeton-nlp/Mistral-7B-Instruct-RDPO
princeton-nlp_Mistral-7B-Instruct-RRHF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-RRHF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-RRHF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-RRHF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Instruct-RRHF
493d3ceb571232fe3b2f55c0bf78692760f4fc7e
16.829083
0
7.242
true
false
false
true
0.587751
0.496017
49.601723
0.418977
19.206552
0.024169
2.416918
0.276007
3.467562
0.397875
7.934375
0.265126
18.34737
false
false
2024-07-06
2024-10-07
0
princeton-nlp/Mistral-7B-Instruct-RRHF
princeton-nlp_Mistral-7B-Instruct-SLiC-HF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-SLiC-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-SLiC-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-SLiC-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Instruct-SLiC-HF
3d08c8b7c3e73beb2a3264848f17246b74c3d162
16.376556
0
7.242
true
false
false
true
0.622453
0.511529
51.152941
0.404001
16.653429
0.016616
1.661631
0.272651
3.020134
0.391302
6.71276
0.271526
19.058437
false
false
2024-07-06
2024-10-16
0
princeton-nlp/Mistral-7B-Instruct-SLiC-HF
princeton-nlp_Mistral-7B-Instruct-SimPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Mistral-7B-Instruct-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Mistral-7B-Instruct-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Mistral-7B-Instruct-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Mistral-7B-Instruct-SimPO
03191ee1e60d21a698d11a515703a037073724f8
17.569551
2
7.242
false
false
false
true
0.570562
0.46869
46.868974
0.450723
22.382277
0.026435
2.643505
0.278523
3.803132
0.409781
9.75599
0.279671
19.963431
false
false
2024-05-24
2024-09-21
0
princeton-nlp/Mistral-7B-Instruct-SimPO
princeton-nlp_Sheared-LLaMA-1.3B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Sheared-LLaMA-1.3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Sheared-LLaMA-1.3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Sheared-LLaMA-1.3B
a4b76938edbf571ea7d7d9904861cbdca08809b4
5.505397
apache-2.0
93
1.3
true
false
false
false
0.3546
0.21977
21.977021
0.319705
4.74463
0.008308
0.830816
0.239933
0
0.371302
3.579427
0.117104
1.900488
false
false
2023-10-10
2024-07-29
0
princeton-nlp/Sheared-LLaMA-1.3B
princeton-nlp_Sheared-LLaMA-2.7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Sheared-LLaMA-2.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__Sheared-LLaMA-2.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Sheared-LLaMA-2.7B
2f157a0306b75d37694ae05f6a4067220254d540
6.324627
apache-2.0
60
2.7
true
false
false
false
0.47005
0.241652
24.165215
0.325869
5.655521
0.006042
0.60423
0.275168
3.355705
0.356729
2.091146
0.118684
2.075946
false
false
2023-10-10
2024-07-29
0
princeton-nlp/Sheared-LLaMA-2.7B
princeton-nlp_gemma-2-9b-it-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/gemma-2-9b-it-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/gemma-2-9b-it-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__gemma-2-9b-it-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/gemma-2-9b-it-DPO
f646c99fc3aa7afc7b22c3c7115fd03a40fc1d22
19.434035
6
9.242
false
false
false
true
2.890627
0.276872
27.687203
0.594144
41.593654
0
0
0.33557
11.409396
0.382031
5.653906
0.37234
30.260047
false
false
2024-07-16
2024-09-19
2
google/gemma-2-9b
princeton-nlp_gemma-2-9b-it-SimPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/gemma-2-9b-it-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/gemma-2-9b-it-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/princeton-nlp__gemma-2-9b-it-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/gemma-2-9b-it-SimPO
8c87091f412e3aa6f74f66bd86c57fb81cbc3fde
21.161652
mit
140
9.242
true
false
false
true
2.769004
0.320686
32.068578
0.583918
40.09343
0
0
0.33557
11.409396
0.412323
10.340365
0.397523
33.058141
false
false
2024-07-16
2024-08-10
2
google/gemma-2-9b
prithivMLmods_Deepthink-Reasoning-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Deepthink-Reasoning-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Deepthink-Reasoning-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Deepthink-Reasoning-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Deepthink-Reasoning-7B
0ccaa3825ded55cf8cfa18f7db53d91848e3733b
26.894145
creativeml-openrail-m
12
7.616
true
false
false
false
0.626998
0.484002
48.400245
0.550507
35.623731
0.200906
20.090634
0.299497
6.599553
0.443229
13.436979
0.434924
37.213726
false
false
2024-12-28
2025-01-09
1
prithivMLmods/Deepthink-Reasoning-7B (Merge)
prithivMLmods_GWQ-9B-Preview_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/GWQ-9B-Preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/GWQ-9B-Preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__GWQ-9B-Preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/GWQ-9B-Preview
5a0e00ac0ff885f54ef32e607508895bae864006
29.915362
gemma
9
9.242
true
false
false
false
2.461162
0.506584
50.658364
0.580575
40.669723
0.212236
21.223565
0.339765
11.96868
0.495104
21.821354
0.398354
33.150488
false
false
2025-01-04
2025-01-08
0
prithivMLmods/GWQ-9B-Preview
prithivMLmods_GWQ-9B-Preview2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/GWQ-9B-Preview2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/GWQ-9B-Preview2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__GWQ-9B-Preview2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/GWQ-9B-Preview2
42f5d4f7d19eb59c9408ff70cdbc30459ec1ad3d
29.870954
creativeml-openrail-m
13
9.242
true
false
false
false
2.452824
0.520897
52.089678
0.579722
40.184861
0.226586
22.65861
0.326342
10.178971
0.48599
20.815365
0.399684
33.298242
false
false
2025-01-04
2025-01-08
1
prithivMLmods/GWQ-9B-Preview2 (Merge)