Dataset schema (one row per leaderboard eval):

| Column | Dtype | Range / values |
|---|---|---|
| eval_name | stringlengths | 12–111 |
| Precision | stringclasses | 3 values |
| Type | stringclasses | 7 values |
| T | stringclasses | 7 values (emoji shorthand for Type) |
| Weight type | stringclasses | 3 values |
| Architecture | stringclasses | 62 values |
| Model | stringlengths | 355–689 (HTML link to the model page plus a 📑 link to its details dataset) |
| fullname | stringlengths | 4–102 |
| Model sha | stringlengths | 0–40 |
| Average ⬆️ | float64 | 0.74–52 |
| Hub License | stringclasses | 27 values |
| Hub ❤️ | int64 | 0–5.99k |
| #Params (B) | float64 | -1–141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.03–107 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.24–0.75 |
| BBH | float64 | 0.25–64.1 |
| MATH Lvl 5 Raw | float64 | 0–0.52 |
| MATH Lvl 5 | float64 | 0–52.4 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.5 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | stringclasses | 478 values |
| Submission Date | stringclasses | 219 values |
| Generation | int64 | 0–10 |
| Base Model | stringlengths | 4–102 |

Each benchmark comes as a pair: a raw accuracy on a 0–1 scale and a normalized 0–100 score, and Average ⬆️ is the unweighted mean of the six normalized scores. T is the emoji shorthand already embedded in Type, and Model renders fullname as a link to the model page on the Hub together with a 📑 link to its per-eval results dataset at `open-llm-leaderboard/<org>__<model>-details`, so the records below are keyed by eval_name and fullname. Blank cells are fields that are empty for that record (typically Hub License).
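This schema matches the table that backs the Open LLM Leaderboard, so it can be queried programmatically. Below is a minimal sketch of loading and ranking it with the `datasets` library; the repo id `open-llm-leaderboard/contents` and its `train` split are assumptions, not stated in this dump, so swap them for the actual dataset if they differ.

```python
# Minimal sketch: load the leaderboard contents table and rank chat models.
# Assumptions: this dump corresponds to the Hub dataset
# "open-llm-leaderboard/contents" exposing a "train" split.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")
df = ds.to_pandas()

# Column names exactly as in the schema above (emoji included).
chat = df[df["Type"].str.startswith("💬")]
top = chat.sort_values("Average ⬆️", ascending=False)
print(top[["fullname", "Precision", "#Params (B)", "Average ⬆️"]].head(10))
```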
| eval_name | Precision | Type | Weight type | Architecture | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| princeton-nlp_Mistral-7B-Instruct-ORPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | MistralForCausalLM | princeton-nlp/Mistral-7B-Instruct-ORPO | 69c0481f4100629a49ae73f760ddbb61d8e98e48 | 16.050529 | | 0 | 7.242 | true | false | false | true | 0.624297 | 0.471962 | 47.196217 | 0.410406 | 18.038373 | 0.02719 | 2.719033 | 0.274329 | 3.243848 | 0.39124 | 6.638281 | 0.266207 | 18.46742 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-ORPO |
| princeton-nlp_Mistral-7B-Instruct-RDPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | MistralForCausalLM | princeton-nlp/Mistral-7B-Instruct-RDPO | 23ec6ab4f996134eb15c19322dabb34d7332d7cd | 16.420491 | | 0 | 7.242 | true | false | false | true | 0.610616 | 0.488723 | 48.872325 | 0.405015 | 17.048388 | 0.024169 | 2.416918 | 0.280201 | 4.026846 | 0.387333 | 6.416667 | 0.277676 | 19.7418 | false | false | 2024-05-17 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-RDPO |
| princeton-nlp_Mistral-7B-Instruct-RRHF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | MistralForCausalLM | princeton-nlp/Mistral-7B-Instruct-RRHF | 493d3ceb571232fe3b2f55c0bf78692760f4fc7e | 16.829083 | | 0 | 7.242 | true | false | false | true | 0.587751 | 0.496017 | 49.601723 | 0.418977 | 19.206552 | 0.024169 | 2.416918 | 0.276007 | 3.467562 | 0.397875 | 7.934375 | 0.265126 | 18.34737 | false | false | 2024-07-06 | 2024-10-07 | 0 | princeton-nlp/Mistral-7B-Instruct-RRHF |
| princeton-nlp_Mistral-7B-Instruct-SLiC-HF_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | MistralForCausalLM | princeton-nlp/Mistral-7B-Instruct-SLiC-HF | 3d08c8b7c3e73beb2a3264848f17246b74c3d162 | 16.376556 | | 0 | 7.242 | true | false | false | true | 0.622453 | 0.511529 | 51.152941 | 0.404001 | 16.653429 | 0.016616 | 1.661631 | 0.272651 | 3.020134 | 0.391302 | 6.71276 | 0.271526 | 19.058437 | false | false | 2024-07-06 | 2024-10-16 | 0 | princeton-nlp/Mistral-7B-Instruct-SLiC-HF |
| princeton-nlp_Mistral-7B-Instruct-SimPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | MistralForCausalLM | princeton-nlp/Mistral-7B-Instruct-SimPO | 03191ee1e60d21a698d11a515703a037073724f8 | 17.569551 | | 2 | 7.242 | false | false | false | true | 0.570562 | 0.46869 | 46.868974 | 0.450723 | 22.382277 | 0.026435 | 2.643505 | 0.278523 | 3.803132 | 0.409781 | 9.75599 | 0.279671 | 19.963431 | false | false | 2024-05-24 | 2024-09-21 | 0 | princeton-nlp/Mistral-7B-Instruct-SimPO |
| princeton-nlp_Sheared-LLaMA-1.3B_bfloat16 | bfloat16 | 🟢 pretrained | Original | LlamaForCausalLM | princeton-nlp/Sheared-LLaMA-1.3B | a4b76938edbf571ea7d7d9904861cbdca08809b4 | 5.505397 | apache-2.0 | 93 | 1.3 | true | false | false | false | 0.3546 | 0.21977 | 21.977021 | 0.319705 | 4.74463 | 0.008308 | 0.830816 | 0.239933 | 0 | 0.371302 | 3.579427 | 0.117104 | 1.900488 | false | false | 2023-10-10 | 2024-07-29 | 0 | princeton-nlp/Sheared-LLaMA-1.3B |
| princeton-nlp_Sheared-LLaMA-2.7B_bfloat16 | bfloat16 | 🟢 pretrained | Original | LlamaForCausalLM | princeton-nlp/Sheared-LLaMA-2.7B | 2f157a0306b75d37694ae05f6a4067220254d540 | 6.324627 | apache-2.0 | 61 | 2.7 | true | false | false | false | 0.47005 | 0.241652 | 24.165215 | 0.325869 | 5.655521 | 0.006042 | 0.60423 | 0.275168 | 3.355705 | 0.356729 | 2.091146 | 0.118684 | 2.075946 | false | false | 2023-10-10 | 2024-07-29 | 0 | princeton-nlp/Sheared-LLaMA-2.7B |
| princeton-nlp_gemma-2-9b-it-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | Gemma2ForCausalLM | princeton-nlp/gemma-2-9b-it-DPO | f646c99fc3aa7afc7b22c3c7115fd03a40fc1d22 | 19.434035 | | 7 | 9.242 | false | false | false | true | 2.890627 | 0.276872 | 27.687203 | 0.594144 | 41.593654 | 0 | 0 | 0.33557 | 11.409396 | 0.382031 | 5.653906 | 0.37234 | 30.260047 | false | false | 2024-07-16 | 2024-09-19 | 2 | google/gemma-2-9b |
| princeton-nlp_gemma-2-9b-it-SimPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | Gemma2ForCausalLM | princeton-nlp/gemma-2-9b-it-SimPO | 8c87091f412e3aa6f74f66bd86c57fb81cbc3fde | 21.161652 | mit | 142 | 9.242 | true | false | false | true | 2.769004 | 0.320686 | 32.068578 | 0.583918 | 40.09343 | 0 | 0 | 0.33557 | 11.409396 | 0.412323 | 10.340365 | 0.397523 | 33.058141 | false | false | 2024-07-16 | 2024-08-10 | 2 | google/gemma-2-9b |
| prithivMLmods_Bellatrix-1.5B-xElite_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/Bellatrix-1.5B-xElite | 4ec39cef1bf7701abb30dda694b4918c517d1c0d | 9.547601 | apache-2.0 | 7 | 1.777 | true | false | false | false | 0.599664 | 0.196414 | 19.64144 | 0.35012 | 9.486709 | 0.126133 | 12.613293 | 0.278523 | 3.803132 | 0.361906 | 4.438281 | 0.165725 | 7.302748 | false | false | 2025-01-25 | 2025-01-27 | 1 | prithivMLmods/Bellatrix-1.5B-xElite (Merge) |
| prithivMLmods_Bellatrix-Tiny-1B-v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Bellatrix-Tiny-1B-v2 | d82282c0853688ed16e3b9e121a09d063c566cc5 | 5.970924 | llama3.2 | 7 | 1.236 | true | false | false | false | 0.386866 | 0.150952 | 15.09517 | 0.326768 | 6.032562 | 0.024924 | 2.492447 | 0.272651 | 3.020134 | 0.343021 | 3.710937 | 0.149269 | 5.474291 | false | false | 2025-01-26 | 2025-01-27 | 1 | prithivMLmods/Bellatrix-Tiny-1B-v2 (Merge) |
| prithivMLmods_COCO-7B-Instruct-1M_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/COCO-7B-Instruct-1M | a8ccc848bd1db0f05172a4e1c2197a0d3b4f25c5 | 28.171845 | apache-2.0 | 7 | 7.616 | true | false | false | false | 0.669052 | 0.47431 | 47.431039 | 0.540996 | 34.677883 | 0.30287 | 30.287009 | 0.307886 | 7.718121 | 0.43824 | 13.513281 | 0.418634 | 35.403738 | false | false | 2025-01-25 | 2025-01-27 | 1 | prithivMLmods/COCO-7B-Instruct-1M (Merge) |
| prithivMLmods_Calcium-Opus-14B-Elite_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/Calcium-Opus-14B-Elite | a8661f82079677c777595e4259dbaf5a72c8f134 | 38.377957 | apache-2.0 | 9 | 14.766 | true | false | false | false | 2.012399 | 0.605152 | 60.515211 | 0.631736 | 46.934158 | 0.376888 | 37.688822 | 0.374161 | 16.55481 | 0.485958 | 20.778125 | 0.53017 | 47.796616 | false | false | 2025-01-23 | 2025-01-23 | 1 | prithivMLmods/Calcium-Opus-14B-Elite (Merge) |
| prithivMLmods_Calcium-Opus-14B-Elite_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/Calcium-Opus-14B-Elite | a8661f82079677c777595e4259dbaf5a72c8f134 | 38.249365 | apache-2.0 | 9 | 14.766 | true | false | false | false | 2.022333 | 0.606351 | 60.635115 | 0.62959 | 46.532809 | 0.370846 | 37.084592 | 0.373322 | 16.442953 | 0.487323 | 20.948698 | 0.530668 | 47.852024 | false | false | 2025-01-23 | 2025-01-23 | 1 | prithivMLmods/Calcium-Opus-14B-Elite (Merge) |
| prithivMLmods_Calcium-Opus-14B-Elite-1M_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/Calcium-Opus-14B-Elite-1M | 07f093df0a87d5d13e4325aa54eb62de9322721c | 35.110144 | apache-2.0 | 7 | 14.77 | true | false | false | false | 1.946804 | 0.561288 | 56.128849 | 0.63294 | 46.935523 | 0.295317 | 29.531722 | 0.352349 | 13.646532 | 0.467604 | 18.283854 | 0.515209 | 46.134382 | false | false | 2025-01-25 | 2025-01-27 | 1 | prithivMLmods/Calcium-Opus-14B-Elite-1M (Merge) |
| prithivMLmods_Calcium-Opus-14B-Elite-Stock_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | prithivMLmods/Calcium-Opus-14B-Elite-Stock | e3b7fa2d20fa3e7a92bb7a99ad05219c9a86a95d | 36.4921 | | 8 | 14.766 | false | false | false | false | 1.98723 | 0.614295 | 61.429452 | 0.632877 | 46.897899 | 0.271903 | 27.190332 | 0.368289 | 15.771812 | 0.48075 | 20.060417 | 0.528424 | 47.602689 | false | false | 2025-01-25 | 2025-01-25 | 1 | prithivMLmods/Calcium-Opus-14B-Elite-Stock (Merge) |
| prithivMLmods_Calcium-Opus-14B-Elite2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/Calcium-Opus-14B-Elite2 | 0d948a368ff62658c06f90219849d8a6be29b78e | 38.449708 | apache-2.0 | 8 | 14.766 | true | false | false | false | 2.012723 | 0.617617 | 61.761681 | 0.631826 | 46.80615 | 0.361027 | 36.102719 | 0.369966 | 15.995526 | 0.493958 | 22.244792 | 0.530086 | 47.787382 | false | false | 2025-01-24 | 2025-01-25 | 1 | prithivMLmods/Calcium-Opus-14B-Elite2 (Merge) |
| prithivMLmods_Calcium-Opus-14B-Elite3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/Calcium-Opus-14B-Elite3 | 6be2c8ea522ff941fa1ed5bec18949ac4c3b5651 | 35.857734 | apache-2.0 | 8 | 14.766 | true | false | false | false | 2.012315 | 0.542829 | 54.282858 | 0.63504 | 47.0746 | 0.293807 | 29.380665 | 0.370805 | 16.107383 | 0.479479 | 20.134896 | 0.533494 | 48.166002 | false | false | 2025-01-25 | 2025-01-25 | 1 | prithivMLmods/Calcium-Opus-14B-Elite3 (Merge) |
| prithivMLmods_Calcium-Opus-14B-Elite4_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/Calcium-Opus-14B-Elite4 | 59525af6aae57e700ff9cd6ce9c6b3257f422f4c | 34.540949 | apache-2.0 | 8 | 14.766 | true | false | false | false | 1.958301 | 0.611197 | 61.119718 | 0.619526 | 45.208475 | 0.230363 | 23.036254 | 0.355705 | 14.09396 | 0.468719 | 17.689844 | 0.514877 | 46.097444 | false | false | 2025-01-25 | 2025-01-25 | 1 | prithivMLmods/Calcium-Opus-14B-Elite4 (Merge) |
| prithivMLmods_Calcium-Opus-14B-Merge_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | prithivMLmods/Calcium-Opus-14B-Merge | ceb41ff76990a24d2f4ff29f1c342fcd7322948a | 35.795652 | | 8 | 14.766 | false | false | false | false | 2.069258 | 0.494943 | 49.494342 | 0.631929 | 46.766668 | 0.330816 | 33.081571 | 0.370805 | 16.107383 | 0.486083 | 20.927083 | 0.535572 | 48.396868 | false | false | 2025-01-24 | 2025-01-24 | 1 | prithivMLmods/Calcium-Opus-14B-Merge (Merge) |
| prithivMLmods_Calcium-Opus-20B-v1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/Calcium-Opus-20B-v1 | 28395429552eb6f22cd3dc8b54cd03e47c6132c9 | 26.849891 | apache-2.0 | 8 | 19.173 | true | false | false | false | 2.73627 | 0.309272 | 30.927162 | 0.599033 | 41.805576 | 0.110272 | 11.02719 | 0.353188 | 13.758389 | 0.494333 | 22.091667 | 0.473404 | 41.489362 | false | false | 2025-01-19 | 2025-01-23 | 1 | prithivMLmods/Calcium-Opus-20B-v1 (Merge) |
| prithivMLmods_Codepy-Deepthink-3B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Codepy-Deepthink-3B | 73551f0560645b098ff8293e70ff633bfc72c125 | 17.367825 | creativeml-openrail-m | 8 | 3.213 | true | false | false | false | 0.605503 | 0.43272 | 43.271963 | 0.425945 | 18.640888 | 0.111782 | 11.178248 | 0.279362 | 3.914989 | 0.331021 | 3.977604 | 0.309009 | 23.223257 | false | false | 2024-12-26 | 2025-01-12 | 1 | prithivMLmods/Codepy-Deepthink-3B (Merge) |
| prithivMLmods_Deepthink-Reasoning-14B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/Deepthink-Reasoning-14B | 08fd00d4ac2bf07766c8bab7e73d17028487d23a | 34.795154 | apache-2.0 | 10 | 14.77 | true | false | false | false | 1.950391 | 0.542354 | 54.235429 | 0.633405 | 47.306257 | 0.244713 | 24.471299 | 0.366611 | 15.548098 | 0.473156 | 19.477865 | 0.529588 | 47.731974 | false | false | 2025-01-20 | 2025-01-22 | 1 | prithivMLmods/Deepthink-Reasoning-14B (Merge) |
| prithivMLmods_Deepthink-Reasoning-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/Deepthink-Reasoning-7B | 0ccaa3825ded55cf8cfa18f7db53d91848e3733b | 26.894145 | creativeml-openrail-m | 14 | 7.616 | true | false | false | false | 0.626998 | 0.484002 | 48.400245 | 0.550507 | 35.623731 | 0.200906 | 20.090634 | 0.299497 | 6.599553 | 0.443229 | 13.436979 | 0.434924 | 37.213726 | false | false | 2024-12-28 | 2025-01-09 | 1 | prithivMLmods/Deepthink-Reasoning-7B (Merge) |
| prithivMLmods_FastThink-0.5B-Tiny_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/FastThink-0.5B-Tiny | c07fd949ceba096d7c2e405bcfce99e269f7ca39 | 7.252605 | apache-2.0 | 7 | 0.494 | true | false | false | false | 0.537537 | 0.257989 | 25.79888 | 0.320558 | 5.01961 | 0.004532 | 0.453172 | 0.260906 | 1.454139 | 0.356635 | 3.579427 | 0.164894 | 7.210402 | false | false | 2025-01-20 | 2025-01-24 | 1 | prithivMLmods/FastThink-0.5B-Tiny (Merge) |
| prithivMLmods_GWQ-9B-Preview_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Gemma2ForCausalLM | prithivMLmods/GWQ-9B-Preview | 5a0e00ac0ff885f54ef32e607508895bae864006 | 29.915362 | gemma | 9 | 9.242 | true | false | false | false | 2.461162 | 0.506584 | 50.658364 | 0.580575 | 40.669723 | 0.212236 | 21.223565 | 0.339765 | 11.96868 | 0.495104 | 21.821354 | 0.398354 | 33.150488 | false | false | 2025-01-04 | 2025-01-08 | 0 | prithivMLmods/GWQ-9B-Preview |
| prithivMLmods_GWQ-9B-Preview2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Gemma2ForCausalLM | prithivMLmods/GWQ-9B-Preview2 | 42f5d4f7d19eb59c9408ff70cdbc30459ec1ad3d | 29.870954 | creativeml-openrail-m | 16 | 9.242 | true | false | false | false | 2.452824 | 0.520897 | 52.089678 | 0.579722 | 40.184861 | 0.226586 | 22.65861 | 0.326342 | 10.178971 | 0.48599 | 20.815365 | 0.399684 | 33.298242 | false | false | 2025-01-04 | 2025-01-08 | 1 | prithivMLmods/GWQ-9B-Preview2 (Merge) |
| prithivMLmods_GWQ2b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Gemma2ForCausalLM | prithivMLmods/GWQ2b | 1d2a808ec30008a2cba697b1bb742ab67efb71f0 | 16.404535 | gemma | 10 | 2.614 | true | false | false | false | 1.204289 | 0.411487 | 41.148708 | 0.414337 | 17.68035 | 0.061178 | 6.117825 | 0.282718 | 4.362416 | 0.431115 | 12.75599 | 0.247257 | 16.361924 | false | false | 2025-01-09 | 2025-01-12 | 1 | prithivMLmods/GWQ2b (Merge) |
| prithivMLmods_Llama-3.1-5B-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Llama-3.1-5B-Instruct | 310ab744cd88aecedc534abd373d2f66a0c82f19 | 3.955411 | llama3.1 | 7 | 5.413 | true | false | false | false | 0.499552 | 0.14066 | 14.066012 | 0.305107 | 3.109216 | 0 | 0 | 0.264262 | 1.901566 | 0.354 | 2.616667 | 0.118351 | 2.039007 | false | false | 2025-01-04 | 2025-01-12 | 0 | prithivMLmods/Llama-3.1-5B-Instruct |
| prithivMLmods_Llama-3.1-8B-Open-SFT_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Llama-3.1-8B-Open-SFT | e5d7fa281735f7fcc09fdb5810a2118789040d67 | 20.942999 | creativeml-openrail-m | 11 | 8.03 | true | false | false | false | 0.727794 | 0.412262 | 41.226169 | 0.496798 | 28.179928 | 0.115559 | 11.555891 | 0.309564 | 7.941834 | 0.390365 | 8.728906 | 0.352227 | 28.025266 | false | false | 2024-12-18 | 2025-01-12 | 1 | prithivMLmods/Llama-3.1-8B-Open-SFT (Merge) |
| prithivMLmods_Llama-3.2-3B-Math-Oct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Llama-3.2-3B-Math-Oct | 5d72ae9689eb8307a741c6e7a455e427a792cd15 | 17.416778 | llama3.2 | 7 | 3.213 | true | false | false | false | 0.595683 | 0.458523 | 45.852338 | 0.437184 | 19.94675 | 0.114048 | 11.404834 | 0.258389 | 1.118568 | 0.34699 | 4.940365 | 0.29114 | 21.23781 | false | false | 2025-01-22 | 2025-01-24 | 1 | prithivMLmods/Llama-3.2-3B-Math-Oct (Merge) |
| prithivMLmods_Llama-3.2-6B-AlgoCode_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Llama-3.2-6B-AlgoCode | e111d34ff9033fe36b4f1c283a17d017b4e4e5c6 | 9.250648 | llama3.2 | 7 | 6.339 | true | false | false | false | 0.77736 | 0.213576 | 21.357554 | 0.374774 | 11.602526 | 0.010574 | 1.057402 | 0.286913 | 4.9217 | 0.401344 | 7.701302 | 0.179771 | 8.863401 | false | false | 2025-01-10 | 2025-01-12 | 0 | prithivMLmods/Llama-3.2-6B-AlgoCode |
| prithivMLmods_Llama-8B-Distill-CoT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Llama-8B-Distill-CoT | 4c2d02c2cd92f4c371547201027202ac42d88a71 | 19.107331 | llama3.1 | 10 | 8.03 | true | false | false | false | 0.717502 | 0.334151 | 33.415116 | 0.429762 | 19.595123 | 0.30136 | 30.135952 | 0.28943 | 5.257271 | 0.371979 | 6.997396 | 0.273188 | 19.243129 | false | false | 2025-01-21 | 2025-01-22 | 1 | prithivMLmods/Llama-8B-Distill-CoT (Merge) |
| prithivMLmods_Llama-Deepsync-1B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Llama-Deepsync-1B | 03a9a38ffbb49f0f176a901a5fab3e444d6131fe | 10.118362 | creativeml-openrail-m | 9 | 1.236 | true | false | false | false | 0.377137 | 0.357007 | 35.700719 | 0.338563 | 7.763873 | 0.034743 | 3.47432 | 0.260067 | 1.342282 | 0.35651 | 4.230469 | 0.173787 | 8.198508 | false | false | 2024-12-29 | 2025-01-12 | 1 | prithivMLmods/Llama-Deepsync-1B (Merge) |
| prithivMLmods_Llama-Deepsync-3B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Llama-Deepsync-3B | 9f7c81f997f9a35797b511197e48a64ffb6d046f | 17.063214 | creativeml-openrail-m | 15 | 3.213 | true | false | false | false | 0.608077 | 0.430222 | 43.022181 | 0.429152 | 18.963664 | 0.111027 | 11.102719 | 0.271812 | 2.908277 | 0.332385 | 3.814844 | 0.303108 | 22.567598 | false | false | 2024-12-29 | 2025-01-12 | 1 | prithivMLmods/Llama-Deepsync-3B (Merge) |
| prithivMLmods_Llama-Express.1-Math_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Llama-Express.1-Math | 9c32d92f0ef3a4c4935992c9a5074d7a65ea91bc | 12.082506 | llama3.2 | 7 | 1.236 | true | false | false | true | 0.356045 | 0.508432 | 50.843207 | 0.336381 | 7.19902 | 0.050604 | 5.060423 | 0.263423 | 1.789709 | 0.314344 | 0.826302 | 0.160987 | 6.776374 | false | false | 2025-01-21 | 2025-01-25 | 1 | prithivMLmods/Llama-Express.1-Math (Merge) |
| prithivMLmods_LwQ-10B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/LwQ-10B-Instruct | 3db52014aba9ec7163c28af47aac1f07af8fe0f6 | 20.866756 | llama3.1 | 7 | 10.732 | true | false | false | false | 0.725175 | 0.393477 | 39.347709 | 0.512171 | 31.590273 | 0.033988 | 3.398792 | 0.312081 | 8.277405 | 0.454396 | 16.832813 | 0.331782 | 25.753546 | false | false | 2025-01-14 | 2025-01-19 | 1 | prithivMLmods/LwQ-10B-Instruct (Merge) |
| prithivMLmods_LwQ-Reasoner-10B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/LwQ-Reasoner-10B | fcd46007bd9f098004843dd79042a99543a22293 | 26.730028 | llama3.1 | 8 | 10.306 | true | false | false | false | 0.894598 | 0.294134 | 29.413401 | 0.586625 | 40.337248 | 0.342145 | 34.214502 | 0.346477 | 12.863535 | 0.407854 | 8.581771 | 0.414727 | 34.96971 | false | false | 2025-01-18 | 2025-01-19 | 1 | prithivMLmods/LwQ-Reasoner-10B (Merge) |
| prithivMLmods_Omni-Reasoner-Merged_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/Omni-Reasoner-Merged | 5c34ad1b2510c510025ac724a16bed7f5ae5f1c3 | 28.429225 | | 9 | 7.616 | false | false | false | false | 0.631416 | 0.459947 | 45.994738 | 0.550785 | 35.361777 | 0.284743 | 28.47432 | 0.303691 | 7.158837 | 0.461646 | 16.205729 | 0.43642 | 37.37995 | false | false | 2025-01-16 | 2025-01-17 | 1 | prithivMLmods/Omni-Reasoner-Merged (Merge) |
| prithivMLmods_Omni-Reasoner3-Merged_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Omni-Reasoner3-Merged | a8fbe5740e04a78661dedd16597fa4d5a135ad95 | 18.421149 | | 7 | 3.213 | false | false | false | false | 0.5883 | 0.49347 | 49.346955 | 0.438785 | 20.586522 | 0.108006 | 10.800604 | 0.264262 | 1.901566 | 0.352229 | 6.228646 | 0.294963 | 21.662603 | false | false | 2025-01-17 | 2025-01-17 | 1 | prithivMLmods/Omni-Reasoner3-Merged (Merge) |
| prithivMLmods_Phi-4-Empathetic_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Phi-4-Empathetic | 181a87cfc05f0ee538b14cf4a773ad3b816224fe | 28.158044 | mit | 12 | 14.66 | true | false | false | false | 0.897638 | 0.049659 | 4.965935 | 0.672682 | 52.838938 | 0.259063 | 25.906344 | 0.380034 | 17.337808 | 0.499135 | 22.72526 | 0.506566 | 45.17398 | false | false | 2025-01-10 | 2025-01-12 | 1 | prithivMLmods/Phi-4-Empathetic (Merge) |
| prithivMLmods_Phi-4-Math-IO_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Phi-4-Math-IO | 2e3f81b0c1613d33a4b0e216120fa3a3dd9206f8 | 25.804663 | mit | 8 | 14.66 | true | false | false | false | 0.966947 | 0.058977 | 5.897685 | 0.666826 | 52.093771 | 0.096677 | 9.667674 | 0.39849 | 19.798658 | 0.487292 | 20.644792 | 0.520529 | 46.725399 | false | false | 2025-01-10 | 2025-01-12 | 1 | prithivMLmods/Phi-4-Math-IO (Merge) |
| prithivMLmods_Phi-4-QwQ_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Phi-4-QwQ | f9d9cc11a7c9e56420b705ac97f06362321dd89a | 29.047166 | mit | 12 | 14.66 | true | false | false | false | 0.985764 | 0.055929 | 5.592938 | 0.669557 | 52.28685 | 0.324773 | 32.477341 | 0.39094 | 18.791946 | 0.465063 | 17.632813 | 0.52751 | 47.501108 | false | false | 2025-01-10 | 2025-01-12 | 1 | prithivMLmods/Phi-4-QwQ (Merge) |
| prithivMLmods_Phi-4-Super_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | prithivMLmods/Phi-4-Super | d0632dd9df3d6a8ae4f10f2185d38eeb61cab9d2 | 30.274078 | | 8 | 14.66 | false | false | false | false | 0.962645 | 0.048136 | 4.813561 | 0.672012 | 52.697295 | 0.342145 | 34.214502 | 0.394295 | 19.239374 | 0.504375 | 23.280208 | 0.526596 | 47.399527 | false | false | 2025-01-23 | 2025-01-24 | 1 | prithivMLmods/Phi-4-Super (Merge) |
| prithivMLmods_Phi-4-Super-1_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | prithivMLmods/Phi-4-Super-1 | 081e3442df878853ab8bd765430c961658ce5024 | 30.104386 | | 9 | 14.66 | false | false | false | false | 0.935598 | 0.041766 | 4.176585 | 0.672934 | 52.905831 | 0.344411 | 34.441088 | 0.393456 | 19.127517 | 0.50174 | 22.917448 | 0.523521 | 47.057846 | false | false | 2025-01-24 | 2025-01-24 | 1 | prithivMLmods/Phi-4-Super-1 (Merge) |
| prithivMLmods_Phi-4-Super-o1_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | prithivMLmods/Phi-4-Super-o1 | 081e3442df878853ab8bd765430c961658ce5024 | 30.104386 | | 9 | 14.66 | false | false | false | false | 0.964152 | 0.041766 | 4.176585 | 0.672934 | 52.905831 | 0.344411 | 34.441088 | 0.393456 | 19.127517 | 0.50174 | 22.917448 | 0.523521 | 47.057846 | false | false | 2025-01-24 | 2025-01-24 | 1 | prithivMLmods/Phi-4-Super-o1 (Merge) |
| prithivMLmods_Phi-4-o1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Phi-4-o1 | aa2a7571e9dbce0fefe98479fe04f298f2491b8c | 30.116173 | mit | 25 | 14.66 | true | false | false | false | 0.869083 | 0.028976 | 2.897645 | 0.668873 | 52.170862 | 0.39426 | 39.425982 | 0.38255 | 17.673378 | 0.497771 | 22.154687 | 0.51737 | 46.374483 | false | false | 2025-01-08 | 2025-01-09 | 1 | prithivMLmods/Phi-4-o1 (Merge) |
| prithivMLmods_Phi4-Super_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Phi4-Super | d27188b144a6ac8c2d70f761e8afd8b05c74fd16 | 30.274078 | | 8 | 14.66 | false | false | false | false | 0.918532 | 0.048136 | 4.813561 | 0.672012 | 52.697295 | 0.342145 | 34.214502 | 0.394295 | 19.239374 | 0.504375 | 23.280208 | 0.526596 | 47.399527 | false | false | 2025-01-23 | 2025-01-23 | 1 | prithivMLmods/Phi4-Super (Merge) |
| prithivMLmods_QwQ-LCoT-14B-Conversational_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/QwQ-LCoT-14B-Conversational | 60ef4aa0a2660f9b6f28a3de773729969a1df9ae | 33.165443 | apache-2.0 | 9 | 14.77 | true | false | false | false | 1.954452 | 0.404743 | 40.474275 | 0.623983 | 45.62626 | 0.314199 | 31.41994 | 0.349832 | 13.310962 | 0.484719 | 20.623177 | 0.527842 | 47.538047 | false | false | 2025-01-18 | 2025-01-19 | 1 | prithivMLmods/QwQ-LCoT-14B-Conversational (Merge) |
| prithivMLmods_QwQ-LCoT-3B-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/QwQ-LCoT-3B-Instruct | 1f47223ac1c6069c3e53b75a45ad496f0fb9a124 | 21.012747 | creativeml-openrail-m | 10 | 3.086 | true | false | false | false | 0.765394 | 0.435442 | 43.54424 | 0.476298 | 26.621188 | 0.101964 | 10.196375 | 0.281879 | 4.250559 | 0.435792 | 12.773958 | 0.358211 | 28.69016 | false | false | 2024-12-12 | 2025-01-12 | 1 | prithivMLmods/QwQ-LCoT-3B-Instruct (Merge) |
| prithivMLmods_QwQ-LCoT-7B-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/QwQ-LCoT-7B-Instruct | 06f0076fcf5cb72222513e6c76bd33e1ebaa97b7 | 28.132178 | creativeml-openrail-m | 23 | 7.616 | true | false | false | false | 0.650305 | 0.49869 | 49.869014 | 0.546647 | 34.780933 | 0.207704 | 20.770393 | 0.302013 | 6.935123 | 0.480188 | 19.390104 | 0.433428 | 37.047503 | false | false | 2024-12-14 | 2025-01-07 | 1 | prithivMLmods/QwQ-LCoT-7B-Instruct (Merge) |
| prithivMLmods_QwQ-LCoT1-Merged_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/QwQ-LCoT1-Merged | d85a4f359bc568afb7b1a2a6e6503934bb352ab6 | 28.38084 | | 8 | 7.616 | false | false | false | false | 0.65538 | 0.475135 | 47.513486 | 0.548096 | 35.166254 | 0.249245 | 24.924471 | 0.307047 | 7.606264 | 0.469615 | 17.76849 | 0.435755 | 37.306073 | false | false | 2025-01-21 | 2025-01-22 | 1 | prithivMLmods/QwQ-LCoT1-Merged (Merge) |
| prithivMLmods_QwQ-LCoT2-7B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/QwQ-LCoT2-7B-Instruct | f2ea462f6d3f6cf104313b1329909cb15a388841 | 28.574182 | apache-2.0 | 10 | 7.616 | true | false | false | false | 1.365323 | 0.556118 | 55.611777 | 0.542486 | 34.366737 | 0.222054 | 22.205438 | 0.297819 | 6.375839 | 0.456438 | 15.754688 | 0.434176 | 37.130615 | false | false | 2025-01-20 | 2025-01-24 | 1 | prithivMLmods/QwQ-LCoT2-7B-Instruct (Merge) |
| prithivMLmods_QwQ-MathOct-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/QwQ-MathOct-7B | d2ff038987cc16a7b317034929dd9ab35265e308 | 27.918705 | apache-2.0 | 8 | 7.616 | true | false | false | false | 0.664523 | 0.46844 | 46.84404 | 0.548551 | 35.254667 | 0.260574 | 26.057402 | 0.302852 | 7.04698 | 0.460063 | 15.307812 | 0.433012 | 37.00133 | false | false | 2025-01-11 | 2025-01-19 | 1 | prithivMLmods/QwQ-MathOct-7B (Merge) |
| prithivMLmods_QwQ-R1-Distill-1.5B-CoT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/QwQ-R1-Distill-1.5B-CoT | cd1a92a4fffbc923013e2a77d9d7f2c8b2a738ae | 12.458841 | apache-2.0 | 7 | 1.777 | true | false | false | false | 0.587416 | 0.219396 | 21.939565 | 0.366621 | 11.476456 | 0.246224 | 24.622356 | 0.286074 | 4.809843 | 0.343396 | 1.757812 | 0.191323 | 10.147015 | false | false | 2025-01-21 | 2025-01-22 | 1 | prithivMLmods/QwQ-R1-Distill-1.5B-CoT (Merge) |
| prithivMLmods_QwQ-R1-Distill-7B-CoT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | prithivMLmods/QwQ-R1-Distill-7B-CoT | db0c74ffe611d00eb0a5df4413f3eced7fdacb78 | 18.919333 | apache-2.0 | 8 | 7.616 | true | false | false | false | 0.67112 | 0.350038 | 35.00379 | 0.438789 | 20.953831 | 0.271903 | 27.190332 | 0.293624 | 5.816555 | 0.377906 | 4.504948 | 0.280419 | 20.046543 | false | false | 2025-01-21 | 2025-01-22 | 1 | prithivMLmods/QwQ-R1-Distill-7B-CoT (Merge) |
| prithivMLmods_SmolLM2-CoT-360M_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/SmolLM2-CoT-360M | 474240d772fbb3b8da6f8eb47f32dd34c6b78baf | 5.724162 | apache-2.0 | 14 | 0.362 | true | false | false | false | 0.387752 | 0.221569 | 22.156877 | 0.31353 | 4.801205 | 0.006798 | 0.679758 | 0.236577 | 0 | 0.379396 | 5.757813 | 0.108544 | 0.94932 | false | false | 2025-01-05 | 2025-01-07 | 1 | prithivMLmods/SmolLM2-CoT-360M (Merge) |
| prithivMLmods_Taurus-Opus-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Taurus-Opus-7B | 4b9918fb7ed2a92bdb1beae11deb337a3745d053 | 26.064884 | apache-2.0 | 7 | 7.456 | true | false | false | false | 0.68181 | 0.422328 | 42.232831 | 0.536736 | 34.234016 | 0.227341 | 22.734139 | 0.326342 | 10.178971 | 0.439885 | 14.21901 | 0.395113 | 32.790337 | false | false | 2025-01-25 | 2025-01-27 | 1 | prithivMLmods/Taurus-Opus-7B (Merge) |
| prithivMLmods_Triangulum-10B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Triangulum-10B | d3776fbe6bfc884f1380fe128223759d76214049 | 28.099256 | llama3.1 | 10 | 10.306 | true | false | false | false | 0.859414 | 0.322935 | 32.293537 | 0.596802 | 42.240747 | 0.3429 | 34.29003 | 0.354027 | 13.870246 | 0.41725 | 10.589583 | 0.417803 | 35.311392 | false | false | 2024-12-30 | 2025-01-07 | 1 | prithivMLmods/Triangulum-10B (Merge) |
| prithivMLmods_Triangulum-5B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Triangulum-5B | 55e161fc171b17b3e6c15aef9d5318a51bdb48fb | 3.860735 | creativeml-openrail-m | 8 | 5.413 | true | false | false | false | 0.492429 | 0.128321 | 12.832063 | 0.312412 | 4.293502 | 0.001511 | 0.151057 | 0.255034 | 0.671141 | 0.344542 | 2.734375 | 0.12234 | 2.48227 | false | false | 2024-12-31 | 2025-01-07 | 1 | prithivMLmods/Triangulum-5B (Merge) |
| prithivMLmods_Tulu-MathLingo-8B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | prithivMLmods/Tulu-MathLingo-8B | 0fb551a24dfe1a576e2c5118a7581588d339a2e7 | 21.596382 | creativeml-openrail-m | 9 | 8.03 | true | false | false | false | 0.841574 | 0.55894 | 55.894028 | 0.465881 | 24.703351 | 0.132931 | 13.293051 | 0.290268 | 5.369128 | 0.386427 | 7.603385 | 0.304438 | 22.715352 | false | false | 2024-12-23 | 2025-01-12 | 1 | prithivMLmods/Tulu-MathLingo-8B (Merge) |
| pszemraj_Llama-3-6.3b-v0.1_bfloat16 | bfloat16 | 🟩 continuously pretrained | Original | LlamaForCausalLM | pszemraj/Llama-3-6.3b-v0.1 | 7000b39346162f95f19aa4ca3975242db61902d7 | 10.333954 | llama3 | 6 | 6.3 | true | false | false | false | 0.814463 | 0.10439 | 10.438969 | 0.419681 | 18.679996 | 0.018127 | 1.812689 | 0.283557 | 4.474273 | 0.390833 | 6.154167 | 0.283993 | 20.443632 | false | false | 2024-05-17 | 2024-06-26 | 1 | meta-llama/Meta-Llama-3-8B |
| pszemraj_Mistral-v0.3-6B_bfloat16 | bfloat16 | 🟩 continuously pretrained | Original | MistralForCausalLM | pszemraj/Mistral-v0.3-6B | ae11a699012b83996361f04808f4d45debf3b01c | 10.046851 | apache-2.0 | 1 | 5.939 | true | false | false | false | 0.530539 | 0.245374 | 24.53745 | 0.377405 | 13.515091 | 0.009063 | 0.906344 | 0.265101 | 2.013423 | 0.390771 | 6.613021 | 0.214262 | 12.695774 | false | false | 2024-05-25 | 2024-06-26 | 2 | pszemraj/Mistral-7B-v0.3-prune6 (Merge) |
| qingy2019_LLaMa_3.2_3B_Catalysts_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | qingy2019/LLaMa_3.2_3B_Catalysts | 3f4a318114beb37f32a2c143cbd68b6d15d18164 | 19.628816 | apache-2.0 | 1 | 3 | true | false | false | false | 0.649834 | 0.49924 | 49.923979 | 0.446813 | 21.345401 | 0.111027 | 11.102719 | 0.288591 | 5.145414 | 0.378771 | 7.946354 | 0.300781 | 22.309028 | false | false | 2024-10-19 | 2024-10-29 | 2 | meta-llama/Llama-3.2-3B-Instruct |
| qingy2019_OpenMath2-Llama3.1-8B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | qingy2019/OpenMath2-Llama3.1-8B | 38412f988f7688d884c9249b2a4e5cc76f98c1c6 | 8.987818 | | 0 | 8 | false | false | false | false | 0.692806 | 0.233059 | 23.305939 | 0.409552 | 16.29437 | 0.041541 | 4.154079 | 0.265101 | 2.013423 | 0.343552 | 2.010677 | 0.155336 | 6.148419 | false | false | | 2024-11-23 | 0 | Removed |
| qingy2019_Oracle-14B_float16 | float16 | 🤝 base merges and moerges | Original | MixtralForCausalLM | qingy2019/Oracle-14B | 0154031aa9306aa98da156a0f3c8e10d9f1377f6 | 13.34025 | | 0 | 13.668 | false | false | false | false | 1.393024 | 0.235832 | 23.583204 | 0.461158 | 23.18463 | 0.064199 | 6.41994 | 0.25755 | 1.006711 | 0.371667 | 10.491667 | 0.238198 | 15.355349 | false | false | | 2024-11-23 | 0 | Removed |
| qingy2019_Oracle-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | MixtralForCausalLM | qingy2019/Oracle-14B | 0154031aa9306aa98da156a0f3c8e10d9f1377f6 | 13.479724 | | 0 | 13.668 | false | false | false | false | 1.368887 | 0.240079 | 24.007855 | 0.46223 | 23.301946 | 0.06571 | 6.570997 | 0.260906 | 1.454139 | 0.370333 | 10.225 | 0.237866 | 15.31841 | false | false | | 2024-11-24 | 0 | Removed |
| qingy2019_Qwen2.5-Math-14B-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | qingy2019/Qwen2.5-Math-14B-Instruct | 025d9637208b862c7b10b7590969fe6870ce01a0 | 36.70629 | apache-2.0 | 1 | 14 | true | false | false | false | 1.932827 | 0.606626 | 60.662597 | 0.635007 | 47.017086 | 0.284743 | 28.47432 | 0.372483 | 16.331096 | 0.475729 | 19.632812 | 0.533078 | 48.119829 | false | false | 2024-12-01 | 2024-12-01 | 3 | Qwen/Qwen2.5-14B |
| qingy2019_Qwen2.5-Math-14B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | qingy2019/Qwen2.5-Math-14B-Instruct | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
qingy2019/Qwen2.5-Math-14B-Instruct
025d9637208b862c7b10b7590969fe6870ce01a0
36.380504
apache-2.0
1
14
true
false
false
false
1.971893
0.600531
60.053104
0.635649
47.065572
0.276435
27.643505
0.369128
15.883669
0.475667
19.425
0.53391
48.212175
false
false
2024-12-01
2024-12-01
3
Qwen/Qwen2.5-14B
qingy2019_Qwen2.5-Math-14B-Instruct-Alpha_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Math-14B-Instruct-Alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Math-14B-Instruct-Alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Math-14B-Instruct-Alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2019/Qwen2.5-Math-14B-Instruct-Alpha
e24aaa0779b576301bfb62b93789dea24ab10c88
35.456012
apache-2.0
2
14
true
false
false
false
1.893142
0.598083
59.808309
0.637508
47.750108
0.231118
23.111782
0.369966
15.995526
0.464938
17.950521
0.533078
48.119829
false
false
2024-12-03
2024-12-03
2
Qwen/Qwen2.5-14B
qingy2019_Qwen2.5-Math-14B-Instruct-Pro_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Math-14B-Instruct-Pro" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Math-14B-Instruct-Pro</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Math-14B-Instruct-Pro-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2019/Qwen2.5-Math-14B-Instruct-Pro
295a9ce370c2bfeabe13f76d52c92f57ff6d0308
19.70776
0
14.766
false
false
false
true
1.659569
0.192168
19.216789
0.531869
33.036904
0.251511
25.151057
0.311242
8.165548
0.374031
4.253906
0.355801
28.422355
false
false
2024-12-03
2024-12-03
1
qingy2019/Qwen2.5-Math-14B-Instruct-Pro (Merge)
qingy2019_Qwen2.5-Ultimate-14B-Instruct_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Ultimate-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Ultimate-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Ultimate-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2019/Qwen2.5-Ultimate-14B-Instruct
3eeba743112bed957ae6dc6a3f880355c8bedb66
29.289124
1
14.766
false
false
false
true
1.952089
0.393802
39.380178
0.584156
40.580601
0.280211
28.021148
0.356544
14.205817
0.4135
9.8875
0.492936
43.659501
false
false
2024-12-02
2024-12-02
1
qingy2019/Qwen2.5-Ultimate-14B-Instruct (Merge)
qingy2024_Eyas-17B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Eyas-17B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Eyas-17B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Eyas-17B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Eyas-17B-Instruct
afa6aa65deaef3eeb733e80f0fbffcf6d70a863f
32.252102
0
17.431
false
false
false
true
2.285761
0.657459
65.745888
0.608455
43.850066
0.228097
22.809668
0.314597
8.612975
0.452167
15.354167
0.434259
37.139849
false
false
2024-12-23
2024-12-23
1
qingy2024/Eyas-17B-Instruct (Merge)
qingy2024_Falcon3-2x10B-MoE-Instruct_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Falcon3-2x10B-MoE-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Falcon3-2x10B-MoE-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Falcon3-2x10B-MoE-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Falcon3-2x10B-MoE-Instruct
e226b1f0beb60ff1e3770a694af51572b6d95dc5
35.168629
apache-2.0
0
18.799
true
true
false
true
2.193287
0.784978
78.49783
0.618493
45.073853
0.257553
25.755287
0.330537
10.738255
0.428354
12.910937
0.44232
38.035609
true
false
2024-12-25
2024-12-25
1
qingy2024/Falcon3-2x10B-MoE-Instruct (Merge)
qingy2024_Fusion-14B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Fusion-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Fusion-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Fusion-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Fusion-14B-Instruct
2e15219659b919e04ad5b56bef259489cc264f09
37.64425
1
14
false
false
false
true
1.623681
0.725977
72.597707
0.639593
48.579836
0.309668
30.966767
0.354866
13.982103
0.440042
14.805208
0.504405
44.93388
false
false
2024-12-05
2024-12-05
1
qingy2024/Fusion-14B-Instruct (Merge)
qingy2024_Fusion2-14B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Fusion2-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Fusion2-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Fusion2-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Fusion2-14B-Instruct
df00288ce3d37ef518189c19e7973e71b47ef214
35.182268
1
14.766
false
false
false
true
1.668666
0.606401
60.640102
0.611852
44.767044
0.308157
30.81571
0.344799
12.639821
0.463385
17.223177
0.50507
45.007757
false
false
2024-12-05
2024-12-06
1
qingy2024/Fusion2-14B-Instruct (Merge)
qingy2024_Fusion4-14B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Fusion4-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Fusion4-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Fusion4-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Fusion4-14B-Instruct
3f3c7178006857d7fdf942ab7e86bd2b0d7b624d
38.733953
0
14.77
false
false
false
true
1.822831
0.764895
76.489492
0.654252
50.695856
0.339124
33.912387
0.330537
10.738255
0.432573
13.971615
0.519365
46.596114
false
false
2024-12-25
2024-12-25
1
qingy2024/Fusion4-14B-Instruct (Merge)
qingy2024_OwO-14B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/OwO-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/OwO-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__OwO-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/OwO-14B-Instruct
0c64ce33086d285d9374f0fb9360d52d0eb1ff92
27.423406
apache-2.0
0
14.77
true
false
false
false
2.814761
0.138312
13.83119
0.616481
44.948452
0.304381
30.438066
0.364094
15.212528
0.440687
13.652604
0.518118
46.457595
false
false
2024-12-27
2024-12-30
2
Qwen/Qwen2.5-14B
qingy2024_QwQ-14B-Math-v0.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/QwQ-14B-Math-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/QwQ-14B-Math-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__QwQ-14B-Math-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/QwQ-14B-Math-v0.2
308f732e0f2c1ac9e416e9c1e0523c0198ac658c
24.08899
apache-2.0
18
14.77
true
false
false
true
3.411171
0.339097
33.909693
0.573098
39.099214
0.190332
19.033233
0.262584
1.677852
0.402094
8.595052
0.47997
42.218898
false
false
2024-12-20
2024-12-23
2
Qwen/Qwen2.5-14B
qingy2024_Qwarkstar-4B_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Qwarkstar-4B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Qwarkstar-4B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwarkstar-4B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Qwarkstar-4B
c3dd554ec8f344e31b91b0532864388d6151700a
14.192507
0
4.473
false
false
false
false
1.117328
0.199412
19.9412
0.401491
16.574205
0.087613
8.761329
0.324664
9.955257
0.442833
14.0875
0.24252
15.83555
false
false
2025-01-05
2025-01-10
1
qingy2024/Qwarkstar-4B (Merge)
qingy2024_Qwarkstar-4B-Instruct-Preview_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Qwarkstar-4B-Instruct-Preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Qwarkstar-4B-Instruct-Preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwarkstar-4B-Instruct-Preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Qwarkstar-4B-Instruct-Preview
cd93b138d949e75eed3c4dba1f4dbdfe92ce255c
18.256192
apache-2.0
1
4.473
true
false
false
true
0.967205
0.532437
53.243727
0.435844
20.234017
0.09139
9.138973
0.280201
4.026846
0.389594
6.199219
0.250249
16.694371
false
false
2025-01-10
2025-01-17
1
qingy2024/Qwarkstar-4B-Instruct-Preview (Merge)
qingy2024_Qwen2.5-4B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Qwen2.5-4B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Qwen2.5-4B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwen2.5-4B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Qwen2.5-4B
e2736ed3972e1a0b2c1d6357acec2c21369827e1
13.973242
0
4.168
false
false
false
false
0.959765
0.215848
21.584839
0.426938
19.977752
0.033233
3.323263
0.291107
5.480984
0.461031
16.528906
0.252493
16.943706
false
false
2025-01-03
2025-01-16
1
qingy2024/Qwen2.5-4B (Merge)
qingy2024_Qwen2.5-Math-14B-Instruct-Alpha_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Qwen2.5-Math-14B-Instruct-Alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Qwen2.5-Math-14B-Instruct-Alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwen2.5-Math-14B-Instruct-Alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Qwen2.5-Math-14B-Instruct-Alpha
c82727eb404d3d55450759301b80f838e4d3e1fc
32.215395
apache-2.0
2
14.77
true
false
false
true
1.569347
0.77044
77.044021
0.646486
50.179503
0.000755
0.075529
0.348993
13.199105
0.402094
8.728385
0.496592
44.065824
false
false
2024-12-03
2024-12-10
2
Qwen/Qwen2.5-14B
qingy2024_Qwen2.5-Math-14B-Instruct-Preview_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Qwen2.5-Math-14B-Instruct-Preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Qwen2.5-Math-14B-Instruct-Preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwen2.5-Math-14B-Instruct-Preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Qwen2.5-Math-14B-Instruct-Preview
7b9e9b94d69f0de9627f728e9328fb394f7fea14
31.987593
apache-2.0
1
14.77
true
false
false
true
1.61898
0.78258
78.258022
0.629394
47.050808
0
0
0.340604
12.080537
0.411458
10.165625
0.499335
44.370567
false
false
2024-12-01
2024-12-10
3
Qwen/Qwen2.5-14B
qingy2024_Qwen2.6-14B-Instruct_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Qwen2.6-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Qwen2.6-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwen2.6-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Qwen2.6-14B-Instruct
c21acf3c074e9522c5d0559ccc4ed715c48b8eff
35.624979
1
14.766
false
false
false
false
1.789286
0.581097
58.109704
0.639414
48.047948
0.267372
26.73716
0.379195
17.225951
0.456938
16.017188
0.528507
47.611924
false
false
2024-12-04
2024-12-04
1
qingy2024/Qwen2.6-14B-Instruct (Merge)
qingy2024_Qwen2.6-Math-14B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Qwen2.6-Math-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Qwen2.6-Math-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Qwen2.6-Math-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Qwen2.6-Math-14B-Instruct
45bb3f302922fbf185694bba2748a32ca3313a5e
28.046404
apache-2.0
1
14
true
false
false
false
1.556072
0.386232
38.623186
0.632444
47.022117
0
0
0.369966
15.995526
0.475854
19.515104
0.524102
47.122488
false
false
2024-12-04
2024-12-04
3
Qwen/Qwen2.5-14B
qq8933_OpenLongCoT-Base-Gemma2-2B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/qq8933/OpenLongCoT-Base-Gemma2-2B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qq8933/OpenLongCoT-Base-Gemma2-2B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qq8933__OpenLongCoT-Base-Gemma2-2B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qq8933/OpenLongCoT-Base-Gemma2-2B
39e5bc941f107ac28142c802aecfd257cc47c1bb
5.08291
other
8
3.204
true
false
false
true
1.658487
0.196514
19.651414
0.310636
3.546298
0
0
0.262584
1.677852
0.32225
2.114583
0.131566
3.507314
false
false
2024-10-28
2024-11-12
2
google/gemma-2-2b
raphgg_test-2.5-72B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/raphgg/test-2.5-72B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">raphgg/test-2.5-72B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/raphgg__test-2.5-72B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
raphgg/test-2.5-72B
0f34d627ccd451c5bd74f495bcdb8b18787d6f3b
45.027894
apache-2.0
0
72.706
true
false
false
true
22.432353
0.843705
84.37047
0.72661
62.154127
0.308157
30.81571
0.389262
18.568233
0.481188
20.515104
0.583693
53.74372
false
false
2023-07-27
2024-12-27
0
raphgg/test-2.5-72B
rasyosef_Mistral-NeMo-Minitron-8B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/rasyosef/Mistral-NeMo-Minitron-8B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">rasyosef/Mistral-NeMo-Minitron-8B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/rasyosef__Mistral-NeMo-Minitron-8B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
rasyosef/Mistral-NeMo-Minitron-8B-Chat
cede47eac8a4e65aa27567d3f087c28185b537d9
17.230946
other
9
8.414
true
false
false
true
1.476398
0.445184
44.518433
0.475944
26.036695
0.008308
0.830816
0.276007
3.467562
0.430427
12.936719
0.240359
15.595449
false
false
2024-08-26
2024-08-26
1
nvidia/Mistral-NeMo-Minitron-8B-Base
rasyosef_Phi-1_5-Instruct-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/rasyosef/Phi-1_5-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">rasyosef/Phi-1_5-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/rasyosef__Phi-1_5-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
rasyosef/Phi-1_5-Instruct-v0.1
f4c405ee4bff5dc1a69383f3fe682342c9c87c77
6.638162
mit
1
1.415
true
false
false
true
0.295022
0.240228
24.022815
0.31179
4.820244
0
0
0.260067
1.342282
0.342156
3.402865
0.156167
6.240765
false
false
2024-07-24
2024-07-25
1
microsoft/phi-1_5
rasyosef_phi-2-instruct-apo_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/rasyosef/phi-2-instruct-apo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">rasyosef/phi-2-instruct-apo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/rasyosef__phi-2-instruct-apo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
rasyosef/phi-2-instruct-apo
2d3722d6db77a8c844a50dd32ddc4278fdc89e1f
12.043528
mit
0
2.775
true
false
false
true
0.495065
0.314592
31.459195
0.44451
21.672438
0
0
0.270134
2.684564
0.334219
3.610677
0.215509
12.834294
false
false
2024-09-15
2024-09-17
1
microsoft/phi-2
rasyosef_phi-2-instruct-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/rasyosef/phi-2-instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">rasyosef/phi-2-instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/rasyosef__phi-2-instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
rasyosef/phi-2-instruct-v0.1
29aeb3ccf7c79e0169a038fbd0deaf9772a9fefd
14.218631
mit
2
2.775
true
false
false
true
0.492726
0.368148
36.814763
0.472612
26.358802
0
0
0.274329
3.243848
0.352354
5.044271
0.224651
13.850103
false
false
2024-08-09
2024-08-10
1
microsoft/phi-2
realtreetune_rho-1b-sft-MATH_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/realtreetune/rho-1b-sft-MATH" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">realtreetune/rho-1b-sft-MATH</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/realtreetune__rho-1b-sft-MATH-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
realtreetune/rho-1b-sft-MATH
b5f93df6af679a860caac9a9598e0f70c326b4fb
5.355177
0
1.1
false
false
false
false
0.278134
0.212102
21.210167
0.314415
4.197623
0.021903
2.190332
0.252517
0.33557
0.345844
2.897135
0.111702
1.300236
false
false
2024-06-06
2024-10-05
1
realtreetune/rho-1b-sft-MATH (Merge)
recoilme_Gemma-2-Ataraxy-Gemmasutra-9B-slerp_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/recoilme__Gemma-2-Ataraxy-Gemmasutra-9B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp
9048af8616bc62b6efab2bc1bc77ba53c5dfed79
29.873992
apache-2.0
4
10.159
true
false
false
true
2.114373
0.764895
76.489492
0.597439
42.25121
0.017372
1.73716
0.330537
10.738255
0.424479
12.393229
0.420711
35.634604
false
false
2024-09-11
2024-09-12
0
recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp
recoilme_Gemma-2-Ataraxy-Gemmasutra-9B-slerp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/recoilme__Gemma-2-Ataraxy-Gemmasutra-9B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp
5a4f7299d9f8ea5faad2b1edc68b7bf634dac40b
23.205618
apache-2.0
4
10.159
true
false
false
false
2.969828
0.285365
28.536505
0.598393
42.703798
0.058157
5.81571
0.329698
10.626398
0.460656
16.415365
0.416223
35.135934
false
false
2024-09-11
2024-09-27
0
recoilme/Gemma-2-Ataraxy-Gemmasutra-9B-slerp
recoilme_recoilme-gemma-2-9B-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">recoilme/recoilme-gemma-2-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
recoilme/recoilme-gemma-2-9B-v0.1
6dc0997046db4e9932f87d338ecdc2a4158abbda
29.602746
0
10.159
false
false
false
true
1.924809
0.751506
75.1506
0.599531
42.321861
0.016616
1.661631
0.338926
11.856823
0.419146
11.526563
0.415891
35.098995
false
false
2024-09-18
0
Removed
recoilme_recoilme-gemma-2-9B-v0.2_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">recoilme/recoilme-gemma-2-9B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
recoilme/recoilme-gemma-2-9B-v0.2
483116e575fb3a56de25243b14d715c58fe127bc
30.048864
cc-by-nc-4.0
1
10.159
true
false
false
true
1.914086
0.759175
75.917455
0.602596
43.027969
0.05287
5.287009
0.328859
10.514541
0.409875
10.401042
0.416307
35.145168
false
false
2024-09-18
2024-09-18
0
recoilme/recoilme-gemma-2-9B-v0.2
recoilme_recoilme-gemma-2-9B-v0.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">recoilme/recoilme-gemma-2-9B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
recoilme/recoilme-gemma-2-9B-v0.2
483116e575fb3a56de25243b14d715c58fe127bc
23.674735
cc-by-nc-4.0
1
10.159
true
false
false
false
2.946784
0.274699
27.469891
0.603083
43.560581
0.077795
7.779456
0.330537
10.738255
0.468594
17.807552
0.412234
34.692671
false
false
2024-09-18
2024-09-27
0
recoilme/recoilme-gemma-2-9B-v0.2
recoilme_recoilme-gemma-2-9B-v0.3_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">recoilme/recoilme-gemma-2-9B-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
recoilme/recoilme-gemma-2-9B-v0.3
772cab46d9d22cbcc3c574d193021803ce5c444c
30.207472
cc-by-nc-4.0
3
10.159
true
false
false
true
1.876637
0.743937
74.39372
0.599253
42.026279
0.087613
8.761329
0.323826
9.8434
0.420385
12.08151
0.407247
34.138593
false
false
2024-09-18
2024-09-18
0
recoilme/recoilme-gemma-2-9B-v0.3
recoilme_recoilme-gemma-2-9B-v0.3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/recoilme/recoilme-gemma-2-9B-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">recoilme/recoilme-gemma-2-9B-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/recoilme__recoilme-gemma-2-9B-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
recoilme/recoilme-gemma-2-9B-v0.3
76c8fb761660e6eb237c91bb6e6761ee36266bba
30.111638
cc-by-nc-4.0
3
10.159
true
false
false
false
2.55535
0.576076
57.607592
0.601983
43.326868
0.172961
17.296073
0.337248
11.63311
0.463229
17.036979
0.403923
33.769208
false
false
2024-09-18
2024-09-27
0
recoilme/recoilme-gemma-2-9B-v0.3