4,306 rows
| Column | Dtype | Range / distinct values |
| --- | --- | --- |
| eval_name | string | length 12–111 |
| Precision | string | 3 values |
| Type | string | 7 values |
| T | string | 7 values |
| Weight type | string | 2 values |
| Architecture | string | 63 values |
| Model | string | length 355–689 |
| fullname | string | length 4–102 |
| Model sha | string | length 0–40 |
| Average ⬆️ | float64 | 0.74–52.1 |
| Hub License | string | 27 values |
| Hub ❤️ | int64 | 0–6.06k |
| #Params (B) | float64 | -1 to 141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04–187 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.22–0.83 |
| BBH | float64 | 0.25–76.7 |
| MATH Lvl 5 Raw | float64 | 0–0.71 |
| MATH Lvl 5 | float64 | 0–71.5 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.5 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 515 values |
| Submission Date | string | 253 values |
| Generation | int64 | 0–10 |
| Base Model | string | length 4–102 |
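Rows following this schema can be filtered and ranked programmatically. A minimal sketch with plain Python dicts keyed by a few of the schema's column names; the model names and score values below are illustrative placeholders, not rows from the leaderboard:

```python
# Each leaderboard row modeled as a dict keyed by (a subset of) the
# schema's columns. Sample values are illustrative only.
rows = [
    {"fullname": "example/model-a", "Precision": "float16",
     "#Params (B)": 7.0, "Average": 25.3, "MoE": False},
    {"fullname": "example/model-b", "Precision": "bfloat16",
     "#Params (B)": 0.135, "Average": 4.1, "MoE": False},
    {"fullname": "example/model-c", "Precision": "bfloat16",
     "#Params (B)": 141.0, "Average": 16.8, "MoE": True},
]

# Rank dense (non-MoE) models by the leaderboard average, best first.
ranked = sorted((r for r in rows if not r["MoE"]),
                key=lambda r: r["Average"], reverse=True)
print([r["fullname"] for r in ranked])  # → ['example/model-a', 'example/model-b']
```

The same pattern extends to any schema column, e.g. filtering on `"Flagged"` or `"#Params (B)"` before ranking.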
Corianas_llama-3-reactor_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
Corianas/llama-3-reactor (model: https://huggingface.co/Corianas/llama-3-reactor; details: https://huggingface.co/datasets/open-llm-leaderboard/Corianas__llama-3-reactor-details)
Corianas/llama-3-reactor
bef2eac42fd89baa0064badbc9c7958ad9ccbed3
13.99547
apache-2.0
1
-1
true
false
false
false
1.64233
0.230012
23.001192
0.445715
21.88856
0.046828
4.682779
0.297819
6.375839
0.397719
8.014844
0.280086
20.009604
false
false
2024-07-20
2024-07-23
0
Corianas/llama-3-reactor
cognitivecomputations_dolphin-2.9.2-Phi-3-Medium_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
cognitivecomputations/dolphin-2.9.2-Phi-3-Medium (model: https://huggingface.co/cognitivecomputations/dolphin-2.9.2-Phi-3-Medium; details: https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.2-Phi-3-Medium-details)
cognitivecomputations/dolphin-2.9.2-Phi-3-Medium
0470c5b912b51fa6e27d87a8ea7feafacd8cb101
28.614516
mit
21
-1
true
false
false
true
1.680964
0.424776
42.477626
0.645674
49.72194
0.182779
18.277946
0.327181
10.290828
0.419052
11.414844
0.455535
39.503915
false
true
2024-05-31
2024-08-05
1
cognitivecomputations/dolphin-2.9.2-Phi-3-Medium (Merge)
google_umt5-base_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
UMT5ForConditionalGeneration
google/umt5-base (model: https://huggingface.co/google/umt5-base; details: https://huggingface.co/datasets/open-llm-leaderboard/google__umt5-base-details)
google/umt5-base
0de9394d54f8975e71838d309de1cb496c894ab9
3.516575
apache-2.0
13
-1
true
false
false
false
1.336092
0.174632
17.46322
0.278773
0.813553
0.004532
0.453172
0.254195
0.559284
0.338219
0.94401
0.107796
0.866209
false
true
2023-07-02
2024-09-06
0
google/umt5-base
DeepMount00_Llama-3.1-8b-Ita_bfloat16
bfloat16
❓ other
Original
Unknown
DeepMount00/Llama-3.1-8b-Ita (model: https://huggingface.co/DeepMount00/Llama-3.1-8b-Ita; details: https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Llama-3.1-8b-Ita-details)
DeepMount00/Llama-3.1-8b-Ita
5ede1e388b6b15bc06acd364a8f805fe9ed16db9
26.265732
6
0
false
false
false
false
0.906247
0.536484
53.648431
0.517
31.333639
0.170695
17.069486
0.306208
7.494407
0.448719
15.15651
0.396027
32.891918
false
false
2024-08-13
2
meta-llama/Meta-Llama-3.1-8B
Quazim0t0_ODB-14B-sce_bfloat16
bfloat16
❓ other
Original
Unknown
Quazim0t0/ODB-14B-sce (model: https://huggingface.co/Quazim0t0/ODB-14B-sce; details: https://huggingface.co/datasets/open-llm-leaderboard/Quazim0t0__ODB-14B-sce-details)
Quazim0t0/ODB-14B-sce
eda0a8fa1470baabc824329b40cf9d023f1fe3b1
26.920457
0
0
false
false
false
false
2.983368
0.292236
29.223571
0.655892
50.699504
0.254532
25.453172
0.26594
2.12528
0.392885
7.277344
0.520695
46.743868
false
false
2025-02-06
0
Quazim0t0/ODB-14B-sce
allenai_OLMo-1.7-7B-hf_float16
float16
❓ other
Original
Unknown
allenai/OLMo-1.7-7B-hf (model: https://huggingface.co/allenai/OLMo-1.7-7B-hf; details: https://huggingface.co/datasets/open-llm-leaderboard/allenai__OLMo-1.7-7B-hf-details)
allenai/OLMo-1.7-7B-hf
a2a514275cb69a5f9b3dd51e0a4e92df88a12dfb
3.800232
apache-2.0
12
0
true
false
false
false
0.654293
0.156897
15.689703
0.30137
2.770316
0.002266
0.226586
0.255034
0.671141
0.34749
2.069531
0.112367
1.374113
false
true
2024-04-17
0
allenai/OLMo-1.7-7B-hf
databricks_dbrx-base_float16
float16
❓ other
Original
Unknown
databricks/dbrx-base (model: https://huggingface.co/databricks/dbrx-base; details: https://huggingface.co/datasets/open-llm-leaderboard/databricks__dbrx-base-details)
databricks/dbrx-base
d7d18d833146403dd74c2620b8434639ae123d6e
16.359432
other
557
0
true
true
false
false
10.45341
0.082147
8.214724
0.519583
32.608538
0.1
10
0.326667
10.222222
0.406667
9.333333
0.35
27.777778
false
true
2024-03-26
0
databricks/dbrx-base
internlm_internlm2-7b_float16
float16
❓ other
Original
Unknown
internlm/internlm2-7b (model: https://huggingface.co/internlm/internlm2-7b; details: https://huggingface.co/datasets/open-llm-leaderboard/internlm__internlm2-7b-details)
internlm/internlm2-7b
530fc706c606b1af1145c662877a7d99ad79d623
17.923366
other
41
0
true
false
false
false
1.039542
0.228037
22.803681
0.5825
40.276198
0.085714
8.571429
0.336667
11.555556
0.44
14.333333
0.19
10
false
true
2024-01-12
0
internlm/internlm2-7b
mistral-community_Mixtral-8x22B-v0.1_float16
float16
❓ other
Original
Unknown
mistral-community/Mixtral-8x22B-v0.1 (model: https://huggingface.co/mistral-community/Mixtral-8x22B-v0.1; details: https://huggingface.co/datasets/open-llm-leaderboard/mistral-community__Mixtral-8x22B-v0.1-details)
mistral-community/Mixtral-8x22B-v0.1
ab1e8c1950cf359e2a25de9b274ab836adb6dbab
16.82739
apache-2.0
674
0
true
true
false
false
15.173202
0.316656
31.665644
0.38
12.647903
0.154286
15.428571
0.33
10.666667
0.353333
1.666667
0.36
28.888889
false
true
2024-04-10
0
mistral-community/Mixtral-8x22B-v0.1
cpayne1303_smallcp2024_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
cpayne1303/smallcp2024 (model: https://huggingface.co/cpayne1303/smallcp2024; details: https://huggingface.co/datasets/open-llm-leaderboard/cpayne1303__smallcp2024-details)
cpayne1303/smallcp2024
ef995127242553e4126190e7f70f927504834360
3.543848
apache-2.0
0
0.002
true
false
false
false
0.094616
0.158196
15.819581
0.302705
3.118178
0.005287
0.528701
0.230705
0
0.342469
0.533333
0.11137
1.263298
false
false
2024-11-27
2024-11-27
0
cpayne1303/smallcp2024
darkc0de_BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
darkc0de/BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp (model: https://huggingface.co/darkc0de/BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp; details: https://huggingface.co/datasets/open-llm-leaderboard/darkc0de__BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp-details)
darkc0de/BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp
57367fefe01c7d9653c303b28449b416fc777d93
22.328255
3
0.007
false
false
false
false
1.796364
0.435842
43.584245
0.524309
31.869311
0.128399
12.839879
0.298658
6.487696
0.414333
9.491667
0.367271
29.696735
false
false
2024-09-10
2024-09-15
1
darkc0de/BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp (Merge)
cpayne1303_cp2024_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
cpayne1303/cp2024 (model: https://huggingface.co/cpayne1303/cp2024; details: https://huggingface.co/datasets/open-llm-leaderboard/cpayne1303__cp2024-details)
cpayne1303/cp2024
fb354aaa73c40b4f1fc6e86beea733e4f3929470
3.702133
apache-2.0
0
0.031
true
false
false
false
0.095226
0.165814
16.581448
0.298539
2.739141
0.005287
0.528701
0.255872
0.782998
0.338313
0.455729
0.110123
1.124778
false
false
2024-11-26
2024-11-26
0
cpayne1303/cp2024
cpayne1303_cp2024-instruct_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
cpayne1303/cp2024-instruct (model: https://huggingface.co/cpayne1303/cp2024-instruct; details: https://huggingface.co/datasets/open-llm-leaderboard/cpayne1303__cp2024-instruct-details)
cpayne1303/cp2024-instruct
ac4cfbc28479f8a94e3eb745526620be9b75edfa
4.319731
apache-2.0
1
0.031
true
false
false
true
0.064324
0.170611
17.061065
0.294678
2.4813
0
0
0.260067
1.342282
0.368635
3.179427
0.116689
1.854314
false
false
2024-11-27
2024-11-27
1
cpayne1303/cp2024
Felladrin_Minueza-32M-UltraChat_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
Felladrin/Minueza-32M-UltraChat (model: https://huggingface.co/Felladrin/Minueza-32M-UltraChat; details: https://huggingface.co/datasets/open-llm-leaderboard/Felladrin__Minueza-32M-UltraChat-details)
Felladrin/Minueza-32M-UltraChat
28506b99c5902d2215eb378ec91d4226a7396c49
3.924256
apache-2.0
5
0.033
true
false
false
true
0.336134
0.137563
13.756278
0.294148
2.43729
0.004532
0.453172
0.255872
0.782998
0.374187
4.640104
0.113281
1.475694
false
false
2024-02-27
2024-07-23
1
Felladrin/Minueza-32M-Base
cpayne1303_llama-43m-beta_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
cpayne1303/llama-43m-beta (model: https://huggingface.co/cpayne1303/llama-43m-beta; details: https://huggingface.co/datasets/open-llm-leaderboard/cpayne1303__llama-43m-beta-details)
cpayne1303/llama-43m-beta
1f85bec8c3541ed58fc2fcf4e6f98c1c34d72f60
5.288332
apache-2.0
0
0.043
true
false
false
false
0.058392
0.191568
19.156837
0.297678
2.482041
0
0
0.268456
2.46085
0.387177
6.163802
0.113198
1.46646
false
false
2024-11-30
2024-11-30
1
JackFram/llama-68m
cpayne1303_llama-43m-beta_float16
float16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
cpayne1303/llama-43m-beta (model: https://huggingface.co/cpayne1303/llama-43m-beta; details: https://huggingface.co/datasets/open-llm-leaderboard/cpayne1303__llama-43m-beta-details)
cpayne1303/llama-43m-beta
1f85bec8c3541ed58fc2fcf4e6f98c1c34d72f60
5.422629
apache-2.0
0
0.043
true
false
false
false
0.119832
0.194891
19.489067
0.296463
2.496048
0.004532
0.453172
0.268456
2.46085
0.388542
6.401042
0.11112
1.235594
false
false
2024-11-30
2024-12-04
1
JackFram/llama-68m
JackFram_llama-68m_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
JackFram/llama-68m (model: https://huggingface.co/JackFram/llama-68m; details: https://huggingface.co/datasets/open-llm-leaderboard/JackFram__llama-68m-details)
JackFram/llama-68m
964a5d77df908b69f8d6476fb70e940425b04cb5
4.96334
apache-2.0
26
0.068
true
false
false
false
0.121116
0.172634
17.263417
0.29363
2.591048
0.006042
0.60423
0.258389
1.118568
0.39099
6.607031
0.114362
1.595745
false
false
2023-07-19
2024-11-30
0
JackFram/llama-68m
google_flan-t5-small_float16
float16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
google/flan-t5-small (model: https://huggingface.co/google/flan-t5-small; details: https://huggingface.co/datasets/open-llm-leaderboard/google__flan-t5-small-details)
google/flan-t5-small
0fc9ddf78a1e988dac52e2dac162b0ede4fd74ab
6.129662
apache-2.0
318
0.077
true
false
false
false
0.28626
0.152426
15.242556
0.32829
6.363112
0.007553
0.755287
0.260906
1.454139
0.412292
10.369792
0.123338
2.593085
false
true
2022-10-21
2024-06-27
0
google/flan-t5-small
distilbert_distilgpt2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
distilbert/distilgpt2 (model: https://huggingface.co/distilbert/distilgpt2; details: https://huggingface.co/datasets/open-llm-leaderboard/distilbert__distilgpt2-details)
distilbert/distilgpt2
2290a62682d06624634c1f46a6ad5be0f47f38aa
4.002274
apache-2.0
492
0.088
true
false
false
false
0.246163
0.0611
6.11001
0.303799
2.83522
0.006042
0.60423
0.259228
1.230425
0.420729
11.157813
0.118684
2.075946
false
true
2022-03-02
2024-06-12
0
distilbert/distilgpt2
BEE-spoke-data_smol_llama-101M-GQA_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
BEE-spoke-data/smol_llama-101M-GQA (model: https://huggingface.co/BEE-spoke-data/smol_llama-101M-GQA; details: https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-101M-GQA-details)
BEE-spoke-data/smol_llama-101M-GQA
bb26643db413bada7e0c3c50752bf9da82403dba
4.0196
apache-2.0
28
0.101
true
false
false
false
0.239211
0.138437
13.843712
0.301756
3.198004
0.006042
0.60423
0.25755
1.006711
0.371271
4.275521
0.110705
1.189421
false
false
2023-10-26
2024-07-06
0
BEE-spoke-data/smol_llama-101M-GQA
DeepAutoAI_causal_gpt2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
GPT2LMHeadModel
DeepAutoAI/causal_gpt2 (model: https://huggingface.co/DeepAutoAI/causal_gpt2; details: https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__causal_gpt2-details)
DeepAutoAI/causal_gpt2
995f029f6645dde1ef830406001754b904c49775
6.032059
0
0.124
false
false
false
false
0.25173
0.181277
18.127679
0.302571
2.633344
0.005287
0.528701
0.260067
1.342282
0.426958
12.103125
0.113115
1.457225
false
false
2024-10-17
2024-10-17
0
DeepAutoAI/causal_gpt2
DeepAutoAI_d2nwg_causal_gpt2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
GPT2LMHeadModel
DeepAutoAI/d2nwg_causal_gpt2 (model: https://huggingface.co/DeepAutoAI/d2nwg_causal_gpt2; details: https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__d2nwg_causal_gpt2-details)
DeepAutoAI/d2nwg_causal_gpt2
eab065cba5a7a9b08f8b264d61d504c4ecbb611b
6.305441
0
0.124
false
false
false
false
0.259815
0.191618
19.161824
0.30269
2.850574
0.004532
0.453172
0.25755
1.006711
0.429719
12.68151
0.11511
1.678856
false
false
2024-10-18
2024-10-18
0
DeepAutoAI/d2nwg_causal_gpt2
DeepAutoAI_d2nwg_causal_gpt2_v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
GPT2LMHeadModel
DeepAutoAI/d2nwg_causal_gpt2_v1 (model: https://huggingface.co/DeepAutoAI/d2nwg_causal_gpt2_v1; details: https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__d2nwg_causal_gpt2_v1-details)
DeepAutoAI/d2nwg_causal_gpt2_v1
3f40c3dcb3eb591dec80ff03573eec7928a7feaa
6.419566
0
0.124
false
false
false
false
0.343007
0.198862
19.886235
0.29919
2.387278
0.003776
0.377644
0.258389
1.118568
0.433688
13.244271
0.113531
1.503398
false
false
2024-10-18
2024-10-19
0
DeepAutoAI/d2nwg_causal_gpt2_v1
Sharathhebbar24_chat_gpt2_dpo_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
Sharathhebbar24/chat_gpt2_dpo (model: https://huggingface.co/Sharathhebbar24/chat_gpt2_dpo; details: https://huggingface.co/datasets/open-llm-leaderboard/Sharathhebbar24__chat_gpt2_dpo-details)
Sharathhebbar24/chat_gpt2_dpo
f4a41f2c058c6b4087e1c0196d1279a38dd1f060
3.406546
apache-2.0
1
0.124
true
false
false
false
0.128755
0.098619
9.861944
0.29023
1.698604
0.005287
0.528701
0.260067
1.342282
0.381844
5.430469
0.114195
1.577275
false
false
2024-01-24
2025-01-02
0
Sharathhebbar24/chat_gpt2_dpo
sumink_ftgpt_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
sumink/ftgpt (model: https://huggingface.co/sumink/ftgpt; details: https://huggingface.co/datasets/open-llm-leaderboard/sumink__ftgpt-details)
sumink/ftgpt
fea7c59fff2443a73a7fd11a78b1d80eb5f0c4e6
3.951784
mit
0
0.124
true
false
false
false
0.105635
0.07871
7.871004
0.291909
1.931277
0
0
0.264262
1.901566
0.413844
10.097135
0.117188
1.909722
false
false
2024-11-06
2024-11-20
0
sumink/ftgpt
vonjack_MobileLLM-125M-HF_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
vonjack/MobileLLM-125M-HF (model: https://huggingface.co/vonjack/MobileLLM-125M-HF; details: https://huggingface.co/datasets/open-llm-leaderboard/vonjack__MobileLLM-125M-HF-details)
vonjack/MobileLLM-125M-HF
7664f5e1b91faa04fac545f64db84c26316c7e63
5.565352
cc-by-nc-4.0
0
0.125
true
false
false
false
0.343623
0.210728
21.072754
0.30273
3.146584
0.009063
0.906344
0.260067
1.342282
0.378187
5.106771
0.116356
1.817376
false
false
2024-11-15
2024-11-15
0
vonjack/MobileLLM-125M-HF
HuggingFaceTB_SmolLM-135M_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
HuggingFaceTB/SmolLM-135M (model: https://huggingface.co/HuggingFaceTB/SmolLM-135M; details: https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-135M-details)
HuggingFaceTB/SmolLM-135M
eec6e461571fba3e197a57c298f60b75422eae02
6.95149
apache-2.0
197
0.13
true
false
false
false
0.686755
0.212476
21.247623
0.304605
3.2854
0.013595
1.359517
0.258389
1.118568
0.436604
13.342188
0.112201
1.355644
false
true
2024-07-14
2024-07-18
0
HuggingFaceTB/SmolLM-135M
amd_AMD-Llama-135m_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
amd/AMD-Llama-135m (model: https://huggingface.co/amd/AMD-Llama-135m; details: https://huggingface.co/datasets/open-llm-leaderboard/amd__AMD-Llama-135m-details)
amd/AMD-Llama-135m
8f9c39b5ed86d422ab332ed1ecf042fdaeb57903
5.228977
apache-2.0
111
0.134
true
false
false
false
0.708678
0.191843
19.18432
0.296944
2.537953
0.007553
0.755287
0.258389
1.118568
0.384573
5.904948
0.116855
1.872784
false
true
2024-07-19
2024-10-01
0
amd/AMD-Llama-135m
FlofloB_smollm2-135M_pretrained_1000k_fineweb_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
FlofloB/smollm2-135M_pretrained_1000k_fineweb (model: https://huggingface.co/FlofloB/smollm2-135M_pretrained_1000k_fineweb; details: https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1000k_fineweb-details)
FlofloB/smollm2-135M_pretrained_1000k_fineweb
a0f91cfda4e5a820dbe30bd5e3fbb8f233f7467e
4.207808
apache-2.0
0
0.135
true
false
false
false
0.675817
0.148454
14.845388
0.291794
2.708744
0.009063
0.906344
0.262584
1.677852
0.358062
3.291146
0.116356
1.817376
false
false
2025-01-11
2025-01-14
5
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_1000k_fineweb_uncovai_human_removed_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_human_removed (model: https://huggingface.co/FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_human_removed; details: https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1000k_fineweb_uncovai_human_removed-details)
FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_human_removed
73ba3da387b3bdc50d6e3594c5c89ddebb271e81
4.06135
apache-2.0
0
0.135
true
false
false
false
0.678209
0.155373
15.53733
0.306643
3.274267
0.006042
0.60423
0.250839
0.111857
0.358031
3.253906
0.114279
1.58651
false
false
2025-01-24
2025-01-27
5
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_1000k_fineweb_uncovai_selected_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_selected (model: https://huggingface.co/FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_selected; details: https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1000k_fineweb_uncovai_selected-details)
FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_selected
e2115c3c7315400cb6338465672087c457b157ac
5.055843
apache-2.0
0
0.135
true
false
false
false
0.669616
0.146781
14.678054
0.293178
2.113414
0.006798
0.679758
0.26594
2.12528
0.40476
8.995052
0.115691
1.743499
false
false
2025-01-12
2025-01-12
5
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_1200k_fineweb_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
FlofloB/smollm2-135M_pretrained_1200k_fineweb (model: https://huggingface.co/FlofloB/smollm2-135M_pretrained_1200k_fineweb; details: https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1200k_fineweb-details)
FlofloB/smollm2-135M_pretrained_1200k_fineweb
d886605e0d45787f492f628fd0ea72c27f205f83
4.188312
apache-2.0
0
0.135
true
false
false
false
0.670761
0.158096
15.809607
0.294098
2.237296
0.006798
0.679758
0.264262
1.901566
0.371365
3.653906
0.10763
0.847739
false
false
2025-01-12
2025-01-12
6
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_1200k_fineweb_uncovai_human_removed_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_human_removed (model: https://huggingface.co/FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_human_removed; details: https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1200k_fineweb_uncovai_human_removed-details)
FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_human_removed
d743033d6f0048af31089e1133de7cee8b1e83f5
4.280291
apache-2.0
0
0.135
true
false
false
false
0.672153
0.157771
15.777138
0.294962
2.849419
0.000755
0.075529
0.265101
2.013423
0.37
3.416667
0.113946
1.549572
false
false
2025-01-27
2025-01-27
6
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_1200k_fineweb_uncovai_selected_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_selected (model: https://huggingface.co/FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_selected; details: https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1200k_fineweb_uncovai_selected-details)
FlofloB/smollm2-135M_pretrained_1200k_fineweb_uncovai_selected
8c05c5b2f00c84d4120b3221c81c1f481c585768
4.030505
apache-2.0
0
0.135
true
false
false
false
0.67059
0.158471
15.847064
0.296047
2.206545
0.007553
0.755287
0.263423
1.789709
0.356729
1.757812
0.116439
1.826611
false
false
2025-01-12
2025-01-14
6
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_1400k_fineweb_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
FlofloB/smollm2-135M_pretrained_1400k_fineweb (model: https://huggingface.co/FlofloB/smollm2-135M_pretrained_1400k_fineweb; details: https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1400k_fineweb-details)
FlofloB/smollm2-135M_pretrained_1400k_fineweb
a9c59a43cf0da87ad05ec8bd4a4c75d22c2e367c
4.992957
apache-2.0
0
0.135
true
false
false
false
0.688093
0.176381
17.638089
0.292178
2.1601
0.011329
1.132931
0.26594
2.12528
0.387333
6.016667
0.107962
0.884678
false
false
2025-01-13
2025-01-13
7
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_1400k_fineweb_uncovai_human_removed_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_human_removed (model: https://huggingface.co/FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_human_removed; details: https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1400k_fineweb_uncovai_human_removed-details)
FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_human_removed
f2851eedb367100fa0ca50ed25ff610a83713de2
5.063032
apache-2.0
0
0.135
true
false
false
false
0.688245
0.170661
17.066051
0.299239
2.630029
0.010574
1.057402
0.260906
1.454139
0.393938
7.008854
0.110455
1.161717
false
false
2025-01-28
2025-01-28
7
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_1400k_fineweb_uncovai_selected_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_selected (model: https://huggingface.co/FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_selected; details: https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_1400k_fineweb_uncovai_selected-details)
FlofloB/smollm2-135M_pretrained_1400k_fineweb_uncovai_selected
098a8e666d272a8cb4863b0877b6f4507e1c230c
4.62464
apache-2.0
0
0.135
true
false
false
false
0.675417
0.15385
15.384956
0.291673
2.631616
0.010574
1.057402
0.268456
2.46085
0.374062
4.691146
0.113697
1.521868
false
false
2025-01-13
2025-01-13
7
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_200k_fineweb_uncovai_human_removed_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_200k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_human_removed
4bacfcaa1040d1cba93da123ce57749bf2ed5e82
3.881968
apache-2.0
0
0.135
true
false
false
false
0.666409
0.14748
14.74798
0.302874
2.82254
0.003776
0.377644
0.258389
1.118568
0.357844
2.897135
0.111951
1.32794
false
false
2025-01-17
2025-01-17
1
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_200k_fineweb_uncovai_selected_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_200k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_200k_fineweb_uncovai_selected
381cdec29375aeaf0fb1bcc8ab2218443fc1cadd
3.492026
apache-2.0
1
0.135
true
false
false
false
0.68229
0.134515
13.451531
0.292719
2.322352
0.007553
0.755287
0.250839
0.111857
0.366031
2.853906
0.113115
1.457225
false
false
2025-01-08
2025-01-08
1
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_400k_fineweb_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_400k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_400k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_400k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_400k_fineweb
2601cf93307104afc3f57f467323f5368567cb74
4.224945
apache-2.0
0
0.135
true
false
false
false
0.691257
0.151127
15.112679
0.297234
1.889766
0.012085
1.208459
0.252517
0.33557
0.379427
4.995052
0.116273
1.808141
false
false
2025-01-09
2025-01-10
2
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_400k_fineweb_uncovai_human_removed_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_400k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_human_removed
c99f5022db1982d463626b4d87c7aeeff519b3fa
4.710927
apache-2.0
0
0.135
true
false
false
false
0.679336
0.155648
15.564812
0.30488
3.575492
0.009063
0.906344
0.255034
0.671141
0.386
6.016667
0.11378
1.531102
false
false
2025-01-18
2025-01-18
2
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_400k_fineweb_uncovai_selected_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_400k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_selected
ecac44607d60c294b460a8786f6253d561f3de85
4.387331
apache-2.0
1
0.135
true
false
false
false
0.67153
0.158421
15.842077
0.292517
2.073466
0.006798
0.679758
0.254195
0.559284
0.382
5.416667
0.115775
1.752733
false
false
2025-01-09
2025-01-09
2
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_600k_fineweb_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_600k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_600k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_600k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_600k_fineweb
6922498cf15ce9558b8ad2c33fc43106628d0cec
4.886739
apache-2.0
0
0.135
true
false
false
false
0.674154
0.163916
16.391619
0.301372
3.424053
0.006042
0.60423
0.26594
2.12528
0.380854
5.373437
0.112616
1.401817
false
false
2025-01-10
2025-01-11
3
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_600k_fineweb_uncovai_human_removed_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_600k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_human_removed
02a7c39af8a00dbd0ffa449cd830cf57261246b3
4.644402
apache-2.0
0
0.135
true
false
false
false
0.667734
0.164141
16.414115
0.300017
2.418749
0.009063
0.906344
0.262584
1.677852
0.379333
4.816667
0.114694
1.632683
false
false
2025-01-18
2025-01-19
3
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_600k_fineweb_uncovai_selected_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_600k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_600k_fineweb_uncovai_selected
66e4931a5409bb8739522ff5df3b4f3373738fad
4.657606
apache-2.0
0
0.135
true
false
false
false
0.675576
0.160594
16.059389
0.298344
2.165156
0.007553
0.755287
0.260906
1.454139
0.384635
5.71276
0.11619
1.798907
false
false
2025-01-09
2025-01-09
3
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_800k_fineweb_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_800k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_800k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_800k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_800k_fineweb
066f4d48c5f6d83ac9a44e8572a3d20c74f6ec08
4.174506
apache-2.0
0
0.135
true
false
false
false
0.672662
0.164141
16.414115
0.295944
2.348388
0.008308
0.830816
0.249161
0
0.370125
3.765625
0.115193
1.688091
false
false
2025-01-11
2025-01-14
4
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_800k_fineweb_uncovai_human_removed_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_human_removed" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_human_removed</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_800k_fineweb_uncovai_human_removed-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_human_removed
60c100113d77cced9b284172608f100297183ac9
5.032543
apache-2.0
0
0.135
true
false
false
false
0.668405
0.162293
16.229272
0.30381
3.210703
0.006798
0.679758
0.252517
0.33557
0.399271
8.208854
0.11378
1.531102
false
false
2025-01-19
2025-01-19
4
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2-135M_pretrained_800k_fineweb_uncovai_selected_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_selected" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_selected</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2-135M_pretrained_800k_fineweb_uncovai_selected-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2-135M_pretrained_800k_fineweb_uncovai_selected
7b351540b5fb395759e44385826c5fedef8672ec
4.118739
apache-2.0
0
0.135
true
false
false
false
0.670237
0.14743
14.742993
0.294281
1.922858
0.004532
0.453172
0.261745
1.565996
0.376635
4.579427
0.113032
1.447991
false
false
2025-01-11
2025-01-14
4
HuggingFaceTB/SmolLM2-135M
FlofloB_smollm2_pretrained_200k_fineweb_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/FlofloB/smollm2_pretrained_200k_fineweb" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">FlofloB/smollm2_pretrained_200k_fineweb</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/FlofloB__smollm2_pretrained_200k_fineweb-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
FlofloB/smollm2_pretrained_200k_fineweb
c3086ab3555e766f0b3903b8b9a1a290e3e25f3d
4.005599
apache-2.0
1
0.135
true
false
false
false
0.659464
0.1527
15.270039
0.299468
2.872523
0.003776
0.377644
0.247483
0
0.369938
3.742187
0.115941
1.771203
false
false
2025-01-08
2025-01-08
1
HuggingFaceTB/SmolLM2-135M
HuggingFaceTB_SmolLM-135M-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-135M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-135M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-135M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-135M-Instruct
8ca7af58e27777cae460ad8ca3ab9db15f5c160d
3.652288
apache-2.0
107
0.135
true
false
false
true
0.59726
0.121401
12.140122
0.301508
2.692958
0.005287
0.528701
0.259228
1.230425
0.363458
3.365625
0.117603
1.955895
false
true
2024-07-15
2024-10-12
1
HuggingFaceTB/SmolLM-135M
HuggingFaceTB_SmolLM2-135M_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-135M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-135M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-135M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-135M
28e66ca6931668447a3bac213f23d990ad3b0e2b
5.695927
apache-2.0
66
0.135
true
false
false
false
0.677924
0.181777
18.177658
0.304423
3.708078
0.012085
1.208459
0.248322
0
0.411177
10.030469
0.109458
1.050901
false
true
2024-10-31
2024-11-06
0
HuggingFaceTB/SmolLM2-135M
HuggingFaceTB_SmolLM2-135M-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-135M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-135M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-135M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-135M-Instruct
5a33ba103645800d7b3790c4448546c1b73efc71
6.467365
apache-2.0
145
0.135
true
false
false
true
0.338376
0.288314
28.83139
0.312432
4.720808
0.003021
0.302115
0.235738
0
0.366219
3.677344
0.111453
1.272533
false
true
2024-10-31
2024-11-06
1
HuggingFaceTB/SmolLM2-135M-Instruct (Merge)
HuggingFaceTB_SmolLM2-135M-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-135M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-135M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-135M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-135M-Instruct
5a33ba103645800d7b3790c4448546c1b73efc71
3.206597
apache-2.0
145
0.135
true
false
false
false
0.697508
0.059252
5.925167
0.313475
4.796276
0.01435
1.435045
0.23406
0
0.387146
6.059896
0.109209
1.023197
false
true
2024-10-31
2024-11-14
1
HuggingFaceTB/SmolLM2-135M-Instruct (Merge)
SeppeV_SmolLM_pretrained_with_sft_trained_with_1pc_data_on_a_preference_dpo_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/SeppeV/SmolLM_pretrained_with_sft_trained_with_1pc_data_on_a_preference_dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">SeppeV/SmolLM_pretrained_with_sft_trained_with_1pc_data_on_a_preference_dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/SeppeV__SmolLM_pretrained_with_sft_trained_with_1pc_data_on_a_preference_dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
SeppeV/SmolLM_pretrained_with_sft_trained_with_1pc_data_on_a_preference_dpo
6ced77bb27efc0d6f33d447b9cc8fca35976e91c
4.299519
0
0.135
false
false
false
true
0.654393
0.095546
9.554648
0.307267
3.612865
0.012085
1.208459
0.259228
1.230425
0.403208
8.401042
0.116107
1.789672
false
false
2024-10-12
0
Removed
abhishek_autotrain-0tmgq-5tpbg_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/abhishek/autotrain-0tmgq-5tpbg" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abhishek/autotrain-0tmgq-5tpbg</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abhishek__autotrain-0tmgq-5tpbg-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abhishek/autotrain-0tmgq-5tpbg
a75e1fda984e009613dca3b7846c579a37ab0673
4.856619
other
0
0.135
true
false
false
true
0.351828
0.195715
19.571515
0.313451
4.268752
0
0
0.251678
0.223714
0.365042
3.396875
0.11511
1.678856
false
false
2024-11-19
2024-12-03
2
HuggingFaceTB/SmolLM2-135M-Instruct (Merge)
abhishek_autotrain-0tmgq-5tpbg_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/abhishek/autotrain-0tmgq-5tpbg" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abhishek/autotrain-0tmgq-5tpbg</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abhishek__autotrain-0tmgq-5tpbg-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abhishek/autotrain-0tmgq-5tpbg
a75e1fda984e009613dca3b7846c579a37ab0673
5.051545
other
0
0.135
true
false
false
true
0.673608
0.195165
19.516549
0.312733
4.419023
0.01284
1.283988
0.259228
1.230425
0.358375
2.263542
0.114362
1.595745
false
false
2024-11-19
2024-12-04
2
HuggingFaceTB/SmolLM2-135M-Instruct (Merge)
amd_AMD-Llama-135m_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/amd/AMD-Llama-135m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">amd/AMD-Llama-135m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/amd__AMD-Llama-135m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
amd/AMD-Llama-135m
8f9c39b5ed86d422ab332ed1ecf042fdaeb57903
4.759627
apache-2.0
111
0.135
true
false
false
false
0.128719
0.184225
18.422452
0.297393
2.485495
0.005287
0.528701
0.252517
0.33557
0.377969
4.91276
0.116855
1.872784
false
true
2024-07-19
2024-09-29
0
amd/AMD-Llama-135m
ewre324_Thinker-SmolLM2-135M-Instruct-Reasoning_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ewre324/Thinker-SmolLM2-135M-Instruct-Reasoning" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ewre324/Thinker-SmolLM2-135M-Instruct-Reasoning</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ewre324__Thinker-SmolLM2-135M-Instruct-Reasoning-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ewre324/Thinker-SmolLM2-135M-Instruct-Reasoning
7dbd1a18e98892dbff1c6a51550ded17398e8518
5.843149
apache-2.0
1
0.135
true
false
false
false
0.668103
0.258363
25.836336
0.307135
3.973353
0.009063
0.906344
0.252517
0.33557
0.366125
2.965625
0.109375
1.041667
false
false
2025-01-07
2025-01-07
1
ewre324/Thinker-SmolLM2-135M-Instruct-Reasoning (Merge)
ewre324_ewre324-R1-SmolLM2-135M-Distill_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ewre324/ewre324-R1-SmolLM2-135M-Distill" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ewre324/ewre324-R1-SmolLM2-135M-Distill</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ewre324__ewre324-R1-SmolLM2-135M-Distill-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ewre324/ewre324-R1-SmolLM2-135M-Distill
3592b13d6df2b6090819afed0be93b374b649b8d
4.164699
0
0.135
false
false
false
true
0.705821
0.16489
16.489027
0.30417
3.383004
0.01284
1.283988
0.261745
1.565996
0.340917
0.78125
0.113364
1.484929
false
false
2025-01-30
2025-01-30
1
ewre324/ewre324-R1-SmolLM2-135M-Distill (Merge)
vonjack_SmolLM2-135M-Merged_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vonjack/SmolLM2-135M-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vonjack/SmolLM2-135M-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vonjack__SmolLM2-135M-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vonjack/SmolLM2-135M-Merged
a1700ca913a87ad713edfe57a2030a9d7c088970
5.87243
0
0.135
false
false
false
true
0.691021
0.248297
24.829674
0.309993
4.587041
0.011329
1.132931
0.238255
0
0.366187
3.440104
0.111203
1.244829
false
false
2024-11-15
2024-11-15
1
vonjack/SmolLM2-135M-Merged (Merge)
gpt2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/gpt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gpt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gpt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gpt2
607a30d783dfa663caf39e06633721c8d4cfcd7e
6.39103
mit
2,602
0.137
true
false
false
false
0.323928
0.193417
19.34168
0.303639
2.714298
0.003021
0.302115
0.260067
1.342282
0.432417
12.985417
0.114943
1.660387
false
true
2022-03-02
2024-06-26
0
gpt2
gpt2_float16
float16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/gpt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gpt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gpt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gpt2
607a30d783dfa663caf39e06633721c8d4cfcd7e
5.977737
mit
2,602
0.137
true
false
false
false
0.039245
0.083333
8.333333
0.308333
9.199755
0
0
0.233333
0
0.433333
18.333333
0.1
0
false
true
2022-03-02
2024-06-26
0
gpt2
openai-community_gpt2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/openai-community/gpt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openai-community/gpt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openai-community__gpt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
openai-community/gpt2
607a30d783dfa663caf39e06633721c8d4cfcd7e
6.510807
mit
2,602
0.137
true
false
false
false
0.085941
0.179253
17.925327
0.303571
2.674981
0.002266
0.226586
0.258389
1.118568
0.447052
15.348177
0.115941
1.771203
false
true
2022-03-02
2024-06-12
0
openai-community/gpt2
openai-community_gpt2_float16
float16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/openai-community/gpt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openai-community/gpt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openai-community__gpt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
openai-community/gpt2
607a30d783dfa663caf39e06633721c8d4cfcd7e
6.334235
mit
2,602
0.137
true
false
false
false
0.234774
0.177954
17.795449
0.301658
2.815911
0.005287
0.528701
0.258389
1.118568
0.439021
13.910938
0.116523
1.835845
false
true
2022-03-02
2024-08-12
0
openai-community/gpt2
EleutherAI_gpt-neo-125m_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPTNeoForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/gpt-neo-125m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/gpt-neo-125m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neo-125m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/gpt-neo-125m
21def0189f5705e2521767faed922f1f15e7d7db
4.407322
mit
196
0.15
true
false
false
false
0.405805
0.190544
19.054442
0.311516
3.436739
0.006042
0.60423
0.253356
0.447427
0.359333
2.616667
0.10256
0.284427
false
true
2022-03-02
2024-08-10
0
EleutherAI/gpt-neo-125m
Felladrin_Llama-160M-Chat-v1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Felladrin/Llama-160M-Chat-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Felladrin/Llama-160M-Chat-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Felladrin__Llama-160M-Chat-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Felladrin/Llama-160M-Chat-v1
e7f50665676821867ee7dfad32d0ca9fb68fc6bc
4.201766
apache-2.0
18
0.162
true
false
false
true
0.363161
0.157546
15.754642
0.303608
3.166756
0.006042
0.60423
0.25755
1.006711
0.366125
3.165625
0.113614
1.512633
false
false
2023-12-20
2024-07-23
1
JackFram/llama-160m
JackFram_llama-160m_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/JackFram/llama-160m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JackFram/llama-160m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JackFram__llama-160m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JackFram/llama-160m
aca9b687d1425f863dcf5de9a4c96e3fe36266dd
4.73813
apache-2.0
33
0.162
true
false
false
false
0.186949
0.179104
17.910367
0.288802
2.033606
0.008308
0.830816
0.261745
1.565996
0.379208
4.667708
0.112783
1.420287
false
false
2023-05-26
2024-11-30
0
JackFram/llama-160m
google_mt5-small_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MT5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/google/mt5-small" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/mt5-small</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__mt5-small-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/mt5-small
73fb5dbe4756edadc8fbe8c769b0a109493acf7a
4.255928
apache-2.0
136
0.17
true
false
false
false
0.360987
0.17181
17.180969
0.276584
1.070971
0
0
0.24245
0
0.38575
5.91875
0.112284
1.364879
false
true
2022-03-02
2024-09-06
0
google/mt5-small
EleutherAI_pythia-160m_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/pythia-160m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/pythia-160m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-160m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/pythia-160m
50f5173d932e8e61f858120bcb800b97af589f46
5.730395
apache-2.0
30
0.213
true
false
false
false
0.470677
0.181552
18.155162
0.297044
2.198832
0.009063
0.906344
0.258389
1.118568
0.417938
10.675521
0.111951
1.32794
false
true
2023-02-08
2024-06-09
0
EleutherAI/pythia-160m
BEE-spoke-data_smol_llama-220M-GQA_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-220M-GQA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-220M-GQA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-220M-GQA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/smol_llama-220M-GQA
8845b1d3c0bc73522ef2700aab467183cbdca9f7
6.577801
apache-2.0
12
0.218
true
false
false
false
0.327227
0.238605
23.860468
0.303167
3.037843
0.010574
1.057402
0.255872
0.782998
0.405875
9.067708
0.114943
1.660387
false
false
2023-12-22
2024-06-26
0
BEE-spoke-data/smol_llama-220M-GQA
BEE-spoke-data_smol_llama-220M-GQA-fineweb_edu_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-220M-GQA-fineweb_edu-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu
dec16b41d5e94070dbc1f8449a554373fd4cc1d1
6.629851
apache-2.0
1
0.218
true
false
false
false
0.323752
0.198812
19.881248
0.292905
2.314902
0.006798
0.679758
0.259228
1.230425
0.43676
14.261719
0.112699
1.411052
false
false
2024-06-08
2024-06-26
1
BEE-spoke-data/smol_llama-220M-GQA
BEE-spoke-data_smol_llama-220M-openhermes_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
| eval_name | Precision | Type | Weight type | Architecture | Model | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | | | | | [BEE-spoke-data/smol_llama-220M-openhermes](https://huggingface.co/BEE-spoke-data/smol_llama-220M-openhermes) | fb4bcd4b7eee363baacb4176a26cea2aaeb173f4 | 4.938005 | apache-2.0 | 5 | 0.218 | true | false | false | false | 0.308852 | 0.155523 | 15.55229 | 0.302752 | 3.107692 | 0.010574 | 1.057402 | 0.267617 | 2.348993 | 0.384729 | 6.224479 | 0.112035 | 1.337175 | false | false | 2023-12-30 | 2024-09-21 | 1 | BEE-spoke-data/smol_llama-220M-GQA |
| Daemontatox_TinySphinx_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | [Daemontatox/TinySphinx](https://huggingface.co/Daemontatox/TinySphinx) | 62172ccb670864070581498fb12e7d2594ac3a77 | 8.167167 | | 0 | 0.247 | false | false | false | false | 1.007256 | 0.25669 | 25.669003 | 0.330984 | 6.546576 | 0.043051 | 4.305136 | 0.27349 | 3.131991 | 0.33276 | 1.595052 | 0.169797 | 7.755245 | false | false | 2024-12-31 | | 0 | Removed |
| Daemontatox_TinySphinx2.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | [Daemontatox/TinySphinx2.0](https://huggingface.co/Daemontatox/TinySphinx2.0) | accc28aa00084fe89801baa0885c291d18a031ec | 7.583927 | | 0 | 0.247 | false | false | false | false | 1.004172 | 0.253517 | 25.351733 | 0.316841 | 5.004029 | 0.032477 | 3.247734 | 0.268456 | 2.46085 | 0.33825 | 1.314583 | 0.173122 | 8.124631 | false | false | 2024-12-31 | | 0 | Removed |
| Locutusque_TinyMistral-248M-v2.5_float16 | float16 | 🟩 continuously pretrained | Original | MistralForCausalLM | [Locutusque/TinyMistral-248M-v2.5](https://huggingface.co/Locutusque/TinyMistral-248M-v2.5) | 214e48aabc01235e25c67477898756f1bebef215 | 4.035439 | apache-2.0 | 27 | 0.248 | true | false | false | true | 0.484429 | 0.133641 | 13.364096 | 0.303858 | 3.181881 | 0.009819 | 0.981873 | 0.250839 | 0.111857 | 0.378156 | 5.069531 | 0.113531 | 1.503398 | true | false | 2024-01-24 | 2024-09-17 | 0 | Locutusque/TinyMistral-248M-v2.5 |
| M4-ai_TinyMistral-248M-v3_bfloat16 | bfloat16 | 🟢 pretrained | Original | MistralForCausalLM | [M4-ai/TinyMistral-248M-v3](https://huggingface.co/M4-ai/TinyMistral-248M-v3) | fa23fe617768c671f0bbbff1edf4556cfe844167 | 4.205636 | apache-2.0 | 6 | 0.248 | true | false | false | false | 0.468367 | 0.163866 | 16.386632 | 0.288455 | 1.777554 | 0.004532 | 0.453172 | 0.240772 | 0 | 0.379333 | 5.15 | 0.113198 | 1.46646 | false | false | 2024-02-05 | 2024-10-18 | 0 | M4-ai/TinyMistral-248M-v3 |
| google_flan-t5-base_float16 | float16 | 🟢 pretrained | Original | T5ForConditionalGeneration | [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) | 7bcac572ce56db69c1ea7c8af255c5d7c9672fc2 | 6.415642 | apache-2.0 | 839 | 0.248 | true | false | false | false | 0.313243 | 0.189071 | 18.907056 | 0.352598 | 11.337694 | 0.010574 | 1.057402 | 0.238255 | 0 | 0.367115 | 3.222656 | 0.135721 | 3.969046 | false | true | 2022-10-21 | 2024-08-14 | 0 | google/flan-t5-base |
| keeeeenw_MicroLlama_float16 | float16 | 🟢 pretrained | Original | LlamaForCausalLM | [keeeeenw/MicroLlama](https://huggingface.co/keeeeenw/MicroLlama) | 8d5874ca07b86ea1ea2e71eea96212278506ba65 | 5.266088 | apache-2.0 | 43 | 0.305 | true | false | false | false | 0.371536 | 0.198538 | 19.853766 | 0.300731 | 2.831364 | 0.011329 | 1.132931 | 0.260906 | 1.454139 | 0.369812 | 4.793229 | 0.11378 | 1.531102 | false | false | 2024-03-29 | 2024-09-15 | 0 | keeeeenw/MicroLlama |
| Mxode_NanoLM-0.3B-Instruct-v1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | [Mxode/NanoLM-0.3B-Instruct-v1](https://huggingface.co/Mxode/NanoLM-0.3B-Instruct-v1) | 638cda2c122e96c7992227b56b29967d9c8fd57e | 5.737739 | gpl-3.0 | 0 | 0.315 | true | false | false | true | 1.212552 | 0.153674 | 15.367447 | 0.302825 | 3.10461 | 0.01435 | 1.435045 | 0.271812 | 2.908277 | 0.415521 | 10.440104 | 0.110539 | 1.170952 | false | false | 2024-09-03 | 2024-09-05 | 0 | Mxode/NanoLM-0.3B-Instruct-v1 |
| Mxode_NanoLM-0.3B-Instruct-v1.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | [Mxode/NanoLM-0.3B-Instruct-v1.1](https://huggingface.co/Mxode/NanoLM-0.3B-Instruct-v1.1) | 7338464708c691667b193e7bb8f6b5bb3f9df27d | 5.974299 | gpl-3.0 | 2 | 0.315 | true | false | false | true | 1.21496 | 0.178279 | 17.827919 | 0.30144 | 3.09528 | 0.013595 | 1.359517 | 0.25 | 0 | 0.427333 | 12.216667 | 0.112118 | 1.34641 | false | false | 2024-09-05 | 2024-09-05 | 0 | Mxode/NanoLM-0.3B-Instruct-v1.1 |
| Mxode_NanoLM-0.3B-Instruct-v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | [Mxode/NanoLM-0.3B-Instruct-v2](https://huggingface.co/Mxode/NanoLM-0.3B-Instruct-v2) | 40027e2a1a404144975cfc0dd7d354057b98854b | 5.013671 | gpl-3.0 | 0 | 0.315 | true | false | false | true | 1.213042 | 0.166789 | 16.678857 | 0.29211 | 2.209481 | 0.006798 | 0.679758 | 0.260906 | 1.454139 | 0.395458 | 7.565625 | 0.113447 | 1.494164 | false | false | 2024-09-07 | 2024-09-08 | 0 | Mxode/NanoLM-0.3B-Instruct-v2 |
| microsoft_DialoGPT-medium_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | GPT2LMHeadModel | [microsoft/DialoGPT-medium](https://huggingface.co/microsoft/DialoGPT-medium) | 7b40bb0f92c45fefa957d088000d8648e5c7fa33 | 5.251434 | mit | 356 | 0.345 | true | false | false | true | 0.258929 | 0.147904 | 14.790423 | 0.301416 | 2.556856 | 0 | 0 | 0.254195 | 0.559284 | 0.428667 | 12.283333 | 0.111868 | 1.318706 | false | true | 2022-03-02 | 2024-06-13 | 0 | microsoft/DialoGPT-medium |
| Sharathhebbar24_SSH_355M_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | GPT2LMHeadModel | [Sharathhebbar24/SSH_355M](https://huggingface.co/Sharathhebbar24/SSH_355M) | 601988021bc27acf3c470fe70eed5db373df58db | 5.371931 | apache-2.0 | 0 | 0.355 | true | false | false | false | 0.171826 | 0.142359 | 14.235894 | 0.309859 | 3.496136 | 0.009063 | 0.906344 | 0.258389 | 1.118568 | 0.41775 | 10.51875 | 0.117603 | 1.955895 | false | false | 2024-02-06 | 2025-01-11 | 0 | Sharathhebbar24/SSH_355M |
| HuggingFaceTB_SmolLM-360M_bfloat16 | bfloat16 | 🟢 pretrained | Original | LlamaForCausalLM | [HuggingFaceTB/SmolLM-360M](https://huggingface.co/HuggingFaceTB/SmolLM-360M) | 318cc630b73730bfd712e5873063156ffb8936b5 | 6.260889 | apache-2.0 | 62 | 0.36 | true | false | false | false | 0.730519 | 0.213351 | 21.335058 | 0.306452 | 3.284915 | 0.011329 | 1.132931 | 0.267617 | 2.348993 | 0.401781 | 8.089323 | 0.112367 | 1.374113 | false | true | 2024-07-14 | 2024-07-18 | 0 | HuggingFaceTB/SmolLM-360M |
| HuggingFaceTB_SmolLM2-360M_bfloat16 | bfloat16 | 🟢 pretrained | Original | LlamaForCausalLM | [HuggingFaceTB/SmolLM2-360M](https://huggingface.co/HuggingFaceTB/SmolLM2-360M) | 3ce05f63c246c44616da500b47b01f082f4d3bcc | 6.251282 | apache-2.0 | 40 | 0.36 | true | false | false | false | 0.773316 | 0.211452 | 21.145228 | 0.323348 | 5.543603 | 0.012085 | 1.208459 | 0.245805 | 0 | 0.395427 | 7.728385 | 0.116938 | 1.882018 | false | true | 2024-10-31 | 2024-11-06 | 0 | HuggingFaceTB/SmolLM2-360M |
| HuggingFaceTB_SmolLM2-360M-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | [HuggingFaceTB/SmolLM2-360M-Instruct](https://huggingface.co/HuggingFaceTB/SmolLM2-360M-Instruct) | 4873f67095301d304753fae05bc09ec766634e50 | 8.139566 | apache-2.0 | 101 | 0.36 | true | false | false | true | 0.751639 | 0.38416 | 38.415959 | 0.314351 | 4.173864 | 0.015106 | 1.510574 | 0.255034 | 0.671141 | 0.346125 | 2.765625 | 0.111702 | 1.300236 | false | true | 2024-10-31 | 2024-11-06 | 0 | HuggingFaceTB/SmolLM2-360M-Instruct |
| HuggingFaceTB_SmolLM-360M-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | [HuggingFaceTB/SmolLM-360M-Instruct](https://huggingface.co/HuggingFaceTB/SmolLM-360M-Instruct) | 8e951de8c220295ea4f85d078c4e320df7137535 | 5.008899 | apache-2.0 | 80 | 0.362 | true | false | false | true | 0.733002 | 0.195165 | 19.516549 | 0.288511 | 2.080374 | 0.018127 | 1.812689 | 0.264262 | 1.901566 | 0.347177 | 2.897135 | 0.116606 | 1.84508 | false | true | 2024-07-15 | 2024-08-20 | 1 | HuggingFaceTB/SmolLM-360M |
| HuggingFaceTB_SmolLM2-360M-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [HuggingFaceTB/SmolLM2-360M-Instruct](https://huggingface.co/HuggingFaceTB/SmolLM2-360M-Instruct) | 4873f67095301d304753fae05bc09ec766634e50 | 3.10002 | apache-2.0 | 101 | 0.362 | true | false | false | false | 0.392382 | 0.083032 | 8.303191 | 0.30527 | 3.299047 | 0.008308 | 0.830816 | 0.265101 | 2.013423 | 0.342281 | 2.751823 | 0.112616 | 1.401817 | false | true | 2024-10-31 | 2024-11-14 | 0 | HuggingFaceTB/SmolLM2-360M-Instruct |
| aloobun_d-SmolLM2-360M_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [aloobun/d-SmolLM2-360M](https://huggingface.co/aloobun/d-SmolLM2-360M) | 2a1d82b4cbcdfdff3c2cfcd171435c5f01b8de43 | 6.184071 | apache-2.0 | 1 | 0.362 | true | false | false | false | 0.740247 | 0.209704 | 20.970359 | 0.319578 | 4.762821 | 0.01284 | 1.283988 | 0.253356 | 0.447427 | 0.398063 | 7.757813 | 0.116938 | 1.882018 | false | false | 2024-11-20 | 2024-11-26 | 0 | aloobun/d-SmolLM2-360M |
| prithivMLmods_SmolLM2-CoT-360M_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [prithivMLmods/SmolLM2-CoT-360M](https://huggingface.co/prithivMLmods/SmolLM2-CoT-360M) | 474240d772fbb3b8da6f8eb47f32dd34c6b78baf | 5.950748 | apache-2.0 | 17 | 0.362 | true | false | false | false | 0.775503 | 0.221569 | 22.156877 | 0.31353 | 4.801205 | 0.020393 | 2.039275 | 0.236577 | 0 | 0.379396 | 5.757813 | 0.108544 | 0.94932 | false | false | 2025-01-05 | 2025-01-07 | 1 | prithivMLmods/SmolLM2-CoT-360M (Merge) |
| thirdeyeai_elevate360m_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [thirdeyeai/elevate360m](https://huggingface.co/thirdeyeai/elevate360m) | f4321ba8704e732769d328952d217bdb564e1824 | 1.918188 | | 0 | 0.362 | false | false | false | false | 0.737565 | 0.044489 | 4.448862 | 0.296258 | 2.339847 | 0.015861 | 1.586103 | 0.240772 | 0 | 0.346219 | 2.277344 | 0.107713 | 0.856974 | false | false | 2025-01-28 | 2025-01-29 | 0 | thirdeyeai/elevate360m |
| voidful_smol-360m-ft_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [voidful/smol-360m-ft](https://huggingface.co/voidful/smol-360m-ft) | 3889a38fc79d2400997e01bf1d00c8059d72fead | 4.78993 | apache-2.0 | 0 | 0.362 | true | false | false | true | 0.763459 | 0.20131 | 20.13103 | 0.301195 | 3.022706 | 0.008308 | 0.830816 | 0.245805 | 0 | 0.371365 | 3.78724 | 0.10871 | 0.96779 | false | false | 2024-11-24 | 2024-11-28 | 1 | voidful/smol-360m-ft (Merge) |
| vonjack_SmolLM2-360M-Merged_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [vonjack/SmolLM2-360M-Merged](https://huggingface.co/vonjack/SmolLM2-360M-Merged) | 32bceedf56b29a4a9fdd459a36fbc7fae5e274c8 | 7.294377 | | 0 | 0.362 | false | false | false | true | 0.771484 | 0.320587 | 32.058715 | 0.315485 | 4.741734 | 0.017372 | 1.73716 | 0.255872 | 0.782998 | 0.352729 | 3.357813 | 0.109791 | 1.08784 | false | false | 2024-11-15 | 2024-11-15 | 1 | vonjack/SmolLM2-360M-Merged (Merge) |
| vihangd_smart-dan-sft-v0.1_4bit | 4bit | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | [vihangd/smart-dan-sft-v0.1](https://huggingface.co/vihangd/smart-dan-sft-v0.1) | 924b4a09153d4061fa9d58f03b10cd7cde7e3084 | 3.871213 | apache-2.0 | 0 | 0.379 | true | false | false | false | 0.722049 | 0.157646 | 15.764616 | 0.306177 | 3.125599 | 0.009819 | 0.981873 | 0.255034 | 0.671141 | 0.350188 | 1.106771 | 0.114195 | 1.577275 | false | false | 2024-08-09 | 2024-08-20 | 0 | vihangd/smart-dan-sft-v0.1 |
| openai-community_gpt2-medium_bfloat16 | bfloat16 | 🟢 pretrained | Original | GPT2LMHeadModel | [openai-community/gpt2-medium](https://huggingface.co/openai-community/gpt2-medium) | 6dcaa7a952f72f9298047fd5137cd6e4f05f41da | 5.90234 | mit | 167 | 0.38 | true | false | false | false | 0.242124 | 0.220844 | 22.084403 | 0.305028 | 2.719972 | 0.007553 | 0.755287 | 0.262584 | 1.677852 | 0.388448 | 6.15599 | 0.118185 | 2.020538 | false | true | 2022-03-02 | 2024-06-12 | 0 | openai-community/gpt2-medium |
| postbot_gpt2-medium-emailgen_bfloat16 | bfloat16 | 🟩 continuously pretrained | Original | GPT2LMHeadModel | [postbot/gpt2-medium-emailgen](https://huggingface.co/postbot/gpt2-medium-emailgen) | a0299eb6760126e3bd04d2f10cd166c4563f82d2 | 4.743048 | apache-2.0 | 6 | 0.38 | true | false | false | false | 0.156373 | 0.149203 | 14.9203 | 0.313043 | 3.6737 | 0 | 0 | 0.260067 | 1.342282 | 0.391115 | 6.889323 | 0.114694 | 1.632683 | false | false | 2022-09-29 | 2024-11-17 | 0 | postbot/gpt2-medium-emailgen |
| google_mt5-base_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MT5ForConditionalGeneration | [google/mt5-base](https://huggingface.co/google/mt5-base) | 2eb15465c5dd7f72a8f7984306ad05ebc3dd1e1f | 3.71634 | apache-2.0 | 220 | 0.39 | true | false | false | false | 0.40008 | 0.164516 | 16.451571 | 0.288316 | 1.298551 | 0.009063 | 0.906344 | 0.239094 | 0 | 0.367208 | 2.867708 | 0.106965 | 0.773862 | false | true | 2022-03-02 | 2024-09-06 | 0 | google/mt5-base |
| jaredjoss_pythia-410m-roberta-lr_8e7-kl_01-steps_12000-rlhf-model_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | GPTNeoXForCausalLM | [jaredjoss/pythia-410m-roberta-lr_8e7-kl_01-steps_12000-rlhf-model](https://huggingface.co/jaredjoss/pythia-410m-roberta-lr_8e7-kl_01-steps_12000-rlhf-model) | 048bc8edfc32fdcf6d957332d5f4c0d4e5950746 | 3.81661 | mit | 0 | 0.407 | true | false | false | true | 0.466127 | 0.157222 | 15.722173 | 0.286344 | 1.820374 | 0 | 0 | 0.259228 | 1.230425 | 0.360698 | 2.253906 | 0.116855 | 1.872784 | false | false | 2024-04-23 | 2024-08-06 | 0 | jaredjoss/pythia-410m-roberta-lr_8e7-kl_01-steps_12000-rlhf-model |
| CoolSpring_Qwen2-0.5B-Abyme_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | Qwen2ForCausalLM | [CoolSpring/Qwen2-0.5B-Abyme](https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme) | a48b7c04b854e5c60fe3464f96904bfc53c8640c | 4.999994 | apache-2.0 | 0 | 0.494 | true | false | false | true | 2.355595 | 0.191519 | 19.15185 | 0.286183 | 2.276484 | 0.029456 | 2.945619 | 0.253356 | 0.447427 | 0.354219 | 1.477344 | 0.133311 | 3.701241 | false | false | 2024-07-18 | 2024-09-04 | 1 | Qwen/Qwen2-0.5B |
| JayHyeon_Qwen-0.5B-DPO-1epoch_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2Model | [JayHyeon/Qwen-0.5B-DPO-1epoch](https://huggingface.co/JayHyeon/Qwen-0.5B-DPO-1epoch) | f5569969d307d193798eff52c0527e23f4ac8bb9 | 7.385733 | mit | 0 | 0.494 | true | false | false | true | 0.956513 | 0.264733 | 26.473313 | 0.319075 | 5.543695 | 0.028701 | 2.870091 | 0.252517 | 0.33557 | 0.335177 | 2.897135 | 0.155751 | 6.194592 | false | false | 2024-12-26 | 2024-12-26 | 0 | JayHyeon/Qwen-0.5B-DPO-1epoch |
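The paired Raw / normalized score columns are related by the leaderboard's baseline rescaling: a raw accuracy is mapped onto the range between the random-guessing baseline and a perfect score, floored at zero, and expressed as a percentage. A minimal sketch, assuming baselines of 0.25 for GPQA (4-way multiple choice), 0.1 for MMLU-PRO (10-way), and 0 for IFEval and MATH Lvl 5 (BBH and MUSR use per-subtask baselines, so their columns cannot be reproduced this simply):

```python
def normalize(raw: float, baseline: float) -> float:
    """Rescale a raw accuracy to a percentage of the range between
    the random-guessing baseline and a perfect score, floored at 0."""
    return max(0.0, (raw - baseline) / (1.0 - baseline)) * 100.0

# Raw values from the BEE-spoke-data/smol_llama-220M-openhermes row:
gpqa = normalize(0.267617, 0.25)      # ~2.349, matching the GPQA column
mmlu_pro = normalize(0.112035, 0.10)  # ~1.337, matching the MMLU-PRO column
math_lvl5 = normalize(0.010574, 0.0)  # ~1.057, matching the MATH Lvl 5 column
```

The floor also explains why several GPQA and MUSR entries collapse to 0: raw scores at or below the guessing baseline (e.g. M4-ai/TinyMistral-248M-v3's GPQA raw of 0.240772) normalize to exactly 0.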