Column schema (name: dtype, observed length/range or class count):

eval_name: string, length 12 to 111
Precision: string, 3 classes
Type: string, 6 classes
T: string, 6 classes
Weight type: string, 2 classes
Architecture: string, 47 classes
Model: string, length 355 to 650
fullname: string, length 4 to 102
Model sha: string, length 0 to 40
Average ⬆️: float64, 1.41 to 50.8
Hub License: string, 24 classes
Hub ❤️: int64, 0 to 5.8k
#Params (B): int64, -1 to 140
Available on the hub: bool, 2 classes
Not_Merged: bool, 2 classes
MoE: bool, 2 classes
Flagged: bool, 1 class
Chat Template: bool, 2 classes
IFEval Raw: float64, 0 to 0.87
IFEval: float64, 0 to 86.7
BBH Raw: float64, 0.28 to 0.75
BBH: float64, 0.81 to 62.8
MATH Lvl 5 Raw: float64, 0 to 0.48
MATH Lvl 5: float64, 0 to 47.6
GPQA Raw: float64, 0.22 to 0.41
GPQA: float64, 0 to 21.6
MUSR Raw: float64, 0.29 to 0.59
MUSR: float64, 0 to 36.4
MMLU-PRO Raw: float64, 0.1 to 0.7
MMLU-PRO: float64, 0 to 66.8
Maintainer's Highlight: bool, 2 classes
Upload To Hub Date: string, length 0 to 10
Submission Date: string, 130 classes
Generation: int64, 0 to 6
Base Model: string, length 4 to 102
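The rows below are flat dumps of records with this schema. As a minimal sketch of how such records can be ranked, here is a toy sample in plain Python (three values copied from the rows below, keeping only a few of the schema's columns; this is illustrative only, not the leaderboard's own code):

```python
# Toy sample: three records copied from the rows below, restricted to a
# few of the schema's columns ("fullname", "#Params (B)", "Average").
rows = [
    {"fullname": "0-hero/Matter-0.2-7B-DPO", "#Params (B)": 7, "Average": 8.805656},
    {"fullname": "01-ai/Yi-1.5-34B", "#Params (B)": 34, "Average": 25.432496},
    {"fullname": "01-ai/Yi-1.5-34B-Chat", "#Params (B)": 34, "Average": 32.627883},
]

# Rank by the aggregate "Average" score, descending, as the leaderboard does.
leaderboard = sorted(rows, key=lambda r: r["Average"], reverse=True)
print(leaderboard[0]["fullname"])  # prints "01-ai/Yi-1.5-34B-Chat"
```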
0-hero_Matter-0.2-7B-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
0-hero/Matter-0.2-7B-DPO · 📑 https://huggingface.co/datasets/open-llm-leaderboard/0-hero__Matter-0.2-7B-DPO-details
0-hero/Matter-0.2-7B-DPO
26a66f0d862e2024ce4ad0a09c37052ac36e8af6
8.805656
apache-2.0
3
7
true
true
true
false
true
0.330279
33.027921
0.359625
10.055525
0.008308
0.830816
0.259228
1.230425
0.381375
5.871875
0.116356
1.817376
false
2024-04-13
2024-08-05
0
0-hero/Matter-0.2-7B-DPO
01-ai_Yi-1.5-34B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
01-ai/Yi-1.5-34B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-34B-details
01-ai/Yi-1.5-34B
4b486f81c935a2dadde84c6baa1e1370d40a098f
25.432496
apache-2.0
46
34
true
true
true
false
false
0.284117
28.411725
0.597639
42.749363
0.140483
14.048338
0.365772
15.436242
0.423604
11.217188
0.466589
40.732122
true
2024-05-11
2024-06-12
0
01-ai/Yi-1.5-34B
01-ai_Yi-1.5-34B-32K_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
01-ai/Yi-1.5-34B-32K · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-34B-32K-details
01-ai/Yi-1.5-34B-32K
2c03a29761e4174f20347a60fbe229be4383d48b
26.400622
apache-2.0
35
34
true
true
true
false
false
0.311869
31.186917
0.601569
43.381847
0.134441
13.444109
0.363255
15.100671
0.439823
14.077865
0.470911
41.212323
true
2024-05-15
2024-06-12
0
01-ai/Yi-1.5-34B-32K
01-ai_Yi-1.5-34B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
01-ai/Yi-1.5-34B-Chat · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-34B-Chat-details
01-ai/Yi-1.5-34B-Chat
f3128b2d02d82989daae566c0a7eadc621ca3254
32.627883
apache-2.0
245
34
true
true
true
false
true
0.606676
60.667584
0.608375
44.262826
0.233384
23.338369
0.364933
15.324385
0.428198
13.058073
0.452045
39.116061
true
2024-05-10
2024-06-12
0
01-ai/Yi-1.5-34B-Chat
01-ai_Yi-1.5-34B-Chat-16K_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
01-ai/Yi-1.5-34B-Chat-16K · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-34B-Chat-16K-details
01-ai/Yi-1.5-34B-Chat-16K
ff74452e11f0f749ab872dc19b1dd3813c25c4d8
28.975559
apache-2.0
27
34
true
true
true
false
true
0.45645
45.645
0.610022
44.536157
0.188066
18.806647
0.338087
11.744966
0.43976
13.736719
0.454455
39.383865
true
2024-05-15
2024-07-15
0
01-ai/Yi-1.5-34B-Chat-16K
01-ai_Yi-1.5-6B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
01-ai/Yi-1.5-6B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-6B-details
01-ai/Yi-1.5-6B
cab51fce425b4c1fb19fccfdd96bd5d0908c1657
16.5317
apache-2.0
28
6
true
true
true
false
false
0.26166
26.166017
0.449258
22.027905
0.053625
5.362538
0.313758
8.501119
0.437406
13.309115
0.314412
23.823508
true
2024-05-11
2024-08-10
0
01-ai/Yi-1.5-6B
01-ai_Yi-1.5-6B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
01-ai/Yi-1.5-6B-Chat · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-6B-Chat-details
01-ai/Yi-1.5-6B-Chat
3f64d3f159c6ad8494227bb77e2a7baef8cd808b
20.983906
apache-2.0
41
6
true
true
true
false
true
0.514527
51.452701
0.457131
23.678723
0.054381
5.438066
0.302013
6.935123
0.439177
14.030469
0.319315
24.368351
true
2024-05-11
2024-10-22
0
01-ai/Yi-1.5-6B-Chat
01-ai_Yi-1.5-9B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
01-ai/Yi-1.5-9B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-9B-details
01-ai/Yi-1.5-9B
8cfde9604384c50137bee480b8cef8a08e5ae81d
21.952492
apache-2.0
46
8
true
true
true
false
false
0.293584
29.358436
0.514294
30.500717
0.101964
10.196375
0.379195
17.225951
0.432781
12.03099
0.391622
32.402482
true
2024-05-11
2024-06-12
0
01-ai/Yi-1.5-9B
01-ai_Yi-1.5-9B-32K_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
01-ai/Yi-1.5-9B-32K · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-9B-32K-details
01-ai/Yi-1.5-9B-32K
116561dfae63af90f9d163b43077629e0e916bb1
19.608376
apache-2.0
18
8
true
true
true
false
false
0.230311
23.031113
0.496332
28.937012
0.095921
9.592145
0.35906
14.541387
0.418615
10.826823
0.376496
30.721779
true
2024-05-15
2024-06-12
0
01-ai/Yi-1.5-9B-32K
01-ai_Yi-1.5-9B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
01-ai/Yi-1.5-9B-Chat · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-9B-Chat-details
01-ai/Yi-1.5-9B-Chat
bc87d8557c98dc1e5fdef6ec23ed31088c4d3f35
27.705595
apache-2.0
133
8
true
true
true
false
true
0.604553
60.455259
0.555906
36.952931
0.116314
11.63142
0.334732
11.297539
0.425906
12.838281
0.397523
33.058141
true
2024-05-10
2024-06-12
0
01-ai/Yi-1.5-9B-Chat
01-ai_Yi-1.5-9B-Chat-16K_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
01-ai/Yi-1.5-9B-Chat-16K · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-1.5-9B-Chat-16K-details
01-ai/Yi-1.5-9B-Chat-16K
2b397e5f0fab87984efa66856c5c4ed4bbe68b50
22.896812
apache-2.0
34
8
true
true
true
false
true
0.421404
42.14041
0.515338
31.497609
0.126133
12.613293
0.308725
7.829978
0.409906
10.038281
0.399352
33.261303
true
2024-05-15
2024-06-12
0
01-ai/Yi-1.5-9B-Chat-16K
01-ai_Yi-34B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
01-ai/Yi-34B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-34B-details
01-ai/Yi-34B
e1e7da8c75cfd5c44522228599fd4d2990cedd1c
22.259834
apache-2.0
1,283
34
true
true
true
false
false
0.304575
30.457519
0.54571
35.542431
0.044562
4.456193
0.366611
15.548098
0.411854
9.648438
0.441157
37.906324
true
2023-11-01
2024-06-12
0
01-ai/Yi-34B
01-ai_Yi-34B-200K_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
01-ai/Yi-34B-200K · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-34B-200K-details
01-ai/Yi-34B-200K
8ac1a1ebe011df28b78ccd08012aeb2222443c77
19.799477
apache-2.0
314
34
true
true
true
false
false
0.154249
15.424851
0.544182
36.02211
0.044562
4.456193
0.356544
14.205817
0.381719
9.414844
0.453457
39.27305
true
2023-11-06
2024-06-12
0
01-ai/Yi-34B-200K
01-ai_Yi-34B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
01-ai/Yi-34B-Chat · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-34B-Chat-details
01-ai/Yi-34B-Chat
2e528b6a80fb064a0a746c5ca43114b135e30464
23.899372
apache-2.0
343
34
true
true
true
false
true
0.469889
46.988878
0.556087
37.623988
0.043051
4.305136
0.338087
11.744966
0.397844
8.363802
0.409325
34.369459
true
2023-11-22
2024-06-12
0
01-ai/Yi-34B-Chat
01-ai_Yi-6B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
01-ai/Yi-6B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-6B-details
01-ai/Yi-6B
7f7fb7662fd8ec09029364f408053c954986c8e5
13.599029
apache-2.0
371
6
true
true
true
false
false
0.289338
28.933785
0.430923
19.408505
0.015106
1.510574
0.269295
2.572707
0.393687
7.044271
0.299119
22.124335
true
2023-11-01
2024-06-12
0
01-ai/Yi-6B
01-ai_Yi-6B-200K_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
01-ai/Yi-6B-200K · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-6B-200K-details
01-ai/Yi-6B-200K
4a74338e778a599f313e9fa8f5bc08c717604420
11.895393
apache-2.0
173
6
true
true
true
false
false
0.084331
8.433069
0.428929
20.14802
0.012085
1.208459
0.281879
4.250559
0.45874
16.842448
0.284408
20.489805
true
2023-11-06
2024-06-12
0
01-ai/Yi-6B-200K
01-ai_Yi-6B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
01-ai/Yi-6B-Chat · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-6B-Chat-details
01-ai/Yi-6B-Chat
01f7fabb6cfb26efeb764da4a0a19cad2c754232
14.004357
apache-2.0
63
6
true
true
true
false
true
0.339521
33.952136
0.41326
17.000167
0.006798
0.679758
0.294463
5.928412
0.368792
3.565625
0.3061
22.900044
true
2023-11-22
2024-06-12
0
01-ai/Yi-6B-Chat
01-ai_Yi-9B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
01-ai/Yi-9B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-9B-details
01-ai/Yi-9B
b4a466d95091696285409f1dcca3028543cb39da
17.610457
apache-2.0
185
8
true
true
true
false
false
0.270878
27.087794
0.493961
27.626956
0.043807
4.380665
0.317953
9.060403
0.405406
8.909115
0.35738
28.597813
true
2024-03-01
2024-06-12
0
01-ai/Yi-9B
01-ai_Yi-9B-200K_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
01-ai/Yi-9B-200K · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-9B-200K-details
01-ai/Yi-9B-200K
8c93accd5589dbb74ee938e103613508c4a9b88d
17.591083
apache-2.0
75
8
true
true
true
false
false
0.232709
23.270921
0.47933
26.492495
0.058157
5.81571
0.315436
8.724832
0.429406
12.109115
0.362201
29.133422
true
2024-03-15
2024-06-12
0
01-ai/Yi-9B-200K
01-ai_Yi-Coder-9B-Chat_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
01-ai/Yi-Coder-9B-Chat · 📑 https://huggingface.co/datasets/open-llm-leaderboard/01-ai__Yi-Coder-9B-Chat-details
01-ai/Yi-Coder-9B-Chat
356a1f8d4e4a606d0b879e54191ca809918576b8
16.809756
apache-2.0
186
8
true
true
true
false
true
0.481704
48.17041
0.48142
25.943153
0.029456
2.945619
0.247483
0
0.399177
7.963802
0.24252
15.83555
true
2024-08-21
2024-09-12
1
01-ai/Yi-Coder-9B
152334H_miqu-1-70b-sf_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
152334H/miqu-1-70b-sf · 📑 https://huggingface.co/datasets/open-llm-leaderboard/152334H__miqu-1-70b-sf-details
152334H/miqu-1-70b-sf
1dca4cce36f01f2104ee2e6b97bac6ff7bb300c1
28.820469
219
68
false
true
true
false
false
0.518174
51.8174
0.610236
43.807147
0.108006
10.800604
0.350671
13.422819
0.458208
17.209375
0.422789
35.86547
false
2024-01-30
2024-06-26
0
152334H/miqu-1-70b-sf
1TuanPham_T-VisStar-7B-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
1TuanPham/T-VisStar-7B-v0.1 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/1TuanPham__T-VisStar-7B-v0.1-details
1TuanPham/T-VisStar-7B-v0.1
b111b59971c14b46c888b96723ff7f3c7b6fd92f
18.943399
apache-2.0
1
7
true
false
true
false
true
0.360704
36.070404
0.50522
30.243834
0.045317
4.531722
0.285235
4.697987
0.4375
13.554167
0.321061
24.562278
false
2024-09-19
2024-09-22
0
1TuanPham/T-VisStar-7B-v0.1
1TuanPham_T-VisStar-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
1TuanPham/T-VisStar-v0.1 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/1TuanPham__T-VisStar-v0.1-details
1TuanPham/T-VisStar-v0.1
c9779bd9630a533f7e42fd8effcca69623d48c9c
18.943399
apache-2.0
1
7
true
false
true
false
true
0.360704
36.070404
0.50522
30.243834
0.045317
4.531722
0.285235
4.697987
0.4375
13.554167
0.321061
24.562278
false
2024-09-19
2024-09-20
0
1TuanPham/T-VisStar-v0.1
3rd-Degree-Burn_Llama-3.1-8B-Squareroot_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
3rd-Degree-Burn/Llama-3.1-8B-Squareroot · 📑 https://huggingface.co/datasets/open-llm-leaderboard/3rd-Degree-Burn__Llama-3.1-8B-Squareroot-details
3rd-Degree-Burn/Llama-3.1-8B-Squareroot
2bec01c2c5d53276eac2222c80190eb44ab2e6af
10.304808
apache-2.0
2
8
true
false
true
false
true
0.221344
22.134381
0.346094
8.618064
0.210725
21.072508
0.256711
0.894855
0.308917
0.78125
0.17495
8.327793
false
2024-10-10
2024-10-10
1
3rd-Degree-Burn/Llama-3.1-8B-Squareroot (Merge)
3rd-Degree-Burn_Llama-Squared-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
3rd-Degree-Burn/Llama-Squared-8B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/3rd-Degree-Burn__Llama-Squared-8B-details
3rd-Degree-Burn/Llama-Squared-8B
f30737e92b3a3fa0ef2a3f3ade487cc94ad34400
12.183192
0
8
false
true
true
false
true
0.275524
27.55245
0.443103
21.277103
0.042296
4.229607
0.271812
2.908277
0.308948
1.951823
0.236619
15.179891
false
2024-10-08
0
Removed
4season_final_model_test_v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
4season/final_model_test_v2 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/4season__final_model_test_v2-details
4season/final_model_test_v2
cf690c35d9cf0b0b6bf034fa16dbf88c56fe861c
21.91554
apache-2.0
0
21
true
true
true
false
false
0.319113
31.911329
0.634205
47.41067
0.013595
1.359517
0.327181
10.290828
0.431448
12.43099
0.352809
28.089908
false
2024-05-20
2024-06-27
0
4season/final_model_test_v2
AALF_gemma-2-27b-it-SimPO-37K_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
AALF/gemma-2-27b-it-SimPO-37K · 📑 https://huggingface.co/datasets/open-llm-leaderboard/AALF__gemma-2-27b-it-SimPO-37K-details
AALF/gemma-2-27b-it-SimPO-37K
27f15219df2000a16955c9403c3f38b5f3413b3d
9.298079
gemma
16
27
true
true
true
false
true
0.240653
24.065258
0.391134
15.307881
0
0
0.280201
4.026846
0.34876
1.595052
0.197141
10.79344
false
2024-08-13
2024-09-05
2
google/gemma-2-27b
AALF_gemma-2-27b-it-SimPO-37K-100steps_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
AALF/gemma-2-27b-it-SimPO-37K-100steps · 📑 https://huggingface.co/datasets/open-llm-leaderboard/AALF__gemma-2-27b-it-SimPO-37K-100steps-details
AALF/gemma-2-27b-it-SimPO-37K-100steps
d5cbf18b2eb90b77f5ddbb74cfcaeedfa692c90c
9.894336
gemma
10
27
true
true
true
false
true
0.256764
25.676427
0.393082
15.261078
0
0
0.288591
5.145414
0.332917
0.78125
0.212517
12.501847
false
2024-08-13
2024-09-21
2
google/gemma-2-27b
AELLM_gemma-2-aeria-infinity-9b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
AELLM/gemma-2-aeria-infinity-9b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/AELLM__gemma-2-aeria-infinity-9b-details
AELLM/gemma-2-aeria-infinity-9b
24e1de07258925d5ddb52134b66e2eb0d698dc11
28.344029
1
9
false
true
true
false
true
0.7594
75.93995
0.598334
42.090214
0
0
0.333893
11.185682
0.401969
9.046094
0.38622
31.802231
false
2024-10-09
2024-10-09
1
AELLM/gemma-2-aeria-infinity-9b (Merge)
AELLM_gemma-2-lyco-infinity-9b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
AELLM/gemma-2-lyco-infinity-9b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/AELLM__gemma-2-lyco-infinity-9b-details
AELLM/gemma-2-lyco-infinity-9b
2941a682fcbcfea3f1485c9e0691cc1d9edc742e
27.204937
0
10
false
true
true
false
true
0.731648
73.164758
0.583953
39.787539
0
0
0.32802
10.402685
0.400635
8.91276
0.378657
30.961879
false
2024-10-09
2024-10-09
1
AELLM/gemma-2-lyco-infinity-9b (Merge)
AGI-0_Artificium-llama3.1-8B-001_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
AGI-0/Artificium-llama3.1-8B-001 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/AGI-0__Artificium-llama3.1-8B-001-details
AGI-0/Artificium-llama3.1-8B-001
6bf3dcca3b75a06a4e04e5f944e709cccf4673fd
18.937941
unknown
33
8
true
true
true
false
true
0.524769
52.476872
0.425622
19.348898
0.102719
10.271903
0.26594
2.12528
0.379458
5.165625
0.318152
24.239066
false
2024-08-16
2024-09-08
0
AGI-0/Artificium-llama3.1-8B-001
AI-MO_NuminaMath-7B-CoT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
AI-MO/NuminaMath-7B-CoT · 📑 https://huggingface.co/datasets/open-llm-leaderboard/AI-MO__NuminaMath-7B-CoT-details
AI-MO/NuminaMath-7B-CoT
ff7e3044218efe64128bd9c21f9ec66c3de04324
12.946252
apache-2.0
12
6
true
true
true
false
true
0.268854
26.885442
0.431419
19.152364
0.079305
7.930514
0.26594
2.12528
0.330344
0.826302
0.286818
20.757609
false
2024-07-15
2024-09-10
1
deepseek-ai/deepseek-math-7b-base
AI-MO_NuminaMath-7B-TIR_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
AI-MO/NuminaMath-7B-TIR · 📑 https://huggingface.co/datasets/open-llm-leaderboard/AI-MO__NuminaMath-7B-TIR-details
AI-MO/NuminaMath-7B-TIR
c6e394cc0579423c9cde6df6cc192c07dae73388
11.790547
apache-2.0
315
6
true
true
true
false
false
0.275624
27.562423
0.414369
16.873547
0.017372
1.73716
0.258389
1.118568
0.350927
4.199219
0.273271
19.252364
false
2024-07-04
2024-07-11
1
deepseek-ai/deepseek-math-7b-base
AI-Sweden-Models_Llama-3-8B-instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
AI-Sweden-Models/Llama-3-8B-instruct · 📑 https://huggingface.co/datasets/open-llm-leaderboard/AI-Sweden-Models__Llama-3-8B-instruct-details
AI-Sweden-Models/Llama-3-8B-instruct
4e1c955228bdb4d69c1c4560e8d5872312a8f033
13.777204
llama3
9
8
true
true
true
false
true
0.240128
24.012841
0.417346
18.388096
0.004532
0.453172
0.26594
2.12528
0.477094
19.936719
0.259724
17.747119
false
2024-06-01
2024-06-27
2
meta-llama/Meta-Llama-3-8B
AI-Sweden-Models_gpt-sw3-40b_float16
float16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
AI-Sweden-Models/gpt-sw3-40b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/AI-Sweden-Models__gpt-sw3-40b-details
AI-Sweden-Models/gpt-sw3-40b
1af27994df1287a7fac1b10d60e40ca43a22a385
4.684081
other
10
39
true
true
true
false
false
0.14703
14.702988
0.326774
6.894934
0.006042
0.60423
0.234899
0
0.36324
2.838281
0.127576
3.064051
false
2023-02-22
2024-06-26
0
AI-Sweden-Models/gpt-sw3-40b
AbacusResearch_Jallabi-34B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
AbacusResearch/Jallabi-34B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/AbacusResearch__Jallabi-34B-details
AbacusResearch/Jallabi-34B
f65696da4ed82c9a20e94b200d9dccffa07af682
25.972084
apache-2.0
2
34
true
true
true
false
false
0.35286
35.286041
0.602338
43.615765
0.039275
3.927492
0.338926
11.856823
0.482177
20.238802
0.468168
40.90758
false
2024-03-01
2024-06-27
0
AbacusResearch/Jallabi-34B
Alibaba-NLP_gte-Qwen2-7B-instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
Alibaba-NLP/gte-Qwen2-7B-instruct · 📑 https://huggingface.co/datasets/open-llm-leaderboard/Alibaba-NLP__gte-Qwen2-7B-instruct-details
Alibaba-NLP/gte-Qwen2-7B-instruct
e26182b2122f4435e8b3ebecbf363990f409b45b
13.34324
apache-2.0
195
7
true
true
true
false
true
0.22554
22.554045
0.449514
21.925482
0.034743
3.47432
0.244966
0
0.355854
6.315104
0.332114
25.790485
false
2024-06-15
2024-08-05
0
Alibaba-NLP/gte-Qwen2-7B-instruct
ArliAI_ArliAI-RPMax-12B-v1.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
ArliAI/ArliAI-RPMax-12B-v1.1 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/ArliAI__ArliAI-RPMax-12B-v1.1-details
ArliAI/ArliAI-RPMax-12B-v1.1
645db1cf8ad952eb57854a133e8e15303b898b04
20.636461
apache-2.0
40
12
true
true
true
false
true
0.534885
53.488522
0.475182
24.809063
0.092145
9.214502
0.281879
4.250559
0.361844
5.563802
0.338431
26.492317
false
2024-08-31
2024-09-05
0
ArliAI/ArliAI-RPMax-12B-v1.1
ArliAI_Llama-3.1-8B-ArliAI-RPMax-v1.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.1 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/ArliAI__Llama-3.1-8B-ArliAI-RPMax-v1.1-details
ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.1
540bd352e59c63900af91b95a932b33aaee70c76
23.640028
llama3
28
8
true
true
true
false
true
0.635902
63.590163
0.501561
28.787014
0.113293
11.329305
0.283557
4.474273
0.357688
5.310938
0.355136
28.348478
false
2024-08-23
2024-09-19
0
ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.1
Artples_L-MChat-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Artples/L-MChat-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Artples/L-MChat-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Artples__L-MChat-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Artples/L-MChat-7b
e10137f5cbfc1b73068d6473e4a87241cca0b3f4
21.024495
apache-2.0
1
7
true
false
true
false
true
0.529665
52.966462
0.460033
24.201557
0.079305
7.930514
0.305369
7.38255
0.402865
8.12474
0.32987
25.54115
false
2024-04-02
2024-07-07
1
Artples/L-MChat-7b (Merge)
Artples_L-MChat-Small_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/Artples/L-MChat-Small" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Artples/L-MChat-Small</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Artples__L-MChat-Small-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Artples/L-MChat-Small
52484c277f6062c12dc6d6b6397ee0d0c21b0126
14.866273
mit
1
2
true
false
true
false
true
0.328706
32.870561
0.482256
26.856516
0.015861
1.586103
0.267617
2.348993
0.369594
9.265885
0.246426
16.269577
false
2024-04-11
2024-07-07
1
Artples/L-MChat-Small (Merge)
Aryanne_SuperHeart_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Aryanne/SuperHeart" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Aryanne/SuperHeart</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Aryanne__SuperHeart-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Aryanne/SuperHeart
02b5050d7e600ce3db81a19638f6043c895d60cf
25.267673
llama3.1
1
8
true
false
true
false
false
0.519223
51.922344
0.521538
31.893554
0.138973
13.897281
0.301174
6.823266
0.443573
14.713281
0.391207
32.356309
false
2024-09-23
2024-09-23
1
Aryanne/SuperHeart (Merge)
AtAndDev_Qwen2.5-1.5B-continuous-learnt_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/AtAndDev/Qwen2.5-1.5B-continuous-learnt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AtAndDev/Qwen2.5-1.5B-continuous-learnt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AtAndDev__Qwen2.5-1.5B-continuous-learnt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
AtAndDev/Qwen2.5-1.5B-continuous-learnt
01c0981db9cf0f146fe050065f17343af75a8aa6
16.518524
0
1
false
true
true
false
true
0.460521
46.052142
0.425775
19.537666
0.074773
7.477341
0.26594
2.12528
0.363646
3.789063
0.281167
20.129654
false
2024-10-13
0
Removed
AtAndDev_Qwen2.5-1.5B-continuous-learnt_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/AtAndDev/Qwen2.5-1.5B-continuous-learnt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AtAndDev/Qwen2.5-1.5B-continuous-learnt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AtAndDev__Qwen2.5-1.5B-continuous-learnt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
AtAndDev/Qwen2.5-1.5B-continuous-learnt
01c0981db9cf0f146fe050065f17343af75a8aa6
16.325449
0
1
false
true
true
false
true
0.451054
45.105431
0.42747
19.766409
0.077795
7.779456
0.270134
2.684564
0.362281
2.551823
0.280585
20.065012
false
2024-10-18
0
Removed
Azure99_blossom-v5-32b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Azure99/blossom-v5-32b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Azure99/blossom-v5-32b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Azure99__blossom-v5-32b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Azure99/blossom-v5-32b
ccd4d86e3de01187043683dea1e28df904f7408e
26.226674
apache-2.0
4
32
true
true
true
false
true
0.523544
52.35442
0.595455
42.883056
0.096677
9.667674
0.311242
8.165548
0.402
8.35
0.423454
35.939347
false
2024-04-29
2024-09-21
0
Azure99/blossom-v5-32b
Azure99_blossom-v5-llama3-8b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Azure99/blossom-v5-llama3-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Azure99/blossom-v5-llama3-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Azure99__blossom-v5-llama3-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Azure99/blossom-v5-llama3-8b
91ea35e2e65516988021e4bb3b908e3e497e05c2
14.410141
apache-2.0
4
8
true
true
true
false
true
0.434293
43.429323
0.418491
18.306535
0.04003
4.003021
0.265101
2.013423
0.367021
5.310938
0.220578
13.397606
false
2024-04-20
2024-09-21
0
Azure99/blossom-v5-llama3-8b
Azure99_blossom-v5.1-34b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Azure99/blossom-v5.1-34b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Azure99/blossom-v5.1-34b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Azure99__blossom-v5.1-34b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Azure99/blossom-v5.1-34b
2c803204f5dbf4ce37e2df98eb0205cdc53de10d
28.385288
apache-2.0
5
34
true
true
true
false
true
0.569656
56.965629
0.610911
44.147705
0.14426
14.425982
0.309564
7.941834
0.392792
7.298958
0.455785
39.531619
false
2024-05-19
2024-07-27
0
Azure99/blossom-v5.1-34b
Azure99_blossom-v5.1-9b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Azure99/blossom-v5.1-9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Azure99/blossom-v5.1-9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Azure99__blossom-v5.1-9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Azure99/blossom-v5.1-9b
6044a3dc1e04529fe883aa513d37f266a320d793
24.682682
apache-2.0
2
8
true
true
true
false
true
0.508582
50.858167
0.534329
34.201244
0.104985
10.498489
0.33557
11.409396
0.399396
8.024479
0.397939
33.104314
false
2024-05-15
2024-07-24
0
Azure99/blossom-v5.1-9b
BAAI_Gemma2-9B-IT-Simpo-Infinity-Preference_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Gemma2-9B-IT-Simpo-Infinity-Preference" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Gemma2-9B-IT-Simpo-Infinity-Preference</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Gemma2-9B-IT-Simpo-Infinity-Preference-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Gemma2-9B-IT-Simpo-Infinity-Preference
028a91b1a4f14d365c6db08093b03348455c7bad
20.984069
14
9
false
true
true
false
true
0.317638
31.763831
0.597946
42.190844
0
0
0.339765
11.96868
0.396573
8.104948
0.386885
31.876108
false
2024-08-28
2024-09-05
2
google/gemma-2-9b
BAAI_Infinity-Instruct-3M-0613-Llama3-70B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0613-Llama3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0613-Llama3-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0613-Llama3-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-3M-0613-Llama3-70B
9fc53668064bdda22975ca72c5a287f8241c95b3
34.470489
apache-2.0
5
70
true
true
true
false
true
0.682113
68.211346
0.664161
51.327161
0.148792
14.879154
0.358221
14.42953
0.45226
16.532552
0.472989
41.443189
false
2024-06-27
2024-06-28
0
BAAI/Infinity-Instruct-3M-0613-Llama3-70B
BAAI_Infinity-Instruct-3M-0613-Mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0613-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0613-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0613-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-3M-0613-Mistral-7B
c7a742e539ec264b9eaeefe2aed29e92e8a7ebd6
22.041768
apache-2.0
11
7
true
true
true
false
true
0.531987
53.198735
0.495823
28.992936
0.066465
6.646526
0.296141
6.152125
0.435083
13.252083
0.316074
24.0082
false
2024-06-21
2024-06-27
0
BAAI/Infinity-Instruct-3M-0613-Mistral-7B
BAAI_Infinity-Instruct-3M-0625-Llama3-70B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0625-Llama3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0625-Llama3-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0625-Llama3-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-3M-0625-Llama3-70B
6d8ceada57e55cff3503191adc4d6379ff321fe2
35.877866
apache-2.0
3
70
true
true
true
false
true
0.744212
74.421202
0.667034
52.028162
0.163142
16.314199
0.357383
14.317673
0.461656
18.340365
0.45861
39.845597
false
2024-07-09
2024-08-30
0
BAAI/Infinity-Instruct-3M-0625-Llama3-70B
BAAI_Infinity-Instruct-3M-0625-Llama3-8B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0625-Llama3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0625-Llama3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0625-Llama3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-3M-0625-Llama3-8B
7be7c0ff1e35c3bb781c47222da99a1724f5f1da
21.47089
apache-2.0
3
8
true
true
true
false
true
0.605027
60.502688
0.495499
28.988222
0.05287
5.287009
0.275168
3.355705
0.371208
5.667708
0.325216
25.02401
false
2024-07-09
2024-07-13
0
BAAI/Infinity-Instruct-3M-0625-Llama3-8B
BAAI_Infinity-Instruct-3M-0625-Mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0625-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0625-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0625-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-3M-0625-Mistral-7B
302e3ae0bcc50dae3fb69fc1b08b518398e8c407
22.692368
apache-2.0
3
7
true
true
true
false
true
0.586742
58.674207
0.493967
28.823289
0.067221
6.722054
0.286913
4.9217
0.42724
12.238281
0.322972
24.774675
false
2024-07-09
2024-08-05
0
BAAI/Infinity-Instruct-3M-0625-Mistral-7B
BAAI_Infinity-Instruct-3M-0625-Qwen2-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0625-Qwen2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0625-Qwen2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0625-Qwen2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-3M-0625-Qwen2-7B
503c24156d7682458686a7b5324f7f886e63470d
24.009476
apache-2.0
8
7
true
true
true
false
true
0.555393
55.539302
0.534591
34.656829
0.061178
6.117825
0.312919
8.389262
0.38876
6.461719
0.396027
32.891918
false
2024-07-09
2024-08-05
0
BAAI/Infinity-Instruct-3M-0625-Qwen2-7B
BAAI_Infinity-Instruct-3M-0625-Yi-1.5-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-3M-0625-Yi-1.5-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-3M-0625-Yi-1.5-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-3M-0625-Yi-1.5-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-3M-0625-Yi-1.5-9B
a42c86c61b98ca4fdf238d688fe6ea11cf414d29
27.742141
apache-2.0
3
8
true
true
true
false
true
0.518598
51.859843
0.550912
35.378707
0.139728
13.97281
0.354027
13.870246
0.457531
16.72474
0.411818
34.646498
false
2024-07-09
2024-08-05
0
BAAI/Infinity-Instruct-3M-0625-Yi-1.5-9B
BAAI_Infinity-Instruct-7M-0729-Llama3_1-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-0729-Llama3_1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-0729-Llama3_1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-0729-Llama3_1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-7M-0729-Llama3_1-8B
0aca33fd7500a781d041e8bf7e5e3789b03f54f4
22.943899
llama3.1
8
8
true
true
true
false
true
0.613195
61.319521
0.507734
30.888805
0.097432
9.743202
0.292785
5.704698
0.357844
5.297135
0.32239
24.710033
false
2024-08-02
2024-08-05
0
BAAI/Infinity-Instruct-7M-0729-Llama3_1-8B
BAAI_Infinity-Instruct-7M-0729-mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-0729-mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-0729-mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-0729-mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-7M-0729-mistral-7B
36651591cb13346ecbde23832013e024029700fa
22.763277
apache-2.0
3
7
true
true
true
false
true
0.616193
61.619281
0.496381
28.697915
0.055891
5.589124
0.290268
5.369128
0.406188
10.040104
0.327377
25.264111
false
2024-07-25
2024-08-05
0
BAAI/Infinity-Instruct-7M-0729-mistral-7B
BAAI_Infinity-Instruct-7M-Gen-Llama3_1-70B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-Gen-Llama3_1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-Gen-Llama3_1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-Gen-Llama3_1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-7M-Gen-Llama3_1-70B
1ef63c4993a8c723c9695c827295c17080a64435
36.792107
llama3.1
14
70
true
true
true
false
true
0.733546
73.354588
0.66952
52.498947
0.210725
21.072508
0.375839
16.778523
0.453906
16.971615
0.460688
40.076463
false
2024-07-25
2024-09-26
0
BAAI/Infinity-Instruct-7M-Gen-Llama3_1-70B
BAAI_Infinity-Instruct-7M-Gen-Llama3_1-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-Gen-Llama3_1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-Gen-Llama3_1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-Gen-Llama3_1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-7M-Gen-Llama3_1-8B
56f9c2845ae024eb8b1dd9ea0d8891cbaf33c596
22.943899
llama3.1
8
8
true
true
true
false
true
0.613195
61.319521
0.507734
30.888805
0.097432
9.743202
0.292785
5.704698
0.357844
5.297135
0.32239
24.710033
false
2024-08-02
2024-08-29
0
BAAI/Infinity-Instruct-7M-Gen-Llama3_1-8B
BAAI_Infinity-Instruct-7M-Gen-mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/Infinity-Instruct-7M-Gen-mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/Infinity-Instruct-7M-Gen-mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__Infinity-Instruct-7M-Gen-mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/Infinity-Instruct-7M-Gen-mistral-7B
82c83d670a8954f4250547b53a057dea1fbd460d
22.737882
apache-2.0
3
7
true
true
true
false
true
0.614669
61.466908
0.496381
28.697915
0.055891
5.589124
0.290268
5.369128
0.406188
10.040104
0.327377
25.264111
false
2024-07-25
2024-08-29
0
BAAI/Infinity-Instruct-7M-Gen-mistral-7B
BAAI_OPI-Llama-3.1-8B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BAAI/OPI-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BAAI/OPI-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BAAI__OPI-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BAAI/OPI-Llama-3.1-8B-Instruct
48504799d009b4e1b29e6d2948a7cde68acdc3b0
8.305018
llama3.1
1
8
true
true
true
false
true
0.207455
20.745511
0.355122
9.768712
0
0
0.274329
3.243848
0.323302
3.579427
0.212434
12.492612
false
2024-09-06
2024-09-21
2
meta-llama/Meta-Llama-3.1-8B
BEE-spoke-data_Meta-Llama-3-8Bee_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/Meta-Llama-3-8Bee" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/Meta-Llama-3-8Bee</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__Meta-Llama-3-8Bee-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/Meta-Llama-3-8Bee
8143e34e77a49a30ec2617c5c9cc22cb3cda2287
14.494166
llama3
0
8
true
true
true
false
false
0.195066
19.506576
0.462636
24.199033
0.03852
3.851964
0.313758
8.501119
0.365406
6.242448
0.321975
24.663859
false
2024-04-28
2024-07-04
1
meta-llama/Meta-Llama-3-8B
BEE-spoke-data_smol_llama-101M-GQA_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-101M-GQA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-101M-GQA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-101M-GQA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/smol_llama-101M-GQA
bb26643db413bada7e0c3c50752bf9da82403dba
3.918895
apache-2.0
26
0
true
true
true
false
false
0.138437
13.843712
0.301756
3.198004
0
0
0.25755
1.006711
0.371271
4.275521
0.110705
1.189421
false
2023-10-26
2024-07-06
0
BEE-spoke-data/smol_llama-101M-GQA
BEE-spoke-data_smol_llama-220M-GQA_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-220M-GQA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-220M-GQA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-220M-GQA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/smol_llama-220M-GQA
8845b1d3c0bc73522ef2700aab467183cbdca9f7
6.401567
apache-2.0
11
0
true
true
true
false
false
0.238605
23.860468
0.303167
3.037843
0
0
0.255872
0.782998
0.405875
9.067708
0.114943
1.660387
false
2023-12-22
2024-06-26
0
BEE-spoke-data/smol_llama-220M-GQA
BEE-spoke-data_smol_llama-220M-GQA-fineweb_edu_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-220M-GQA-fineweb_edu-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu
dec16b41d5e94070dbc1f8449a554373fd4cc1d1
6.516558
apache-2.0
1
0
true
true
true
false
false
0.198812
19.881248
0.292905
2.314902
0
0
0.259228
1.230425
0.43676
14.261719
0.112699
1.411052
false
2024-06-08
2024-06-26
1
BEE-spoke-data/smol_llama-220M-GQA
BEE-spoke-data_smol_llama-220M-openhermes_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-220M-openhermes" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-220M-openhermes</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-220M-openhermes-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/smol_llama-220M-openhermes
fb4bcd4b7eee363baacb4176a26cea2aaeb173f4
4.761772
apache-2.0
5
0
true
true
true
false
false
0.155523
15.55229
0.302752
3.107692
0
0
0.267617
2.348993
0.384729
6.224479
0.112035
1.337175
false
2023-12-30
2024-09-21
1
BEE-spoke-data/smol_llama-220M-GQA
BEE-spoke-data_tFINE-900m-e16-d32-flan_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
T5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/tFINE-900m-e16-d32-flan" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/tFINE-900m-e16-d32-flan</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__tFINE-900m-e16-d32-flan-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/tFINE-900m-e16-d32-flan
d9ffec9798402d13d8f2c56ec3de3ad092445297
4.433887
apache-2.0
0
0
true
true
true
false
false
0.150577
15.057714
0.302804
4.411894
0
0
0.233221
0
0.372417
3.71875
0.130735
3.414967
false
2024-09-06
2024-09-13
1
pszemraj/tFINE-900m-e16-d32-1024ctx
BEE-spoke-data_tFINE-900m-e16-d32-flan-infinity-instruct-7m-T2T_en-1024_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
T5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/tFINE-900m-e16-d32-flan-infinity-instruct-7m-T2T_en-1024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/tFINE-900m-e16-d32-flan-infinity-instruct-7m-T2T_en-1024</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__tFINE-900m-e16-d32-flan-infinity-instruct-7m-T2T_en-1024-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/tFINE-900m-e16-d32-flan-infinity-instruct-7m-T2T_en-1024
b1e2f12f5224be9f7da0cb5ff30e1bbb3f10f6ca
5.823653
apache-2.0
0
0
true
true
true
false
false
0.132067
13.206736
0.313779
4.737018
0
0
0.254195
0.559284
0.439271
13.808854
0.12367
2.630024
false
2024-09-10
2024-09-14
2
pszemraj/tFINE-900m-e16-d32-1024ctx
BEE-spoke-data_tFINE-900m-e16-d32-instruct_2e_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
T5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/tFINE-900m-e16-d32-instruct_2e" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/tFINE-900m-e16-d32-instruct_2e</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__tFINE-900m-e16-d32-instruct_2e-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/tFINE-900m-e16-d32-instruct_2e
4c626138c9f4e0c3eafe74b2755eb89334c7ca59
5.681552
apache-2.0
0
0
true
true
true
false
false
0.140286
14.028555
0.313457
5.01307
0
0
0.259228
1.230425
0.420698
11.18724
0.12367
2.630024
false
2024-09-17
2024-09-22
3
pszemraj/tFINE-900m-e16-d32-1024ctx
BEE-spoke-data_tFINE-900m-instruct-orpo_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
T5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/BEE-spoke-data/tFINE-900m-instruct-orpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/tFINE-900m-instruct-orpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__tFINE-900m-instruct-orpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BEE-spoke-data/tFINE-900m-instruct-orpo
e0a21c79bac74442252d36e2c01403afa3f0971b
3.431957
apache-2.0
0
0
true
true
true
false
true
0.132992
13.299157
0.302209
3.267301
0
0
0.259228
1.230425
0.340854
1.106771
0.115193
1.688091
false
2024-09-22
2024-09-23
0
BEE-spoke-data/tFINE-900m-instruct-orpo
Ba2han_Llama-Phi-3_DoRA_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Ba2han/Llama-Phi-3_DoRA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Ba2han/Llama-Phi-3_DoRA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Ba2han__Llama-Phi-3_DoRA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Ba2han/Llama-Phi-3_DoRA
36f99064a7be8ba475c2ee5c5424e95c263ccb87
25.142604
mit
6
3
true
true
true
false
true
0.513053
51.305314
0.551456
37.249164
0.101964
10.196375
0.326342
10.178971
0.406927
9.532552
0.391539
32.393248
false
2024-05-15
2024-06-26
0
Ba2han/Llama-Phi-3_DoRA
BenevolenceMessiah_Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/BenevolenceMessiah/Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BenevolenceMessiah/Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BenevolenceMessiah__Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BenevolenceMessiah/Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0
d90f6e36584dc9b367461701e83c833bdeb736f2
15.045915
apache-2.0
0
28
true
false
false
false
false
0.301153
30.115316
0.490867
26.877991
0.04003
4.003021
0.262584
1.677852
0.407979
8.930729
0.268035
18.670582
false
2024-09-21
2024-09-22
1
BenevolenceMessiah/Yi-Coder-9B-Chat-Instruct-TIES-MoE-v1.0 (Merge)
BlackBeenie_llama-3-luminous-merged_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/llama-3-luminous-merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/llama-3-luminous-merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__llama-3-luminous-merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/llama-3-luminous-merged
64288dd8e3305f2dc11d84fe0c653f351b2e8a9d
21.480108
0
8
false
true
true
false
false
0.432345
43.234507
0.515392
30.643687
0.07855
7.854985
0.292785
5.704698
0.414896
10.628646
0.377327
30.814125
false
2024-09-15
2024-10-11
1
BlackBeenie/llama-3-luminous-merged (Merge)
BlackBeenie_llama-3.1-8B-Galore-openassistant-guanaco_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BlackBeenie__llama-3.1-8B-Galore-openassistant-guanaco-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco
828fa03c10e9085700b7abbe26f95067fab010fd
18.072101
1
8
false
true
true
false
false
0.263484
26.348422
0.521337
31.444705
0.048338
4.833837
0.300336
6.711409
0.440625
14.578125
0.320645
24.516105
false
2024-10-16
2024-10-19
0
BlackBeenie/llama-3.1-8B-Galore-openassistant-guanaco
BoltMonkey_DreadMix_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BoltMonkey/DreadMix" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BoltMonkey/DreadMix</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BoltMonkey__DreadMix-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BoltMonkey/DreadMix
ab5dbaaff606538db73b6fd89aa169760104a566
28.459617
0
8
false
true
true
false
true
0.709491
70.949082
0.54351
34.845015
0.137462
13.746224
0.299497
6.599553
0.421219
13.61901
0.378989
30.998818
false
2024-10-12
2024-10-13
1
BoltMonkey/DreadMix (Merge)
BoltMonkey_NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BoltMonkey__NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated
969e4c9b41e733a367f5ea18ed50a6171b5e2357
27.499695
llama3.1
1
8
true
false
true
false
true
0.799891
79.989096
0.515199
30.7599
0.102719
10.271903
0.28104
4.138702
0.401875
9.467708
0.373338
30.370863
false
2024-10-01
2024-10-10
1
BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated (Merge)
BoltMonkey_NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BoltMonkey__NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated
969e4c9b41e733a367f5ea18ed50a6171b5e2357
21.345511
llama3.1
1
8
true
false
true
false
false
0.459023
45.902317
0.518544
30.793785
0.093656
9.365559
0.274329
3.243848
0.40826
9.532552
0.363115
29.235003
false
2024-10-01
2024-10-01
1
BoltMonkey/NeuralDaredevil-SuperNova-Lite-7B-DARETIES-abliterated (Merge)
BoltMonkey_SuperNeuralDreadDevil-8b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BoltMonkey/SuperNeuralDreadDevil-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BoltMonkey/SuperNeuralDreadDevil-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BoltMonkey__SuperNeuralDreadDevil-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BoltMonkey/SuperNeuralDreadDevil-8b
804d5864127e603abec179a159b43f446246fafc
21.671492
1
8
false
true
true
false
true
0.485801
48.580101
0.515108
30.606714
0.08006
8.006042
0.285235
4.697987
0.415948
10.426823
0.349402
27.711288
false
2024-10-13
2024-10-13
1
BoltMonkey/SuperNeuralDreadDevil-8b (Merge)
BrainWave-ML_llama3.2-3B-maths-orpo_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/BrainWave-ML/llama3.2-3B-maths-orpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BrainWave-ML/llama3.2-3B-maths-orpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BrainWave-ML__llama3.2-3B-maths-orpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BrainWave-ML/llama3.2-3B-maths-orpo
d149d83d8e8f3883421d800848fec85766181923
5.076083
apache-2.0
1
3
true
true
true
false
false
0.204907
20.490742
0.291178
2.347041
0
0
0.259228
1.230425
0.357531
4.52474
0.116772
1.863549
false
2024-10-24
2024-10-24
2
meta-llama/Llama-3.2-3B-Instruct
BramVanroy_GEITje-7B-ultra_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/BramVanroy/GEITje-7B-ultra" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BramVanroy/GEITje-7B-ultra</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BramVanroy__GEITje-7B-ultra-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BramVanroy/GEITje-7B-ultra
d4552cdc6f015754646464d8411aa4f6bcdba8e8
10.909606
cc-by-nc-4.0
37
7
true
true
true
false
true
0.372344
37.234427
0.377616
12.879913
0.009063
0.906344
0.262584
1.677852
0.328979
1.522396
0.20113
11.236702
false
2024-01-27
2024-10-28
3
mistralai/Mistral-7B-v0.1
BramVanroy_fietje-2_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/BramVanroy/fietje-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BramVanroy/fietje-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BramVanroy__fietje-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BramVanroy/fietje-2
3abe75d01094b713368e3d911ffb78a2d66ead22
9.027007
mit
6
2
true
true
true
false
false
0.209803
20.980332
0.403567
15.603676
0.009063
0.906344
0.254195
0.559284
0.369563
5.161979
0.198554
10.950428
false
2024-04-09
2024-10-28
1
microsoft/phi-2
BramVanroy_fietje-2-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/BramVanroy/fietje-2-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BramVanroy/fietje-2-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BramVanroy__fietje-2-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BramVanroy/fietje-2-chat
364e785d90438b787b94e33741a930c9932353c0
10.388869
mit
1
2
true
true
true
false
true
0.291736
29.173593
0.414975
17.718966
0.005287
0.528701
0.239933
0
0.35276
3.195052
0.205452
11.716903
false
2024-04-29
2024-10-28
3
microsoft/phi-2
BramVanroy_fietje-2-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/BramVanroy/fietje-2-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BramVanroy/fietje-2-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BramVanroy__fietje-2-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
BramVanroy/fietje-2-instruct
b7b44797cd52eda1182667217e8371dbdfee4976
10.196192
mit
2
2
true
true
true
false
true
0.278996
27.89964
0.413607
17.57248
0.005287
0.528701
0.233221
0
0.336917
2.914583
0.210356
12.261746
false
2024-04-27
2024-10-28
2
microsoft/phi-2
Casual-Autopsy_L3-Umbral-Mind-RP-v2.0-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Casual-Autopsy__L3-Umbral-Mind-RP-v2.0-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B
b46c066ea8387264858dc3461f382e7b42fd9c48
25.76087
llama3
12
8
true
false
true
false
true
0.712263
71.226346
0.526241
32.486278
0.101208
10.120846
0.286913
4.9217
0.368667
5.55
0.37234
30.260047
false
2024-06-26
2024-07-02
1
Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B (Merge)
CausalLM_14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CausalLM/14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CausalLM/14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CausalLM__14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CausalLM/14B
cc054cf5953252d0709cb3267d1a85246e489e95
16.530646
wtfpl
303
14
true
true
true
false
false
0.278821
27.882131
0.470046
24.780943
0.033233
3.323263
0.302852
7.04698
0.415479
11.468229
0.322141
24.682329
true
2023-10-22
2024-06-12
0
CausalLM/14B
CausalLM_34b-beta_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/CausalLM/34b-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CausalLM/34b-beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CausalLM__34b-beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CausalLM/34b-beta
0429951eb30ccdfff3515e711aaa7649a8a7364c
23.18454
gpl-3.0
62
34
true
true
true
false
false
0.304325
30.432475
0.5591
36.677226
0.041541
4.154079
0.346477
12.863535
0.374865
6.92474
0.532497
48.055186
true
2024-02-06
2024-06-26
0
CausalLM/34b-beta
Changgil_K2S3-14b-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Changgil/K2S3-14b-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Changgil/K2S3-14b-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Changgil__K2S3-14b-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Changgil/K2S3-14b-v0.2
b4f0e1eed2640df2b75847ff37e6ebb1be217b6c
15.074375
cc-by-nc-4.0
0
14
true
true
true
false
false
0.324284
32.428401
0.461331
24.283947
0.045317
4.531722
0.28104
4.138702
0.39226
6.799219
0.264378
18.264258
false
2024-06-17
2024-06-27
0
Changgil/K2S3-14b-v0.2
Changgil_K2S3-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Changgil/K2S3-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Changgil/K2S3-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Changgil__K2S3-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Changgil/K2S3-v0.1
d544e389f091983bb4f11314edb526d81753c919
14.751167
cc-by-nc-4.0
0
14
true
true
true
false
false
0.327656
32.765617
0.465549
24.559558
0.040785
4.07855
0.264262
1.901566
0.401406
7.842448
0.256233
17.359264
false
2024-04-29
2024-06-27
0
Changgil/K2S3-v0.1
ClaudioItaly_Albacus_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ClaudioItaly/Albacus" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ClaudioItaly/Albacus</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ClaudioItaly__Albacus-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ClaudioItaly/Albacus
a53faf62d0f99b67478ed9d262872c821a3ba83c
20.392281
mit
1
8
true
false
true
false
false
0.466742
46.674158
0.511304
31.638865
0.064199
6.41994
0.271812
2.908277
0.413531
10.658073
0.316489
24.054374
false
2024-09-08
2024-09-08
1
ClaudioItaly/Albacus (Merge)
ClaudioItaly_Book-Gut12B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ClaudioItaly/Book-Gut12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ClaudioItaly/Book-Gut12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ClaudioItaly__Book-Gut12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ClaudioItaly/Book-Gut12B
ae54351faca8170c93bf1de3a51bf16650f5bcf5
23.154924
mit
1
12
true
false
true
false
false
0.399847
39.984685
0.541737
34.632193
0.087613
8.761329
0.307047
7.606264
0.463542
18.276042
0.367021
29.669031
false
2024-09-12
2024-09-17
1
ClaudioItaly/Book-Gut12B (Merge)
ClaudioItaly_Evolutionstory-7B-v2.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ClaudioItaly/Evolutionstory-7B-v2.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ClaudioItaly/Evolutionstory-7B-v2.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ClaudioItaly__Evolutionstory-7B-v2.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ClaudioItaly/Evolutionstory-7B-v2.2
9f838721d24a5195bed59a5ed8d9af536f7f2459
20.697542
mit
1
7
true
false
true
false
false
0.481379
48.137941
0.510804
31.623865
0.064199
6.41994
0.275168
3.355705
0.413531
10.658073
0.315908
23.989731
false
2024-08-30
2024-09-01
1
ClaudioItaly/Evolutionstory-7B-v2.2 (Merge)
CohereForAI_aya-23-35B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/aya-23-35B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/aya-23-35B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-23-35B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/aya-23-35B
31d6fd858f20539a55401c7ad913086f54d9ca2c
24.616939
cc-by-nc-4.0
263
34
true
true
true
false
true
0.646193
64.619321
0.539955
34.85836
0.026435
2.643505
0.294463
5.928412
0.43099
13.473698
0.335605
26.178339
true
2024-05-19
2024-06-12
0
CohereForAI/aya-23-35B
CohereForAI_aya-23-8B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/aya-23-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/aya-23-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-23-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/aya-23-8B
ec151d218a24031eb039d92fb83d10445427efc9
15.973219
cc-by-nc-4.0
388
8
true
true
true
false
true
0.469889
46.988878
0.429616
20.203761
0.01435
1.435045
0.284396
4.58613
0.394063
8.424479
0.227809
14.20102
true
2024-05-19
2024-06-12
0
CohereForAI/aya-23-8B
CohereForAI_aya-expanse-32b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/aya-expanse-32b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/aya-expanse-32b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-expanse-32b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/aya-expanse-32b
08b69cfa4240e2009c80ad304f000b491d1b8c38
29.391219
cc-by-nc-4.0
159
32
true
true
true
false
true
0.730174
73.017372
0.564867
38.709611
0.133686
13.36858
0.325503
10.067114
0.387271
6.408854
0.412982
34.775783
true
2024-10-23
2024-10-24
0
CohereForAI/aya-expanse-32b
CohereForAI_aya-expanse-8b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/aya-expanse-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/aya-expanse-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-expanse-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/aya-expanse-8b
b9848575c8731981dfcf2e1f3bfbcb917a2e585d
22.142223
cc-by-nc-4.0
251
8
true
true
true
false
true
0.635852
63.585176
0.49772
28.523483
0.070242
7.024169
0.302852
7.04698
0.372885
4.410677
0.300366
22.262855
true
2024-10-23
2024-10-24
0
CohereForAI/aya-expanse-8b
CohereForAI_c4ai-command-r-plus_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/c4ai-command-r-plus" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/c4ai-command-r-plus</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-plus-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/c4ai-command-r-plus
fa1bd7fb1572ceb861bbbbecfa8af83b29fa8cca
30.860542
cc-by-nc-4.0
1678
103
true
true
true
false
true
0.766419
76.641866
0.581542
39.919954
0.075529
7.55287
0.305369
7.38255
0.480719
20.423177
0.399186
33.242834
true
2024-04-03
2024-06-13
0
CohereForAI/c4ai-command-r-plus
CohereForAI_c4ai-command-r-plus-08-2024_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/c4ai-command-r-plus-08-2024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/c4ai-command-r-plus-08-2024</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-plus-08-2024-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/c4ai-command-r-plus-08-2024
2d8cf3ab0af78b9e43546486b096f86adf3ba4d0
33.420888
cc-by-nc-4.0
169
103
true
true
true
false
true
0.753954
75.395395
0.5996
42.836865
0.110272
11.02719
0.350671
13.422819
0.482948
19.835156
0.442071
38.007905
true
2024-08-21
2024-09-19
0
CohereForAI/c4ai-command-r-plus-08-2024
CohereForAI_c4ai-command-r-v01_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
<a target="_blank" href="https://huggingface.co/CohereForAI/c4ai-command-r-v01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CohereForAI/c4ai-command-r-v01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-v01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
CohereForAI/c4ai-command-r-v01
16881ccde1c68bbc7041280e6a66637bc46bfe88
25.349978
cc-by-nc-4.0
1065
34
true
true
true
false
true
0.674819
67.481948
0.540642
34.556659
0
0
0.307047
7.606264
0.451698
16.128906
0.336935
26.326093
true
2024-03-11
2024-06-13
0
CohereForAI/c4ai-command-r-v01
Columbia-NLP_LION-Gemma-2b-dpo-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/Columbia-NLP/LION-Gemma-2b-dpo-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Columbia-NLP/LION-Gemma-2b-dpo-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-dpo-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
a5f780075831374f8850324448acf94976dea504
11.483995
0
2
false
true
true
false
true
0.327831
32.783127
0.391996
14.585976
0.043051
4.305136
0.249161
0
0.41201
9.834635
0.166556
7.395095
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-dpo-v1.0