Column summary (name · dtype · length range / distinct values / value range):

eval_name · string · length 9–97
Precision · string · 5 distinct values
Type · string · 6 distinct values
T · string · 6 distinct values
Weight type · string · 3 distinct values
Architecture · string · 53 distinct values
Model · string · length 355–611
fullname · string · length 4–89
Model sha · string · length 0–40
Average ⬆️ · float64 · 27–81.3
Hub License · string · 35 distinct values
Hub ❤️ · int64 · 0–4.88k
#Params (B) · int64 · 0–238
Available on the hub · bool · 2 classes
Merged · bool · 2 classes
MoE · bool · 2 classes
Flagged · bool · 1 class
date · string · length 0–26
Chat Template · bool · 2 classes
ARC · float64 · 19.7–87.5
HellaSwag · float64 · 20.7–92.8
MMLU · float64 · 17.8–89.4
TruthfulQA · float64 · 27.9–82.3
Winogrande · float64 · 47.2–91.5
GSM8K · float64 · 0–88.2
Maintainers Choice · bool · 2 classes
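Each record below lists these fields in this order, one value per line (a few records omit a field such as Hub License or T). The sketch below shows one way to work with the table programmatically using the 🤗 datasets library; the repository id open-llm-leaderboard/contents is an assumption (only the per-model details_* datasets are linked in the records), so substitute the actual source of this snapshot.

```python
# Minimal loading sketch. The repo id is an assumption -- point it at the real
# source of this dump if it differs.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")
df = ds.to_pandas()

# A few columns from the schema above, ranked by the reported average.
cols = ["fullname", "Precision", "Type", "#Params (B)", "Average ⬆️", "GSM8K", "Flagged"]
print(df[cols].sort_values("Average ⬆️", ascending=False).head(10))
```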
sr5434_CodegebraGPT-10b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
Unknown
https://huggingface.co/sr5434/CodegebraGPT-10b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_sr5434__CodegebraGPT-10b
sr5434/CodegebraGPT-10b
15e64a7f77eba0367eedbaaacb3560351471093b
62.526891
apache-2.0
0
10
true
true
true
true
2023-12-30T13:47:48Z
false
59.556314
83.449512
60.067787
46.526705
81.057616
44.503412
false
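One relationship worth noting: in the records shown here, the Average ⬆️ value is the unweighted arithmetic mean of the six benchmark columns (ARC, HellaSwag, MMLU, TruthfulQA, Winogrande, GSM8K), and a null benchmark score (e.g. a null TruthfulQA) goes with a null average. A quick check against the first record above:

```python
# Verify that Average ⬆️ is the plain mean of the six benchmark scores,
# using the sr5434/CodegebraGPT-10b (bfloat16) record above.
scores = {
    "ARC": 59.556314,
    "HellaSwag": 83.449512,
    "MMLU": 60.067787,
    "TruthfulQA": 46.526705,
    "Winogrande": 81.057616,
    "GSM8K": 44.503412,
}
average = sum(scores.values()) / len(scores)
print(round(average, 6))  # 62.526891 -- matches the reported Average ⬆️
```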
sr5434_CodegebraGPT-10b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
Unknown
https://huggingface.co/sr5434/CodegebraGPT-10b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_sr5434__CodegebraGPT-10b
sr5434/CodegebraGPT-10b
263f3e4c48d6fb001cd556010ee50a0b6918b8cb
62.681011
apache-2.0
0
10
true
true
true
true
2024-01-04T22:29:21Z
false
59.812287
83.419638
60.195749
46.569773
80.97869
45.109932
false
sreeramajay_TinyLlama-1.1B-orca-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/sreeramajay/TinyLlama-1.1B-orca-v1.0 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_sreeramajay__TinyLlama-1.1B-orca-v1.0
sreeramajay/TinyLlama-1.1B-orca-v1.0
7dbbc8ccc85c1c3f1ce7cffbb62b97ca6d2ca046
37.170104
apache-2.0
0
1
true
true
true
true
2024-01-08T02:01:59Z
false
36.348123
61.232822
25.181909
36.578424
61.404893
2.27445
false
ssmits_Falcon2-5.5B-multilingual_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
FalconForCausalLM
https://huggingface.co/ssmits/Falcon2-5.5B-multilingual · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ssmits__Falcon2-5.5B-multilingual
ssmits/Falcon2-5.5B-multilingual
3fe6285c040d8856df5b188ea0a6f58b94c1c4a6
32.1885
apache-2.0
1
5
true
false
true
true
2024-05-21T20:15:10Z
false
26.109215
40.300737
23.91409
47.55834
55.248619
0
false
stabilityai_StableBeluga-13B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/stabilityai/StableBeluga-13B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__StableBeluga-13B
stabilityai/StableBeluga-13B
1d6eef4cc2b73f39600a568803ad8183f2da4514
57.049417
null
114
13
true
true
true
true
2023-09-09T10:52:17Z
false
62.030717
82.274447
57.708847
49.60965
76.874507
13.798332
true
stabilityai_StableBeluga-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/stabilityai/StableBeluga-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__StableBeluga-7B
stabilityai/StableBeluga-7B
329adcfc39f48dce183eb0b155b732dbe03c6304
53.563467
null
130
6
true
true
true
true
2023-10-16T12:48:18Z
false
56.313993
79.137622
52.712472
50.190723
75.217048
7.808946
true
stabilityai_StableBeluga1-Delta_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/stabilityai/StableBeluga1-Delta · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__StableBeluga1-Delta
stabilityai/StableBeluga1-Delta
40a78d91d43ad9aef6663ff15ddc15be9922bce5
54.082645
cc-by-nc-4.0
58
65
true
true
true
true
2023-09-09T10:52:17Z
false
68.174061
85.879307
64.829439
55.810378
49.802684
0
true
stabilityai_StableBeluga2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/stabilityai/StableBeluga2 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__StableBeluga2
stabilityai/StableBeluga2
e4944caa6ece819413b140b8dcecea79fe7e22cf
67.415064
null
884
0
true
true
true
true
2023-09-09T10:52:17Z
false
71.075085
86.367258
68.794887
59.440802
82.951855
35.8605
true
stabilityai_japanese-stablelm-base-gamma-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/stabilityai/japanese-stablelm-base-gamma-7b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__japanese-stablelm-base-gamma-7b
stabilityai/japanese-stablelm-base-gamma-7b
e1c3840c716485077b688296fefa8e5641249843
52.587985
apache-2.0
20
7
true
true
true
true
2023-12-11T04:22:29Z
false
50.341297
77.474607
54.745421
41.195837
73.954223
17.816528
true
stabilityai_japanese-stablelm-instruct-gamma-7b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/stabilityai/japanese-stablelm-instruct-gamma-7b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__japanese-stablelm-instruct-gamma-7b
stabilityai/japanese-stablelm-instruct-gamma-7b
044918151c5b3910d12f2e489fb7c60752048e1e
52.82207
apache-2.0
51
7
true
true
true
true
2023-12-11T04:23:11Z
false
50.682594
78.679546
54.823019
39.772807
73.717443
19.257013
true
stabilityai_stable-code-3b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
StableLmForCausalLM
https://huggingface.co/stabilityai/stable-code-3b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stable-code-3b
stabilityai/stable-code-3b
b676b54bae5e15369ecac43b0eef511e86ab58db
41.527143
other
620
2
true
true
true
true
2024-04-19T20:52:13Z
false
35.324232
57.448715
37.53183
41.602467
57.695343
19.560273
true
stabilityai_stablelm-2-12b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
StableLmForCausalLM
https://huggingface.co/stabilityai/stablelm-2-12b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-2-12b
stabilityai/stablelm-2-12b
c4e0a2bc86c7268af8ae434b755de04e40791a80
63.483391
other
105
12
true
true
true
true
2024-04-09T14:34:59Z
false
58.447099
84.325832
62.039927
42.159644
77.900552
56.027293
true
stabilityai_stablelm-2-12b-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
StableLmForCausalLM
https://huggingface.co/stabilityai/stablelm-2-12b-chat · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-2-12b-chat
stabilityai/stablelm-2-12b-chat
7141a7aecbbf6b7d71535a3f305767d89f266776
68.376272
other
80
12
true
true
false
true
2024-04-18T23:02:57Z
false
64.846416
85.958972
61.058661
62.014763
78.531965
57.846854
true
stabilityai_stablelm-2-1_6b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Unknown
https://huggingface.co/stabilityai/stablelm-2-1_6b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-2-1_6b
stabilityai/stablelm-2-1_6b
810b45c00ea0af42ded794f9e613f6fc52330921
45.254774
other
170
1
true
true
true
true
2024-01-24T09:46:21Z
false
43.34471
70.454093
38.946573
36.783858
64.561957
17.437453
true
stabilityai_stablelm-2-1_6b-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
StableLmForCausalLM
https://huggingface.co/stabilityai/stablelm-2-1_6b-chat · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-2-1_6b-chat
stabilityai/stablelm-2-1_6b-chat
b362325b3d08b5f37acc4a504b441b04414df818
50.710502
other
22
1
true
true
true
true
2024-04-16T19:37:30Z
false
43.515358
69.239195
41.472625
46.498739
64.719811
38.817286
true
stabilityai_stablelm-2-zephyr-1_6b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
StableLmForCausalLM
https://huggingface.co/stabilityai/stablelm-2-zephyr-1_6b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-2-zephyr-1_6b
stabilityai/stablelm-2-zephyr-1_6b
c89d7d19e9781974793a7e9b0fe55bcabcf8abc5
49.990909
other
174
1
true
true
true
true
2024-03-01T07:59:58Z
false
43.686007
69.298944
42.034783
45.112892
64.483031
35.329795
true
stabilityai_stablelm-3b-4e1t_float16
float16
🟢 pretrained
🟢
Original
StableLMEpochForCausalLM
https://huggingface.co/stabilityai/stablelm-3b-4e1t · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-3b-4e1t
stabilityai/stablelm-3b-4e1t
a4750ace0db6f08d7bbba0aa52a585f231ea3cde
46.579707
cc-by-sa-4.0
307
2
true
true
true
true
2023-11-08T10:34:08Z
false
46.587031
75.941048
45.225738
37.196774
71.191792
3.335861
true
stabilityai_stablelm-base-alpha-3b_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
https://huggingface.co/stabilityai/stablelm-base-alpha-3b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-3b
stabilityai/stablelm-base-alpha-3b
99567ccfe45fabe467c71393aa6716106edb83c2
31.496992
cc-by-sa-4.0
83
3
true
true
true
true
2023-10-16T12:48:18Z
false
26.450512
42.242581
25.430527
40.496579
53.906867
0.45489
true
stabilityai_stablelm-base-alpha-7b_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
https://huggingface.co/stabilityai/stablelm-base-alpha-7b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-7b
stabilityai/stablelm-base-alpha-7b
38366357b5a45e002af2d254ff3d559444ec2147
34.36517
cc-by-sa-4.0
211
7
true
true
true
true
2023-09-09T10:52:17Z
false
31.996587
51.782513
26.205161
40.193766
55.406472
0.60652
true
stabilityai_stablelm-base-alpha-7b-v2_float16
float16
🟢 pretrained
🟢
Original
StableLMAlphaForCausalLM
https://huggingface.co/stabilityai/stablelm-base-alpha-7b-v2 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-7b-v2
stabilityai/stablelm-base-alpha-7b-v2
eb3b56fee1ad4b1efe6625bbbc7a277df8ab5b96
46.178895
cc-by-sa-4.0
47
6
true
true
true
true
2023-11-08T10:34:08Z
false
47.354949
77.07628
45.09894
36.457202
68.508287
2.57771
true
stabilityai_stablelm-tuned-alpha-3b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
https://huggingface.co/stabilityai/stablelm-tuned-alpha-3b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-3b
stabilityai/stablelm-tuned-alpha-3b
d1c03d2114451d562416b9efe4281d319ceff99e
32.138678
cc-by-nc-sa-4.0
112
3
true
true
true
true
2023-09-09T10:52:17Z
false
27.8157
44.064927
23.080135
42.328762
55.011839
0.530705
true
stabilityai_stablelm-tuned-alpha-7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
https://huggingface.co/stabilityai/stablelm-tuned-alpha-7b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-7b
stabilityai/stablelm-tuned-alpha-7b
25071b093c15c0d1cb2b2876c6deb621b764fcf5
34.040302
cc-by-nc-sa-4.0
359
7
true
true
true
true
2023-09-09T10:52:17Z
false
31.911263
53.594901
24.412462
40.371618
53.117601
0.833965
true
stabilityai_stablelm-zephyr-3b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
StableLmForCausalLM
https://huggingface.co/stabilityai/stablelm-zephyr-3b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-zephyr-3b
stabilityai/stablelm-zephyr-3b
8b471c751c0e78cb46cf9f47738dd0eb45392071
53.425864
other
235
2
true
true
true
true
2024-03-01T07:42:28Z
false
46.075085
74.158534
46.167947
46.491393
65.509077
42.153146
true
stanford-oval_Llama-2-7b-WikiChat-fused_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/stanford-oval/Llama-2-7b-WikiChat-fused · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_stanford-oval__Llama-2-7b-WikiChat-fused
stanford-oval/Llama-2-7b-WikiChat-fused
47cc2d3e1719da0f0300d07111ea6a9b6e3aa2d0
46.811165
llama2
6
6
true
true
true
true
2024-01-16T04:35:24Z
false
50.682594
74.995021
39.69073
46.362057
69.060773
0.075815
false
starmpcc_Asclepius-Llama2-13B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/starmpcc/Asclepius-Llama2-13B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_starmpcc__Asclepius-Llama2-13B
starmpcc/Asclepius-Llama2-13B
579271bebb894d89369205060d151120a217ce81
50.25432
cc-by-nc-4.0
11
13
true
true
true
true
2023-11-16T12:47:35Z
false
55.887372
79.655447
52.380503
40.759568
72.691397
0.15163
false
starmpcc_Asclepius-Llama2-7B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/starmpcc/Asclepius-Llama2-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_starmpcc__Asclepius-Llama2-7B
starmpcc/Asclepius-Llama2-7B
2f15bd8250d7825307e59cc2c785074ebbec3395
47.146361
cc-by-nc-4.0
12
7
true
true
true
true
2023-11-16T12:47:29Z
false
50.853242
76.52858
43.612955
43.30862
68.271507
0.30326
false
statking_Meta-Llama-3-8B-Instruct-ORPO-QLoRA_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
https://huggingface.co/statking/Meta-Llama-3-8B-Instruct-ORPO-QLoRA · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_statking__Meta-Llama-3-8B-Instruct-ORPO-QLoRA
statking/Meta-Llama-3-8B-Instruct-ORPO-QLoRA
7986c4c8c104e260155844d339a97dabe9ad861c
64.457363
llama3
0
8
true
true
true
true
2024-05-23T11:28:06Z
false
58.191126
79.416451
65.59263
48.380169
76.5588
58.605004
false
statking_zephyr-7b-sft-full-orpo_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/statking/zephyr-7b-sft-full-orpo · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_statking__zephyr-7b-sft-full-orpo
statking/zephyr-7b-sft-full-orpo
797ee78eb44b3831c9102d1619af9f7493066098
53.160008
apache-2.0
0
7
true
true
true
true
2024-05-23T09:39:05Z
false
54.522184
81.886078
58.127475
46.365905
78.058406
0
false
steve-cse_MelloGPT_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/steve-cse/MelloGPT · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_steve-cse__MelloGPT
steve-cse/MelloGPT
aedecb296e2cdcb3da95a345a794ea26f071c419
57.588979
mit
10
0
true
true
true
true
2023-12-16T08:12:17Z
false
53.83959
76.120295
55.990584
55.609551
73.875296
30.09856
false
sumandas_llama3-openhermes-2.5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/sumandas/llama3-openhermes-2.5 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_sumandas__llama3-openhermes-2.5
sumandas/llama3-openhermes-2.5
6f4076b9c310edf0ef0ec8df0b324bf1979f2ae1
54.457726
llama2
2
8
true
true
true
true
2024-04-21T14:49:36Z
false
57.081911
79.735113
65.63645
45.997695
78.295185
0
false
sumo43_SOLAR-10.7B-Instruct-DPO-v1.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
Unknown
https://huggingface.co/sumo43/SOLAR-10.7B-Instruct-DPO-v1.0 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_sumo43__SOLAR-10.7B-Instruct-DPO-v1.0
sumo43/SOLAR-10.7B-Instruct-DPO-v1.0
9e439597e3e788e3ff8a41df54e0dae0acda14a4
69.807701
0
10
false
true
true
true
2023-12-20T02:38:06Z
false
73.122867
89.772954
64.210017
73.271725
81.925809
36.542835
false
sumo43_Yi-32b-x2-v2.0_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MixtralForCausalLM
https://huggingface.co/sumo43/Yi-32b-x2-v2.0 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_sumo43__Yi-32b-x2-v2.0
sumo43/Yi-32b-x2-v2.0
1e61f28b326fe0080ad476ce2b1dd041ec9f147f
76.16597
0
60
false
true
true
true
2024-01-17T02:52:11Z
false
73.037543
85.949014
76.790646
73.223704
82.794002
65.20091
false
sumo43_Yi-34b-x2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/sumo43/Yi-34b-x2 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_sumo43__Yi-34b-x2
sumo43/Yi-34b-x2
09876944a5d29e7f8e4da1347cd1d8f6f2151444
75.024799
mit
0
34
true
false
true
true
2024-01-15T07:37:23Z
false
72.866894
85.70006
76.638575
72.103777
82.794002
60.045489
false
superlazycoder_NeuralPipe-7B-slerp_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/superlazycoder/NeuralPipe-7B-slerp · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_superlazycoder__NeuralPipe-7B-slerp
superlazycoder/NeuralPipe-7B-slerp
98bf395c8868b226208debc63d67576fdee52528
71.011414
apache-2.0
0
7
true
false
true
true
2024-01-11T22:39:32Z
false
67.576792
86.168094
64.059859
59.840804
80.189424
68.23351
false
sutie_mixture-of-gemmas_float16
float16
🤝 base merges and moerges
🤝
Original
GemmaForCausalLM
https://huggingface.co/sutie/mixture-of-gemmas · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_sutie__mixture-of-gemmas
sutie/mixture-of-gemmas
8806db8e927c530f6f52833abd94ede43bd0aa7e
null
0
8
false
true
true
true
2024-05-10T12:55:31Z
false
24.744027
22.445728
37.871253
null
47.434886
0
false
sutie_mixture-of-gemmas_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
GemmaForCausalLM
https://huggingface.co/sutie/mixture-of-gemmas · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_sutie__mixture-of-gemmas
sutie/mixture-of-gemmas
472c86839d9408eeed1a490ec3665db92757fd0e
null
0
8
false
true
true
true
2024-05-11T15:25:29Z
false
25.170648
22.465644
37.710953
null
47.671665
0
false
sutie_mixture-of-gemmas-dare-linear_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
GemmaForCausalLM
https://huggingface.co/sutie/mixture-of-gemmas-dare-linear · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_sutie__mixture-of-gemmas-dare-linear
sutie/mixture-of-gemmas-dare-linear
2cb65add80a3660e4a2da4a9b39ff2af48ce00fb
null
0
8
false
true
true
true
2024-05-14T14:51:48Z
false
25.938567
22.883888
38.697358
null
47.987372
0
false
sutie_mixture-of-gemmas-dare-ties_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
GemmaForCausalLM
https://huggingface.co/sutie/mixture-of-gemmas-dare-ties · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_sutie__mixture-of-gemmas-dare-ties
sutie/mixture-of-gemmas-dare-ties
46cc2698c744bbdcd7753ba4638b931de0433d23
null
0
8
false
true
true
true
2024-05-14T14:51:06Z
false
25.170648
22.465644
37.710953
null
47.671665
0
false
sutie_mixture-of-gemmas-ties_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
GemmaForCausalLM
https://huggingface.co/sutie/mixture-of-gemmas-ties · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_sutie__mixture-of-gemmas-ties
sutie/mixture-of-gemmas-ties
7c74e90f6ab4633bf10078d8b016c522730f3985
null
0
8
false
true
true
true
2024-05-14T14:52:09Z
false
35.836177
42.800239
44.196456
null
56.511444
0
false
swap-uniba_LLaMAntino-3-ANITA-8B-Inst-DPO-ITA_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/swap-uniba/LLaMAntino-3-ANITA-8B-Inst-DPO-ITA · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_swap-uniba__LLaMAntino-3-ANITA-8B-Inst-DPO-ITA
swap-uniba/LLaMAntino-3-ANITA-8B-Inst-DPO-ITA
19178bdccbc734d31e37615fe25bcd291be29ef5
75.118582
llama3
19
8
true
true
true
true
2024-05-14T11:29:36Z
false
74.573379
92.750448
66.849856
75.92807
82.004736
58.605004
false
synapsoft_Llama-2-7b-chat-hf-flan2022-1.2M_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M
synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M
825506858e4603745a479215b8dea1524bfab6a0
47.887986
null
1
6
false
true
true
true
2023-10-16T12:48:18Z
false
49.573379
76.249751
45.990772
42.174509
71.823204
1.5163
false
synapsoft_Llama-2-7b-hf-flan2022-1.2M_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/synapsoft/Llama-2-7b-hf-flan2022-1.2M · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_synapsoft__Llama-2-7b-hf-flan2022-1.2M
synapsoft/Llama-2-7b-hf-flan2022-1.2M
792f946a1413a7c58378d7a350b7d75b9df80561
43.677861
null
1
6
false
true
true
true
2023-10-16T12:48:18Z
false
23.293515
78.460466
42.334004
37.973341
75.532755
4.473086
false
tcapelle_gemma-7b-zephyr-dpo_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
https://huggingface.co/tcapelle/gemma-7b-zephyr-dpo · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_tcapelle__gemma-7b-zephyr-dpo
tcapelle/gemma-7b-zephyr-dpo
a3980aba73509cc3fa7553dd612478ac589255ba
61.615898
0
8
false
true
true
true
2024-02-28T11:57:06Z
false
60.836177
80.442143
60.595935
42.484138
75.374901
49.962092
false
tcapelle_gemma-7b-zephyr-sft_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
https://huggingface.co/tcapelle/gemma-7b-zephyr-sft · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_tcapelle__gemma-7b-zephyr-sft
tcapelle/gemma-7b-zephyr-sft
51918d1d0284e398a08f3b74b642f940efc925be
61.641118
0
8
false
true
true
true
2024-02-28T11:56:51Z
false
61.433447
80.73093
60.331431
43.349434
74.191002
49.810462
false
team-lucid_mptk-1b_float16
float16
🟢 pretrained
🟢
Original
MptForCausalLM
https://huggingface.co/team-lucid/mptk-1b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_team-lucid__mptk-1b
team-lucid/mptk-1b
aea467410ae0cead4fded6b98a3575e92b22862f
null
apache-2.0
1
1
true
true
true
true
2024-03-05T11:08:30Z
false
22.696246
25.114519
27.024879
null
49.723757
0
false
team-lucid_mptk-1b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MptForCausalLM
https://huggingface.co/team-lucid/mptk-1b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_team-lucid__mptk-1b
team-lucid/mptk-1b
382a746dfb0745bab2b2e63a1e6a28ba1aa3f306
29.704468
apache-2.0
1
1
true
true
true
true
2024-03-05T14:08:25Z
false
24.061433
35.610436
26.945662
39.709802
51.065509
0.833965
false
teilomillet_MiniMerlin-3B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/teilomillet/MiniMerlin-3B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_teilomillet__MiniMerlin-3B
teilomillet/MiniMerlin-3B
7fefc3d23e77c699aadba55c40d9e364eb73baf0
47.628944
apache-2.0
0
3
true
true
true
true
2023-12-15T11:58:24Z
false
44.368601
66.560446
43.20755
47.066173
64.404104
20.166793
false
teilomillet_MiniMerlin-3b-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/teilomillet/MiniMerlin-3b-v0.1 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_teilomillet__MiniMerlin-3b-v0.1
teilomillet/MiniMerlin-3b-v0.1
2addcbd985f8a7f8bb7a7c21a5ec0e2505e549c6
41.604928
apache-2.0
0
3
true
true
true
true
2023-12-12T19:27:02Z
false
40.699659
54.062936
43.31823
49.647375
60.536701
1.36467
false
teknium_CollectiveCognition-v1-Mistral-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/teknium/CollectiveCognition-v1-Mistral-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__CollectiveCognition-v1-Mistral-7B
teknium/CollectiveCognition-v1-Mistral-7B
58777f0563610fa770c4fa252c0350de71d4ab9d
60.099427
apache-2.0
16
7
true
true
true
true
2023-10-16T13:19:55Z
false
62.372014
85.500896
62.764566
54.481897
77.584846
17.892343
true
teknium_CollectiveCognition-v1.1-Mistral-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/teknium/CollectiveCognition-v1.1-Mistral-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__CollectiveCognition-v1.1-Mistral-7B
teknium/CollectiveCognition-v1.1-Mistral-7B
5f57f70ec99450c70da2540e94dd7fd67be4b23c
62.915311
apache-2.0
77
7
true
true
true
true
2023-10-16T12:48:18Z
false
62.116041
84.166501
62.349786
57.624139
75.374901
35.8605
true
teknium_Mistral-Trismegistus-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/teknium/Mistral-Trismegistus-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__Mistral-Trismegistus-7B
teknium/Mistral-Trismegistus-7B
0a5752d096ebab21759dbe203f6b7c7f6092faf2
52.658907
apache-2.0
198
7
true
true
true
true
2023-10-16T12:48:18Z
false
54.095563
77.912766
54.489028
49.358571
70.165746
9.931766
true
teknium_OpenHermes-13B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/teknium/OpenHermes-13B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__OpenHermes-13B
teknium/OpenHermes-13B
f09d0fe655ad57cce9179b7b40ea6f81e07db18c
55.244824
mit
53
13
true
true
true
true
2023-10-16T12:48:18Z
false
59.812287
82.244573
56.347278
46.011282
75.453828
11.599697
true
teknium_OpenHermes-2.5-Mistral-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__OpenHermes-2.5-Mistral-7B
teknium/OpenHermes-2.5-Mistral-7B
2a54cad766bc90828354db5c4199795aecfd0df1
61.454901
apache-2.0
787
7
true
true
true
true
2023-11-14T15:14:43Z
false
64.931741
84.295957
63.821668
52.305641
77.900552
25.473844
true
teknium_OpenHermes-2.5-Mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__OpenHermes-2.5-Mistral-7B
teknium/OpenHermes-2.5-Mistral-7B
2a54cad766bc90828354db5c4199795aecfd0df1
61.520319
apache-2.0
787
7
true
true
true
true
2023-11-17T19:50:31Z
false
64.931741
84.176459
63.636963
52.237982
78.058406
26.080364
true
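The same checkpoint can appear more than once at different precisions: teknium/OpenHermes-2.5-Mistral-7B is listed just above in both float16 (average 61.454901) and bfloat16 (61.520319), from the same commit sha. A small self-contained sketch for comparing such duplicate runs, using the values from those two records:

```python
import pandas as pd

# Compare duplicate evaluations of one checkpoint at different precisions.
# Values copied verbatim from the two teknium/OpenHermes-2.5-Mistral-7B records above.
rows = pd.DataFrame([
    {"fullname": "teknium/OpenHermes-2.5-Mistral-7B", "Precision": "float16",
     "Average ⬆️": 61.454901, "GSM8K": 25.473844},
    {"fullname": "teknium/OpenHermes-2.5-Mistral-7B", "Precision": "bfloat16",
     "Average ⬆️": 61.520319, "GSM8K": 26.080364},
])
print(rows.pivot(index="fullname", columns="Precision", values="Average ⬆️"))
```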
teknium_OpenHermes-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/teknium/OpenHermes-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_teknium__OpenHermes-7B
teknium/OpenHermes-7B
74edb1ad58d3d517ef46c4e2a31081084ecbc473
51.264668
mit
13
7
true
true
true
true
2023-10-16T12:54:17Z
false
56.143345
78.321052
48.617758
44.995354
74.506709
5.003791
true
tenyx_Llama3-TenyxChat-70B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/tenyx/Llama3-TenyxChat-70B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_tenyx__Llama3-TenyxChat-70B
tenyx/Llama3-TenyxChat-70B
de770dc2c767b50b17bef491ec6983c29e60f668
78.404942
llama3
61
70
true
true
false
true
2024-04-27T17:40:55Z
false
72.098976
86.207927
80.041893
62.851521
82.951855
86.277483
false
tenyx_TenyxChat-7B-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/tenyx/TenyxChat-7B-v1 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_tenyx__TenyxChat-7B-v1
tenyx/TenyxChat-7B-v1
c3c7ee002c4fdb1b8c2e2c78b7fba0c389673710
68.459991
apache-2.0
25
7
true
false
true
true
2024-01-10T19:35:40Z
false
65.614334
85.550687
64.80754
51.279982
80.50513
63.002274
false
tenyx_TenyxChat-8x7B-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
https://huggingface.co/tenyx/TenyxChat-8x7B-v1 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_tenyx__TenyxChat-8x7B-v1
tenyx/TenyxChat-8x7B-v1
86fd0b7d132126be49c02e061ebec02e1d3a4e38
72.721987
apache-2.0
12
46
true
true
false
true
2024-01-20T06:27:29Z
false
69.709898
87.761402
71.118959
65.419294
81.21547
61.106899
false
theBodhiTree_theBodhiTree-Zephyr-Gamma-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralModel
https://huggingface.co/theBodhiTree/theBodhiTree-Zephyr-Gamma-7b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_theBodhiTree__theBodhiTree-Zephyr-Gamma-7b
theBodhiTree/theBodhiTree-Zephyr-Gamma-7b
8e4088cd99a5fa55b87eee2aaaa186624d11b785
null
apache-2.0
0
7
true
false
true
true
2024-05-12T07:45:15Z
false
29.607509
29.077873
26.958976
null
51.223362
0
false
theNovaAI_Hypernova-experimental_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/theNovaAI/Hypernova-experimental · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_theNovaAI__Hypernova-experimental
theNovaAI/Hypernova-experimental
25cb19886e7ff7946e8ddb897d83dc1c7f4483c0
58.911726
cc-by-sa-4.0
0
13
true
true
true
true
2024-05-03T00:53:37Z
false
61.433447
83.638717
55.490443
51.37978
75.295975
26.231994
false
theNovaAI_Supernova-experimental_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/theNovaAI/Supernova-experimental · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_theNovaAI__Supernova-experimental
theNovaAI/Supernova-experimental
e0b2524a7ac1e08c8c04e50d4461b89699d3603c
59.79295
cc-by-sa-4.0
0
0
true
false
true
true
2024-03-08T16:23:12Z
false
63.054608
83.658634
56.590619
49.371884
77.348066
28.733889
false
theo77186_Llama-3-8B-Instruct-norefusal_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/theo77186/Llama-3-8B-Instruct-norefusal · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_theo77186__Llama-3-8B-Instruct-norefusal
theo77186/Llama-3-8B-Instruct-norefusal
c4e1d87f712e42f6fe0564a26aee4aebe3b4a000
65.158728
llama3
4
8
true
true
true
true
2024-05-25T16:55:35Z
false
60.580205
78.181637
66.476305
51.468397
74.427782
59.818044
false
thepowefuldeez_mistral-openhermes-sft_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralModel
https://huggingface.co/thepowefuldeez/mistral-openhermes-sft · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_thepowefuldeez__mistral-openhermes-sft
thepowefuldeez/mistral-openhermes-sft
ccc1c1fbc71f677a7b82a0ce7564c8780bd00c71
null
0
7
false
true
true
true
2024-04-17T14:30:10Z
false
28.412969
29.625573
25.653336
null
52.486188
0
false
thomasgauthier_Unmixtraled-22B-v0.1-expert-2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/thomasgauthier/Unmixtraled-22B-v0.1-expert-2 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_thomasgauthier__Unmixtraled-22B-v0.1-expert-2
thomasgauthier/Unmixtraled-22B-v0.1-expert-2
0d55da7a493a19a7e10ffe6bc831181aca59a433
29.219618
apache-2.0
2
22
true
true
false
true
2024-04-26T00:00:11Z
false
27.047782
26.289584
23.085072
49.329365
49.565904
0
false
tianlinliu0121_zephyr-7b-dpo-full-beta-0.2_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/tianlinliu0121/zephyr-7b-dpo-full-beta-0.2 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_tianlinliu0121__zephyr-7b-dpo-full-beta-0.2
tianlinliu0121/zephyr-7b-dpo-full-beta-0.2
727b63fc1ca6a592072159a7185c22f74cd38480
61.35969
mit
0
7
true
true
true
true
2023-11-23T22:54:31Z
false
61.860068
83.977295
61.850453
54.783004
76.953433
28.733889
false
tianlinliu0121_zephyr-7b-dpo-full-beta-0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/tianlinliu0121/zephyr-7b-dpo-full-beta-0.2 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_tianlinliu0121__zephyr-7b-dpo-full-beta-0.2
tianlinliu0121/zephyr-7b-dpo-full-beta-0.2
727b63fc1ca6a592072159a7185c22f74cd38480
61.5492
mit
0
7
true
true
true
true
2023-11-23T22:57:07Z
false
61.774744
84.037044
61.785232
54.721999
76.953433
30.022745
false
tiiuae_falcon-11B_float16
float16
🟢 pretrained
🟢
Original
Unknown
https://huggingface.co/tiiuae/falcon-11B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-11B
tiiuae/falcon-11B
22d1486ae18ddb892ed296aabcaed7537274faed
64.281231
unknown
184
11
true
true
true
true
2024-05-13T08:43:50Z
false
59.726962
82.911771
58.367012
52.557798
78.295185
53.828658
true
tiiuae_falcon-180B_8bit
8bit
🟢 pretrained
🟢
Original
FalconForCausalLM
https://huggingface.co/tiiuae/falcon-180B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-180B
tiiuae/falcon-180B
71a1a70b629e9963f7b4601e82f3f9079d48011e
67.852958
unknown
1,105
179
true
true
true
true
2023-09-15T11:25:37.918274
false
69.453925
88.856801
70.497143
45.467796
86.898185
45.943897
true
tiiuae_falcon-180B_4bit
4bit
🟢 pretrained
🟢
Original
FalconForCausalLM
https://huggingface.co/tiiuae/falcon-180B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-180B
tiiuae/falcon-180B
71a1a70b629e9963f7b4601e82f3f9079d48011e
65.462583
unknown
1,105
179
true
true
true
true
2023-09-15T11:25:37.918274
false
69.197952
88.886676
69.58661
45.156951
86.740331
33.206975
true
tiiuae_falcon-40b_float16
float16
🟢 pretrained
🟢
Original
FalconForCausalLM
https://huggingface.co/tiiuae/falcon-40b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-40b
tiiuae/falcon-40b
3d7c5902f1dc9da830979a826cd96114b3ba4ec1
58.072067
apache-2.0
2,413
40
true
true
true
true
2023-09-09T10:52:17Z
false
61.860068
85.281816
56.893851
41.64662
81.294396
21.455648
true
tiiuae_falcon-7b_float16
float16
🟢 pretrained
🟢
Original
FalconForCausalLM
https://huggingface.co/tiiuae/falcon-7b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-7b
tiiuae/falcon-7b
378337427557d1df3e742264a2901a49f25d4eb1
44.17474
apache-2.0
1,048
7
true
true
true
true
2023-09-09T10:52:17Z
false
47.866894
78.131846
27.78547
34.263826
72.375691
4.624716
true
tiiuae_falcon-7b-instruct_bfloat16
bfloat16
?
Original
FalconForCausalLM
https://huggingface.co/tiiuae/falcon-7b-instruct · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-7b-instruct
tiiuae/falcon-7b-instruct
eb410fb6ffa9028e97adb801f0d6ec46d02f8b07
43.164914
apache-2.0
878
7
true
true
true
true
2024-06-09T15:03:00Z
false
45.819113
70.782713
25.660047
44.068167
68.034728
4.624716
true
tiiuae_falcon-7b-instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
FalconForCausalLM
https://huggingface.co/tiiuae/falcon-7b-instruct · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-7b-instruct
tiiuae/falcon-7b-instruct
cf4b3c42ce2fdfe24f753f0f0d179202fea59c99
43.263889
apache-2.0
878
7
true
true
true
true
2023-10-16T12:54:17Z
false
46.16041
70.85242
25.836964
44.077208
67.955801
4.700531
true
tiiuae_falcon-rw-1b_float16
float16
🟢 pretrained
🟢
Original
FalconForCausalLM
https://huggingface.co/tiiuae/falcon-rw-1b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-rw-1b
tiiuae/falcon-rw-1b
e4b9872bb803165eb22f0a867d4e6a64d34fce19
37.072371
apache-2.0
99
1
true
true
true
true
2023-10-16T12:48:18Z
false
35.068259
63.563035
25.280319
35.955599
62.036306
0.530705
true
timpal0l_Mistral-7B-v0.1-flashback-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/timpal0l/Mistral-7B-v0.1-flashback-v2 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_timpal0l__Mistral-7B-v0.1-flashback-v2
timpal0l/Mistral-7B-v0.1-flashback-v2
2711647da9d8da18d746406d60ad8d806b7f1fd7
57.525291
mit
6
7
true
true
true
true
2024-01-11T14:20:16Z
false
57.167235
80.740888
59.978967
40.658215
77.190213
29.416224
false
titan087_OpenLlama13B-Guanaco_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/titan087/OpenLlama13B-Guanaco · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_titan087__OpenLlama13B-Guanaco
titan087/OpenLlama13B-Guanaco
42ed3023ae1afe861f533570be881a03b10fc860
47.985926
1
13
false
true
true
true
2023-09-09T10:52:17Z
false
51.194539
75.243975
43.756228
38.395034
71.744278
7.581501
false
tlphams_zoyllm-7b-slimorca_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/tlphams/zoyllm-7b-slimorca" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tlphams/zoyllm-7b-slimorca</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_tlphams__zoyllm-7b-slimorca" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tlphams/zoyllm-7b-slimorca
4b49caa2c42b3e8757f986624b047dab485ee26f
51.440902
cc-by-nc-sa-4.0
0
7
true
true
true
true
2023-12-04T06:28:42Z
false
50.59727
72.117108
48.777484
49.131664
67.324388
20.697498
false
togethercomputer_GPT-JT-6B-v0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTJForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/GPT-JT-6B-v0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/GPT-JT-6B-v0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__GPT-JT-6B-v0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/GPT-JT-6B-v0
41bd1937dbc51f9e589d310bddab5b4c1409e783
44.046013
null
2
6
true
true
true
true
2023-09-09T10:52:17Z
false
42.064846
67.964549
49.344052
38.890853
64.798737
1.21304
true
togethercomputer_GPT-JT-6B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTJForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/GPT-JT-6B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/GPT-JT-6B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__GPT-JT-6B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/GPT-JT-6B-v1
f34aa35f906895602c1f86f5685e598afdea8051
43.127292
apache-2.0
300
6
true
true
true
true
2023-10-16T12:48:18Z
false
40.870307
67.147978
47.190181
37.069946
65.272297
1.21304
true
togethercomputer_GPT-JT-Moderation-6B_float16
float16
?
?
Original
GPTJForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/GPT-JT-Moderation-6B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/GPT-JT-Moderation-6B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__GPT-JT-Moderation-6B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/GPT-JT-Moderation-6B
1297870783f6091294769014afddf94499966a78
41.799988
apache-2.0
31
6
true
true
true
true
2023-10-16T12:48:18Z
false
40.52901
67.655845
41.634625
37.327134
62.667719
0.985595
true
togethercomputer_GPT-NeoXT-Chat-Base-20B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/GPT-NeoXT-Chat-Base-20B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/GPT-NeoXT-Chat-Base-20B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__GPT-NeoXT-Chat-Base-20B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/GPT-NeoXT-Chat-Base-20B
d386708e84d862a65f7d2b4989f64750cb657227
43.015454
apache-2.0
692
20
true
true
true
true
2023-09-09T10:52:17Z
false
45.648464
74.029078
29.919101
34.509307
67.087609
6.899166
true
togethercomputer_LLaMA-2-7B-32K_float16
float16
?
?
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/LLaMA-2-7B-32K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/LLaMA-2-7B-32K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__LLaMA-2-7B-32K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/LLaMA-2-7B-32K
aef6d8946ae1015bdb65c478a2dd73b58daaef47
47.074216
llama2
525
7
true
true
true
true
2024-06-09T15:00:31Z
false
47.525597
76.140211
43.325707
39.230194
71.902131
4.321456
true
togethercomputer_Llama-2-7B-32K-Instruct_float16
float16
?
?
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/Llama-2-7B-32K-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__Llama-2-7B-32K-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/Llama-2-7B-32K-Instruct
35696b9a7ab330dcbe240ff76fb44ab1eccf45bf
50.023682
llama2
160
7
true
true
true
true
2024-06-09T15:00:58Z
false
51.109215
78.510257
46.105045
44.856155
73.875296
5.686126
true
togethercomputer_Llama-2-7B-32K-Instruct_8bit
8bit
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/Llama-2-7B-32K-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__Llama-2-7B-32K-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/Llama-2-7B-32K-Instruct
b050a6f17d46e32c4b90a30492f14746589f74b7
49.654402
llama2
160
7
true
true
true
true
2023-10-16T12:48:18Z
false
51.365188
78.470424
45.532239
45.008782
72.84925
4.700531
true
togethercomputer_Pythia-Chat-Base-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/Pythia-Chat-Base-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/Pythia-Chat-Base-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__Pythia-Chat-Base-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/Pythia-Chat-Base-7B
97aa918c383820e1a69f042801091d7deb996c20
39.810235
apache-2.0
66
7
true
true
true
true
2023-09-09T10:52:17Z
false
40.017065
68.671579
27.441099
34.628186
64.009471
4.094011
true
togethercomputer_RedPajama-INCITE-7B-Base_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-7B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-7B-Base
78f7e482443971f4873ba3239f0ac810a367833b
41.491462
apache-2.0
93
7
true
true
true
true
2023-09-09T10:50:14Z
false
46.245734
71.629158
27.68214
33.034754
67.324388
3.0326
true
togethercomputer_RedPajama-INCITE-7B-Chat_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-7B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-7B-Chat
47b94a739e2f3164b438501c8684acc5d5acc146
39.367741
apache-2.0
92
7
true
true
true
true
2023-09-09T10:52:17Z
false
42.064846
70.822545
26.942823
36.094978
59.826361
0.45489
true
togethercomputer_RedPajama-INCITE-7B-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-7B-Instruct
95667a602ff2646bf67fe3a57c4eb9a1edec87fe
42.375751
apache-2.0
104
7
true
true
true
true
2023-09-09T10:52:17Z
false
44.112628
72.017526
37.61836
33.957284
64.95659
1.592115
true
togethercomputer_RedPajama-INCITE-Base-3B-v1_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-3B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-Base-3B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Base-3B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-Base-3B-v1
094fbdd0c911feb485ce55de1952ab2e75277e1e
38.537852
apache-2.0
90
3
true
true
true
true
2023-09-09T10:52:17Z
false
40.187713
64.767975
27.027874
33.234887
64.719811
1.288855
true
togethercomputer_RedPajama-INCITE-Base-7B-v0.1_float16
float16
🟢 pretrained
🟢
Original
Unknown
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-Base-7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Base-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-Base-7B-v0.1
78f7e482443971f4873ba3239f0ac810a367833b
41.251382
0
6
true
true
true
true
2023-10-16T12:46:18Z
false
46.245734
71.629158
27.68214
33.034754
67.324388
1.592115
true
togethercomputer_RedPajama-INCITE-Chat-3B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-Chat-3B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-Chat-3B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Chat-3B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-Chat-3B-v1
f0e0995eba801096ed04cb87931d96a8316871af
39.527194
apache-2.0
147
3
true
true
true
true
2023-09-09T10:52:17Z
false
42.832765
67.616013
26.231263
34.443339
65.509077
0.530705
true
togethercomputer_RedPajama-INCITE-Chat-7B-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-Chat-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-Chat-7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Chat-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-Chat-7B-v0.1
47b94a739e2f3164b438501c8684acc5d5acc146
39.367741
0
6
true
true
true
true
2023-09-09T10:52:17Z
false
42.064846
70.822545
26.942823
36.094978
59.826361
0.45489
true
togethercomputer_RedPajama-INCITE-Instruct-3B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-Instruct-3B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-Instruct-3B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Instruct-3B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-Instruct-3B-v1
0c66778ee09a036886741707733620b91057909a
39.055049
apache-2.0
91
3
true
true
true
true
2023-09-09T10:52:17Z
false
41.552901
65.484963
25.032214
36.412515
64.483031
1.36467
true
togethercomputer_RedPajama-INCITE-Instruct-7B-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-Instruct-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-Instruct-7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_togethercomputer__RedPajama-INCITE-Instruct-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-Instruct-7B-v0.1
95667a602ff2646bf67fe3a57c4eb9a1edec87fe
42.375751
0
6
true
true
true
true
2023-09-09T10:52:17Z
false
44.112628
72.017526
37.61836
33.957284
64.95659
1.592115
true
tokyotech-llm_Swallow-70b-instruct-hf_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tokyotech-llm/Swallow-70b-instruct-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tokyotech-llm/Swallow-70b-instruct-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_tokyotech-llm__Swallow-70b-instruct-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tokyotech-llm/Swallow-70b-instruct-hf
feba815b847806df03f23a375f3d4d07fa251134
65.743646
llama2
38
69
true
true
true
true
2023-12-30T11:43:05Z
false
66.211604
85.142402
67.084436
47.995873
82.083662
45.943897
false
totally-not-an-llm_EverythingLM-13b-16k_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/totally-not-an-llm/EverythingLM-13b-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">totally-not-an-llm/EverythingLM-13b-16k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
totally-not-an-llm/EverythingLM-13b-16k
8456a856a8b115b05e76a7d0d945853b10ac71e2
52.334836
llama2
33
13
true
true
true
true
2023-10-16T12:48:18Z
false
56.569966
80.581557
50.178073
47.46482
72.770324
6.444276
false
totally-not-an-llm_EverythingLM-13b-V2-16k_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V2-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">totally-not-an-llm/EverythingLM-13b-V2-16k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V2-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
totally-not-an-llm/EverythingLM-13b-V2-16k
943f932ae1ae462389e6d2db5273158530749fff
52.745532
llama2
31
13
true
true
true
true
2023-09-09T10:52:17Z
false
58.703072
80.880303
49.686661
47.3727
73.007103
6.823351
false
totally-not-an-llm_EverythingLM-13b-V3-16k_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V3-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">totally-not-an-llm/EverythingLM-13b-V3-16k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V3-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
totally-not-an-llm/EverythingLM-13b-V3-16k
1de9244bfadb947f80872727f76790cbc76e7142
51.11143
llama2
6
13
true
true
true
true
2023-10-16T12:54:17Z
false
58.191126
80.123481
50.479681
45.184868
70.718232
1.97119
false
totally-not-an-llm_EverythingLM-13b-V3-peft_4bit
4bit
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
Unknown
<a target="_blank" href="https://huggingface.co/totally-not-an-llm/EverythingLM-13b-V3-peft" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">totally-not-an-llm/EverythingLM-13b-V3-peft</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__EverythingLM-13b-V3-peft" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
totally-not-an-llm/EverythingLM-13b-V3-peft
7a2eed5038addcf4fa3b8dd358b45eb96134e749
54.242307
null
1
12
false
true
true
true
2023-10-16T12:48:18Z
false
58.361775
81.029675
54.70104
52.977604
72.84925
5.534496
false
totally-not-an-llm_PuddleJumper-13b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/totally-not-an-llm/PuddleJumper-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">totally-not-an-llm/PuddleJumper-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
totally-not-an-llm/PuddleJumper-13b
f3a8a475ff0c6ae37ac8ae0690980be11cac731a
55.113742
llama2
6
13
true
true
true
true
2023-09-09T10:52:17Z
false
58.703072
81.179048
58.254493
56.439652
72.770324
3.335861
false