Schema of the Open LLM Leaderboard results table (one row per model evaluation), as summarized by the dataset viewer. For string columns the viewer reports either the observed length range or the number of distinct values; for numeric columns, the observed minimum and maximum.

| Column | Dtype | Observed values |
| --- | --- | --- |
| eval_name | string | lengths 12–111 |
| Precision | string | 3 classes |
| Type | string | 6 classes |
| T | string | 6 classes |
| Weight type | string | 2 classes |
| Architecture | string | 51 classes |
| Model | string | lengths 355–689 |
| fullname | string | lengths 4–102 |
| Model sha | string | lengths 0–40 |
| Average ⬆️ | float64 | 1.41–51.2 |
| Hub License | string | 25 classes |
| Hub ❤️ | int64 | 0–5.85k |
| #Params (B) | int64 | -1 to 140 |
| Available on the hub | bool | 2 classes |
| Not_Merged | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.03–107 |
| IFEval Raw | float64 | 0–0.87 |
| IFEval | float64 | 0–86.7 |
| BBH Raw | float64 | 0.28–0.75 |
| BBH | float64 | 0.81–63.5 |
| MATH Lvl 5 Raw | float64 | 0–0.51 |
| MATH Lvl 5 | float64 | 0–50.7 |
| GPQA Raw | float64 | 0.22–0.44 |
| GPQA | float64 | 0–24.9 |
| MUSR Raw | float64 | 0.29–0.59 |
| MUSR | float64 | 0–36.5 |
| MMLU-PRO Raw | float64 | 0.1–0.72 |
| MMLU-PRO | float64 | 0–68.7 |
| Maintainer's Highlight | bool | 2 classes |
| Upload To Hub Date | string | 411 classes |
| Submission Date | string | 158 classes |
| Generation | int64 | 0–10 |
| Base Model | string | lengths 4–102 |

Notes on two columns:

- Model holds each row's rendered links, which all follow one pattern: the model page at `https://huggingface.co/{fullname}` plus a 📑 link to the per-model evaluation details at `https://huggingface.co/datasets/open-llm-leaderboard/{fullname with "/" replaced by "__"}-details`. Both URLs are determined by fullname, so the Model cell is not repeated in the table below.
- A blank Hub License cell below means the viewer showed no value for that row.
theprint_ReWiz-Llama-3.2-3B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/theprint/ReWiz-Llama-3.2-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-Llama-3.2-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-Llama-3.2-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/ReWiz-Llama-3.2-3B
e6aed95ad8f104f105b8423cd5f87c75705a828c
17.984844
apache-2.0
2
3
true
true
true
false
false
1.30972
0.464893
46.489315
0.434326
19.293728
0.097432
9.743202
0.283557
4.474273
0.361375
6.938542
0.28873
20.970006
false
2024-10-18
2024-10-28
1
theprint/ReWiz-Llama-3.2-3B (Merge)
theprint_ReWiz-Nemo-12B-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/theprint/ReWiz-Nemo-12B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-Nemo-12B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-Nemo-12B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/ReWiz-Nemo-12B-Instruct
6f8ea24f8d19b48850d68bef1b5c50837d37761b
15.631853
apache-2.0
2
12
true
true
true
false
false
1.17003
0.106238
10.623811
0.509241
29.926389
0.071752
7.175227
0.323826
9.8434
0.409563
10.228646
0.333943
25.993647
false
2024-10-31
2024-11-02
1
unsloth/Mistral-Nemo-Instruct-2407-bnb-4bit
theprint_ReWiz-Qwen-2.5-14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/ReWiz-Qwen-2.5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-Qwen-2.5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-Qwen-2.5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/ReWiz-Qwen-2.5-14B
e5524628f15c30d7542427c53a565e6e2d3ff760
29.641502
apache-2.0
5
16
true
true
true
false
false
5.928266
0.278546
27.854648
0.617949
44.861873
0.268882
26.888218
0.380034
17.337808
0.453896
15.436979
0.509225
45.469489
false
2024-11-05
2024-11-10
2
Qwen/Qwen2.5-14B
theprint_ReWiz-Worldbuilder-7B_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/theprint/ReWiz-Worldbuilder-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-Worldbuilder-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-Worldbuilder-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/ReWiz-Worldbuilder-7B
e88c715097d824f115f59a97e612d662ffb1031f
15.664819
0
7
false
true
true
false
false
0.610867
0.25102
25.101952
0.463616
25.076347
0.029456
2.945619
0.269295
2.572707
0.45725
16.389583
0.297124
21.902704
false
2024-10-28
2024-10-28
1
theprint/ReWiz-Worldbuilder-7B (Merge)
theprint_RuDolph-Hermes-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/theprint/RuDolph-Hermes-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/RuDolph-Hermes-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__RuDolph-Hermes-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/RuDolph-Hermes-7B
e07aea56963bbfe5c6753d1056566a56acc30d4a
19.024425
0
7
false
true
true
false
false
0.502067
0.360429
36.042922
0.505293
30.709648
0.050604
5.060423
0.312081
8.277405
0.422615
11.026823
0.307264
23.029329
false
2024-11-10
2024-11-10
1
theprint/RuDolph-Hermes-7B (Merge)
theprint_WorldBuilder-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/WorldBuilder-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/WorldBuilder-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__WorldBuilder-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/WorldBuilder-12B
20cfd0e98fb2628b00867147b2c6f423d27f3561
14.377937
apache-2.0
0
13
true
true
true
false
false
2.831275
0.137438
13.743755
0.50101
29.277996
0.036254
3.625378
0.29698
6.263982
0.406646
8.997396
0.319232
24.359116
false
2024-10-27
2024-11-18
1
unsloth/mistral-nemo-base-2407-bnb-4bit
theprint_phi-3-mini-4k-python_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/phi-3-mini-4k-python" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/phi-3-mini-4k-python</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__phi-3-mini-4k-python-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/phi-3-mini-4k-python
81453e5718775630581ab9950e6c0ccf0d7a4177
17.564493
apache-2.0
0
4
true
true
true
false
false
1.375551
0.240878
24.087754
0.493759
28.446016
0.095166
9.516616
0.291107
5.480984
0.392167
9.220833
0.357713
28.634752
false
2024-06-03
2024-09-13
1
unsloth/Phi-3-mini-4k-instruct-bnb-4bit
thomas-yanxin_XinYuan-Qwen2-1_5B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/thomas-yanxin/XinYuan-Qwen2-1_5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">thomas-yanxin/XinYuan-Qwen2-1_5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/thomas-yanxin__XinYuan-Qwen2-1_5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
thomas-yanxin/XinYuan-Qwen2-1_5B
a01b362887832bea08d686737861ac3d5b437a32
11.515091
other
1
1
true
true
true
false
true
1.352364
0.298556
29.855561
0.363549
12.12558
0.067221
6.722054
0.270134
2.684564
0.363396
2.624479
0.235705
15.07831
false
2024-08-25
2024-09-04
1
Removed
thomas-yanxin_XinYuan-Qwen2-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/thomas-yanxin/XinYuan-Qwen2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">thomas-yanxin/XinYuan-Qwen2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/thomas-yanxin__XinYuan-Qwen2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
thomas-yanxin/XinYuan-Qwen2-7B
c62d83eee2f4812ac17fc17d307f4aa1a77c5359
22.217714
other
1
7
true
true
true
false
true
3.276154
0.44376
44.376033
0.493663
28.401489
0.132931
13.293051
0.291107
5.480984
0.405812
9.259896
0.392453
32.494829
false
2024-08-21
2024-09-03
0
thomas-yanxin/XinYuan-Qwen2-7B
thomas-yanxin_XinYuan-Qwen2-7B-0917_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/thomas-yanxin/XinYuan-Qwen2-7B-0917" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">thomas-yanxin/XinYuan-Qwen2-7B-0917</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/thomas-yanxin__XinYuan-Qwen2-7B-0917-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
thomas-yanxin/XinYuan-Qwen2-7B-0917
6cee1b155fca9ae1f558f434953dfdadb9596af0
22.721617
other
4
7
true
true
true
false
true
1.485564
0.37192
37.191984
0.516922
32.619938
0.088369
8.836858
0.309564
7.941834
0.440104
13.679688
0.424535
36.059397
false
2024-09-17
2024-09-17
0
thomas-yanxin/XinYuan-Qwen2-7B-0917
thomas-yanxin_XinYuan-Qwen2.5-7B-0917_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/thomas-yanxin/XinYuan-Qwen2.5-7B-0917" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">thomas-yanxin/XinYuan-Qwen2.5-7B-0917</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/thomas-yanxin__XinYuan-Qwen2.5-7B-0917-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
thomas-yanxin/XinYuan-Qwen2.5-7B-0917
bbbeafd1003c4d5e13f09b7223671957384b961a
18.175037
other
4
7
true
true
true
false
true
0.971225
0.357706
35.770644
0.518411
33.439669
0
0
0.28104
4.138702
0.367552
3.677344
0.388215
32.023862
false
2024-09-17
2024-09-24
0
thomas-yanxin/XinYuan-Qwen2.5-7B-0917
tiiuae_falcon-11B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
FalconForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/falcon-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/falcon-11B
066e3bf4e2d9aaeefa129af0a6d39727d27816b3
13.814138
unknown
212
11
true
true
true
false
false
1.082871
0.326132
32.613244
0.439164
21.937999
0.02568
2.567976
0.270973
2.796421
0.398646
7.530729
0.238946
15.43846
true
2024-05-09
2024-06-09
0
tiiuae/falcon-11B
tiiuae_falcon-40b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
FalconForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/falcon-40b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-40b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-40b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/falcon-40b
4a70170c215b36a3cce4b4253f6d0612bb7d4146
11.36354
apache-2.0
2,419
40
true
true
true
false
false
21.793584
0.249645
24.964539
0.401853
16.583305
0.015861
1.586103
0.27349
3.131991
0.363146
5.193229
0.250499
16.722074
true
2023-05-24
2024-06-09
0
tiiuae/falcon-40b
tiiuae_falcon-40b-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
FalconForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/falcon-40b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-40b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-40b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/falcon-40b-instruct
ecb78d97ac356d098e79f0db222c9ce7c5d9ee5f
10.434154
apache-2.0
1,172
40
true
true
true
false
false
19.733245
0.245449
24.544874
0.405387
17.220114
0.016616
1.661631
0.25
0
0.376229
5.161979
0.226147
14.016327
true
2023-05-25
2024-06-09
0
tiiuae/falcon-40b-instruct
tiiuae_falcon-7b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
FalconForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/falcon-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/falcon-7b
898df1396f35e447d5fe44e0a3ccaaaa69f30d36
5.110504
apache-2.0
1,078
7
true
true
true
false
false
0.785841
0.182051
18.20514
0.328524
5.963937
0.006042
0.60423
0.244966
0
0.377844
4.497135
0.112533
1.392583
true
2023-04-24
2024-06-09
0
tiiuae/falcon-7b
tiiuae_falcon-7b-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
FalconForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/falcon-7b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-7b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-7b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/falcon-7b-instruct
cf4b3c42ce2fdfe24f753f0f0d179202fea59c99
5.015869
apache-2.0
924
7
true
true
true
false
false
0.766215
0.196889
19.68887
0.320342
4.823178
0.006042
0.60423
0.247483
0
0.363365
3.253906
0.115525
1.72503
true
2023-04-25
2024-06-09
0
tiiuae/falcon-7b-instruct
tiiuae_falcon-mamba-7b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
FalconMambaForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/falcon-mamba-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-mamba-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-mamba-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/falcon-mamba-7b
5337fd73f19847e111ba2291f3f0e1617b90c37d
15.116297
other
217
7
true
true
true
false
false
3.610408
0.333576
33.357602
0.428485
19.876878
0.040785
4.07855
0.310403
8.053691
0.421031
10.86224
0.230219
14.468824
true
2024-07-17
2024-07-23
0
tiiuae/falcon-mamba-7b
tklohj_WindyFloLLM_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tklohj/WindyFloLLM" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tklohj/WindyFloLLM</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tklohj__WindyFloLLM-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tklohj/WindyFloLLM
21f4241ab3f091d1d309e9076a8d8e3f014908a8
14.205891
0
13
false
true
true
false
false
1.098512
0.266856
26.685639
0.463662
24.398763
0.013595
1.359517
0.275168
3.355705
0.425313
11.864063
0.258145
17.571661
false
2024-06-30
2024-07-10
1
tklohj/WindyFloLLM (Merge)
togethercomputer_GPT-JT-6B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTJForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/GPT-JT-6B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/GPT-JT-6B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__GPT-JT-6B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/GPT-JT-6B-v1
f34aa35f906895602c1f86f5685e598afdea8051
6.827354
apache-2.0
301
6
true
true
true
false
false
37.958811
0.206106
20.610646
0.330266
7.318524
0.007553
0.755287
0.260906
1.454139
0.373656
3.873698
0.162566
6.951832
true
2022-11-24
2024-06-12
0
togethercomputer/GPT-JT-6B-v1
togethercomputer_GPT-NeoXT-Chat-Base-20B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/GPT-NeoXT-Chat-Base-20B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/GPT-NeoXT-Chat-Base-20B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__GPT-NeoXT-Chat-Base-20B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/GPT-NeoXT-Chat-Base-20B
d386708e84d862a65f7d2b4989f64750cb657227
4.964062
apache-2.0
695
20
true
true
true
false
false
2.983588
0.182976
18.297562
0.332097
6.830795
0.01284
1.283988
0.25
0
0.346063
1.757812
0.114528
1.614214
true
2023-03-03
2024-06-12
0
togethercomputer/GPT-NeoXT-Chat-Base-20B
togethercomputer_LLaMA-2-7B-32K_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/LLaMA-2-7B-32K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/LLaMA-2-7B-32K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__LLaMA-2-7B-32K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/LLaMA-2-7B-32K
46c24bb5aef59722fa7aa6d75e832afd1d64b980
6.737011
llama2
533
7
true
true
true
false
false
0.584573
0.186497
18.649738
0.339952
8.089984
0.008308
0.830816
0.25
0
0.375365
4.320573
0.176779
8.530954
true
2023-07-26
2024-06-12
0
togethercomputer/LLaMA-2-7B-32K
togethercomputer_Llama-2-7B-32K-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/Llama-2-7B-32K-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__Llama-2-7B-32K-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/Llama-2-7B-32K-Instruct
d27380af003252f5eb0d218e104938b4e673e3f3
8.20819
llama2
159
7
true
true
true
false
false
0.589909
0.213
21.300039
0.344347
8.56347
0.01284
1.283988
0.251678
0.223714
0.405594
9.199219
0.178108
8.678709
true
2023-08-08
2024-06-12
0
togethercomputer/Llama-2-7B-32K-Instruct
togethercomputer_RedPajama-INCITE-7B-Base_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-7B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-7B-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-7B-Base
78f7e482443971f4873ba3239f0ac810a367833b
5.486286
apache-2.0
94
7
true
true
true
false
false
1.220607
0.20823
20.822972
0.319489
5.087242
0.011329
1.132931
0.255034
0.671141
0.362
3.016667
0.119681
2.186761
true
2023-05-04
2024-06-12
0
togethercomputer/RedPajama-INCITE-7B-Base
togethercomputer_RedPajama-INCITE-7B-Chat_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-7B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-7B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-7B-Chat
47b94a739e2f3164b438501c8684acc5d5acc146
3.962784
apache-2.0
92
7
true
true
true
false
false
1.219336
0.155798
15.579773
0.317545
4.502174
0.001511
0.151057
0.252517
0.33557
0.34476
1.861719
0.112118
1.34641
true
2023-05-04
2024-06-13
0
togethercomputer/RedPajama-INCITE-7B-Chat
togethercomputer_RedPajama-INCITE-7B-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-7B-Instruct
7f36397b9985a3f981cdb618f8fec1c565ca5927
6.356021
apache-2.0
104
7
true
true
true
false
false
1.181119
0.205507
20.550694
0.337744
7.905416
0.015106
1.510574
0.250839
0.111857
0.36851
5.030469
0.127244
3.027113
true
2023-05-05
2024-06-12
0
togethercomputer/RedPajama-INCITE-7B-Instruct
togethercomputer_RedPajama-INCITE-Base-3B-v1_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-3B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-Base-3B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-Base-3B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-Base-3B-v1
094fbdd0c911feb485ce55de1952ab2e75277e1e
5.445562
apache-2.0
90
3
true
true
true
false
false
0.776102
0.229363
22.936254
0.30604
3.518608
0.009819
0.981873
0.243289
0
0.373875
4.001042
0.11112
1.235594
true
2023-05-04
2024-06-12
0
togethercomputer/RedPajama-INCITE-Base-3B-v1
togethercomputer_RedPajama-INCITE-Chat-3B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-Chat-3B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-Chat-3B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-Chat-3B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-Chat-3B-v1
f0e0995eba801096ed04cb87931d96a8316871af
4.748119
apache-2.0
152
3
true
true
true
false
false
0.774909
0.165215
16.521496
0.321669
5.164728
0.003021
0.302115
0.244128
0
0.368448
5.089323
0.112699
1.411052
true
2023-05-05
2024-06-13
0
togethercomputer/RedPajama-INCITE-Chat-3B-v1
togethercomputer_RedPajama-INCITE-Instruct-3B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-Instruct-3B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-Instruct-3B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-Instruct-3B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-Instruct-3B-v1
0c66778ee09a036886741707733620b91057909a
5.676527
apache-2.0
93
3
true
true
true
false
false
0.760671
0.212426
21.242636
0.314602
4.510786
0.006798
0.679758
0.247483
0
0.388604
6.408854
0.110954
1.217125
true
2023-05-05
2024-06-12
0
togethercomputer/RedPajama-INCITE-Instruct-3B-v1
tokyotech-llm_Llama-3-Swallow-8B-Instruct-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tokyotech-llm/Llama-3-Swallow-8B-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tokyotech-llm/Llama-3-Swallow-8B-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tokyotech-llm__Llama-3-Swallow-8B-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tokyotech-llm/Llama-3-Swallow-8B-Instruct-v0.1
1fae784584dd03680b72dd4de7eefbc5b7cabcd5
22.307385
llama3
16
8
true
true
true
false
true
0.85811
0.550772
55.077195
0.500939
29.267966
0.072508
7.250755
0.28943
5.257271
0.435698
13.795573
0.30876
23.195553
false
2024-06-26
2024-09-12
0
tokyotech-llm/Llama-3-Swallow-8B-Instruct-v0.1
unsloth_Phi-3-mini-4k-instruct_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/unsloth/Phi-3-mini-4k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">unsloth/Phi-3-mini-4k-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/unsloth__Phi-3-mini-4k-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
unsloth/Phi-3-mini-4k-instruct
636c707430a5509c80b1aa51d05c127ed339a975
27.178374
mit
41
3
true
true
true
false
true
0.469533
0.544028
54.402762
0.550024
36.732473
0.154079
15.407855
0.322987
9.731544
0.428417
13.11875
0.403092
33.676862
false
2024-04-29
2024-11-25
0
unsloth/Phi-3-mini-4k-instruct
upstage_SOLAR-10.7B-Instruct-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">upstage/SOLAR-10.7B-Instruct-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/upstage__SOLAR-10.7B-Instruct-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
upstage/SOLAR-10.7B-Instruct-v1.0
c08c25ed66414a878fe0401a3596d536c083606c
19.628255
cc-by-nc-4.0
616
10
true
true
true
false
true
0.782776
0.473661
47.3661
0.516249
31.872402
0
0
0.308725
7.829978
0.389938
6.942188
0.31383
23.758865
true
2023-12-12
2024-06-12
1
upstage/SOLAR-10.7B-Instruct-v1.0 (Merge)
upstage_SOLAR-10.7B-v1.0_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/upstage/SOLAR-10.7B-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">upstage/SOLAR-10.7B-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/upstage__SOLAR-10.7B-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
upstage/SOLAR-10.7B-v1.0
a45090b8e56bdc2b8e32e46b3cd782fc0bea1fa5
4.916448
apache-2.0
292
10
true
true
true
false
false
1.519194
0.171585
17.158473
0.299835
2.147163
0.023414
2.34139
0.260906
1.454139
0.368198
4.52474
0.116855
1.872784
true
2023-12-12
2024-06-12
0
upstage/SOLAR-10.7B-v1.0
upstage_solar-pro-preview-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
SolarForCausalLM
<a target="_blank" href="https://huggingface.co/upstage/solar-pro-preview-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">upstage/solar-pro-preview-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/upstage__solar-pro-preview-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
upstage/solar-pro-preview-instruct
b4db141b5fb08b23f8bc323bc34e2cff3e9675f8
39.900891
mit
430
22
true
true
true
false
true
1.741763
0.841581
84.158145
0.681684
54.822351
0.218278
21.827795
0.370805
16.107383
0.441656
15.007031
0.527344
47.482639
true
2024-09-09
2024-09-11
0
upstage/solar-pro-preview-instruct
uukuguy_speechless-code-mistral-7b-v1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-code-mistral-7b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-code-mistral-7b-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-code-mistral-7b-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-code-mistral-7b-v1.0
1862e0a712efc6002112e9c1235a197d58419b37
18.091887
apache-2.0
18
7
true
true
true
false
false
0.646398
0.366524
36.652416
0.457171
24.091412
0.046073
4.607251
0.284396
4.58613
0.450177
14.772135
0.314578
23.841977
false
2023-10-10
2024-06-26
0
uukuguy/speechless-code-mistral-7b-v1.0
uukuguy_speechless-codellama-34b-v2.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-codellama-34b-v2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-codellama-34b-v2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-codellama-34b-v2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-codellama-34b-v2.0
419bc42a254102d6a5486a1a854068e912c4047c
17.209358
llama2
17
34
true
true
true
false
false
1.991254
0.460422
46.042168
0.481313
25.993293
0.043051
4.305136
0.269295
2.572707
0.378708
7.205208
0.254239
17.137633
false
2023-10-04
2024-06-26
0
uukuguy/speechless-codellama-34b-v2.0
uukuguy_speechless-coder-ds-6.7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-coder-ds-6.7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-coder-ds-6.7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-coder-ds-6.7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-coder-ds-6.7b
c813a5268c6dfe267a720ad3b51773f1ab0feb59
9.639323
apache-2.0
5
6
true
true
true
false
false
0.788604
0.25047
25.046986
0.403637
15.897457
0.016616
1.661631
0.264262
1.901566
0.381938
5.342188
0.171875
7.986111
false
2023-12-30
2024-06-26
0
uukuguy/speechless-coder-ds-6.7b
uukuguy_speechless-instruct-mistral-7b-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-instruct-mistral-7b-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-instruct-mistral-7b-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-instruct-mistral-7b-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-instruct-mistral-7b-v0.2
87a4d214f7d028d61c3dc013a7410b3c34a24072
18.018597
apache-2.0
0
7
true
true
true
false
false
0.61762
0.326132
32.613244
0.460667
24.558747
0.043807
4.380665
0.281879
4.250559
0.490177
21.172135
0.290226
21.136229
false
2024-05-22
2024-06-26
0
uukuguy/speechless-instruct-mistral-7b-v0.2
uukuguy_speechless-llama2-hermes-orca-platypus-wizardlm-13b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-llama2-hermes-orca-platypus-wizardlm-13b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b
954cc87b0ed5fa280126de546daf648861031512
18.600891
32
13
false
true
true
false
false
0.979524
0.456175
45.617517
0.484554
26.791727
0.01435
1.435045
0.270134
2.684564
0.4655
17.754167
0.255901
17.322326
false
2023-09-01
2024-06-26
0
uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b
uukuguy_speechless-mistral-dolphin-orca-platypus-samantha-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-mistral-dolphin-orca-platypus-samantha-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b
b1de043468a15198b55a6509293a4ee585139043
18.340089
llama2
17
7
true
true
true
false
false
0.655719
0.370022
37.002154
0.498277
29.653129
0.029456
2.945619
0.283557
4.474273
0.436135
13.85026
0.299036
22.1151
false
2023-10-13
2024-06-26
0
uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b
uukuguy_speechless-zephyr-code-functionary-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-zephyr-code-functionary-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-zephyr-code-functionary-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-zephyr-code-functionary-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-zephyr-code-functionary-7b
d66fc775ece679966e352195c42444e9c70af7fa
16.360129
apache-2.0
2
7
true
true
true
false
false
0.634
0.269579
26.957916
0.466428
25.983623
0.036254
3.625378
0.300336
6.711409
0.426771
11.613021
0.309425
23.26943
false
2024-01-23
2024-06-26
0
uukuguy/speechless-zephyr-code-functionary-7b
v000000_L3.1-Niitorm-8B-DPO-t0.0001_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/L3.1-Niitorm-8B-DPO-t0.0001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/L3.1-Niitorm-8B-DPO-t0.0001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__L3.1-Niitorm-8B-DPO-t0.0001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/L3.1-Niitorm-8B-DPO-t0.0001
a34150b5f63de4bc83d79b1de127faff3750289f
28.100642
7
8
false
true
true
false
true
0.878109
0.768867
76.886661
0.513423
30.513173
0.161631
16.163142
0.294463
5.928412
0.387979
7.264063
0.386636
31.848404
false
2024-09-19
2024-09-19
1
v000000/L3.1-Niitorm-8B-DPO-t0.0001 (Merge)
v000000_L3.1-Storniitova-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/L3.1-Storniitova-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/L3.1-Storniitova-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__L3.1-Storniitova-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/L3.1-Storniitova-8B
05b126857f43d1b1383e50f8c97d214ceb199723
28.281707
7
8
false
true
true
false
true
0.81354
0.781656
78.165601
0.515145
30.810993
0.146526
14.652568
0.28943
5.257271
0.402896
9.961979
0.377576
30.841829
false
2024-09-12
2024-09-18
1
v000000/L3.1-Storniitova-8B (Merge)
v000000_Qwen2.5-14B-Gutenberg-1e-Delta_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/Qwen2.5-14B-Gutenberg-1e-Delta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/Qwen2.5-14B-Gutenberg-1e-Delta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__Qwen2.5-14B-Gutenberg-1e-Delta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/Qwen2.5-14B-Gutenberg-1e-Delta
f624854b4380e01322e752ce4daadd49ac86580f
32.105096
apache-2.0
4
14
true
true
true
false
true
1.802387
0.804512
80.451203
0.63985
48.616672
0
0
0.328859
10.514541
0.407302
9.379427
0.493019
43.668735
false
2024-09-20
2024-09-28
1
v000000/Qwen2.5-14B-Gutenberg-1e-Delta (Merge)
v000000_Qwen2.5-14B-Gutenberg-Instruct-Slerpeno_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__Qwen2.5-14B-Gutenberg-Instruct-Slerpeno-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno
1069abb4c25855e67ffaefa08a0befbb376e7ca7
33.665023
apache-2.0
4
14
true
false
true
false
false
2.23179
0.485476
48.547631
0.651079
49.7394
0.213746
21.374622
0.364094
15.212528
0.469063
18.432812
0.538148
48.683141
false
2024-09-20
2024-09-28
1
v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno (Merge)
v000000_Qwen2.5-Lumen-14B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/Qwen2.5-Lumen-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/Qwen2.5-Lumen-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__Qwen2.5-Lumen-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/Qwen2.5-Lumen-14B
fbb1d184ed01dac52d307737893ebb6b0ace444c
32.200288
apache-2.0
15
14
true
true
true
false
true
1.836693
0.80636
80.636046
0.639081
48.507861
0
0
0.32802
10.402685
0.411396
10.291146
0.490276
43.363992
false
2024-09-20
2024-09-20
1
v000000/Qwen2.5-Lumen-14B (Merge)
vhab10_Llama-3.1-8B-Base-Instruct-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vhab10/Llama-3.1-8B-Base-Instruct-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vhab10/Llama-3.1-8B-Base-Instruct-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vhab10__Llama-3.1-8B-Base-Instruct-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vhab10/Llama-3.1-8B-Base-Instruct-SLERP
eccb4bde0dc91f586954109ecdce7c94f47e2625
19.249617
mit
1
8
true
false
true
false
false
0.806721
0.290712
29.071198
0.505744
29.926042
0.11858
11.858006
0.296141
6.152125
0.401063
9.366146
0.362118
29.124187
false
2024-09-16
2024-09-29
1
vhab10/Llama-3.1-8B-Base-Instruct-SLERP (Merge)
vhab10_Llama-3.2-Instruct-3B-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vhab10/Llama-3.2-Instruct-3B-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vhab10/Llama-3.2-Instruct-3B-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vhab10__Llama-3.2-Instruct-3B-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vhab10/Llama-3.2-Instruct-3B-TIES
0e8661730f40a6a279bd273cfe9fe46bbd0507dd
17.296562
mit
0
1
true
false
true
false
false
1.122926
0.472737
47.273678
0.433236
19.183159
0.095921
9.592145
0.269295
2.572707
0.349656
3.873698
0.291556
21.283983
false
2024-10-06
2024-11-23
1
vhab10/Llama-3.2-Instruct-3B-TIES (Merge)
vhab10_llama-3-8b-merged-linear_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vhab10/llama-3-8b-merged-linear" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vhab10/llama-3-8b-merged-linear</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vhab10__llama-3-8b-merged-linear-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vhab10/llama-3-8b-merged-linear
c37e7671b5ccfadbf3065fa5b48af05cd4f13292
23.911368
mit
0
4
true
true
true
false
true
1.304943
0.591663
59.166345
0.493709
27.816051
0.081571
8.1571
0.299497
6.599553
0.419052
11.68151
0.370429
30.047651
false
2024-09-26
2024-09-26
1
vhab10/llama-3-8b-merged-linear (Merge)
vicgalle_CarbonBeagle-11B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/CarbonBeagle-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/CarbonBeagle-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__CarbonBeagle-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/CarbonBeagle-11B
3fe9bf5327606d013b182fed17a472f5f043759b
22.470186
apache-2.0
9
10
true
false
true
false
true
0.915379
0.54153
54.152981
0.529365
33.060604
0.061934
6.193353
0.302013
6.935123
0.402031
9.18724
0.327626
25.291814
false
2024-01-21
2024-06-26
1
vicgalle/CarbonBeagle-11B (Merge)
vicgalle_CarbonBeagle-11B-truthy_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/CarbonBeagle-11B-truthy" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/CarbonBeagle-11B-truthy</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__CarbonBeagle-11B-truthy-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/CarbonBeagle-11B-truthy
476cd2a6d938bddb38dfbeb4cb21e3e34303413d
21.357727
apache-2.0
9
10
true
true
true
false
true
0.907273
0.521221
52.122147
0.534842
33.988376
0.05136
5.135952
0.299497
6.599553
0.373969
4.11276
0.335688
26.187574
false
2024-02-10
2024-07-13
0
vicgalle/CarbonBeagle-11B-truthy
vicgalle_Configurable-Hermes-2-Pro-Llama-3-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Configurable-Hermes-2-Pro-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Configurable-Hermes-2-Pro-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Configurable-Hermes-2-Pro-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Configurable-Hermes-2-Pro-Llama-3-8B
3cb5792509966a963645be24fdbeb2e7dc6cac15
22.351954
apache-2.0
6
8
true
true
true
false
true
0.748927
0.576251
57.625101
0.505484
30.509625
0.063444
6.344411
0.29698
6.263982
0.418365
10.06224
0.309757
23.306368
false
2024-05-02
2024-07-24
2
NousResearch/Meta-Llama-3-8B
vicgalle_Configurable-Llama-3.1-8B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Configurable-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Configurable-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Configurable-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Configurable-Llama-3.1-8B-Instruct
133b3ab1a5385ff9b3d17da2addfe3fc1fd6f733
28.010111
apache-2.0
12
8
true
true
true
false
true
0.79661
0.83124
83.124
0.504476
29.661398
0.172961
17.296073
0.274329
3.243848
0.384542
5.934375
0.359209
28.800975
false
2024-07-24
2024-08-05
0
vicgalle/Configurable-Llama-3.1-8B-Instruct
vicgalle_Configurable-Yi-1.5-9B-Chat_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Configurable-Yi-1.5-9B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Configurable-Yi-1.5-9B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Configurable-Yi-1.5-9B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Configurable-Yi-1.5-9B-Chat
992cb2232caae78eff6a836b2e0642f7cbf6018e
23.972567
apache-2.0
2
8
true
true
true
false
true
0.941909
0.432345
43.234507
0.54522
35.334445
0.073263
7.326284
0.343121
12.416107
0.427115
12.022656
0.401513
33.501404
false
2024-05-12
2024-06-26
0
vicgalle/Configurable-Yi-1.5-9B-Chat
vicgalle_ConfigurableBeagle-11B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/ConfigurableBeagle-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/ConfigurableBeagle-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__ConfigurableBeagle-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/ConfigurableBeagle-11B
bbc16dbf94b8e8a99bb3e2ada6755faf9c2990dd
22.635544
apache-2.0
3
10
true
true
true
false
true
0.879857
0.583445
58.344526
0.528659
32.392023
0.043807
4.380665
0.302013
6.935123
0.395302
7.379427
0.337434
26.381501
false
2024-02-17
2024-06-26
0
vicgalle/ConfigurableBeagle-11B
vicgalle_ConfigurableHermes-7B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/ConfigurableHermes-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/ConfigurableHermes-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__ConfigurableHermes-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/ConfigurableHermes-7B
1333a88eaf6591836b2d9825d1eaec7260f336c9
19.536295
apache-2.0
3
7
true
true
true
false
true
0.617282
0.54108
54.107989
0.457297
23.158164
0.047583
4.758308
0.276846
3.579418
0.405688
9.110938
0.302527
22.502955
false
2024-02-17
2024-06-26
0
vicgalle/ConfigurableHermes-7B
vicgalle_ConfigurableSOLAR-10.7B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/ConfigurableSOLAR-10.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/ConfigurableSOLAR-10.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__ConfigurableSOLAR-10.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/ConfigurableSOLAR-10.7B
9d9baad88ea9dbaa61881f15e4f0d16e931033b4
19.045696
apache-2.0
2
10
true
true
true
false
true
0.677681
0.509956
50.995581
0.486681
27.45095
0
0
0.298658
6.487696
0.380479
5.193229
0.31732
24.14672
false
2024-03-10
2024-06-26
0
vicgalle/ConfigurableSOLAR-10.7B
vicgalle_Humanish-RP-Llama-3.1-8B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Humanish-RP-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Humanish-RP-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Humanish-RP-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Humanish-RP-Llama-3.1-8B
d27aa731db1d390a8d17b0a4565c9231ee5ae8b9
25.347671
apache-2.0
6
8
true
true
true
false
true
0.753451
0.666926
66.692598
0.510039
29.95856
0.147281
14.728097
0.286913
4.9217
0.395208
8.267708
0.347656
27.517361
false
2024-08-03
2024-08-03
0
vicgalle/Humanish-RP-Llama-3.1-8B
vicgalle_Merge-Mistral-Prometheus-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Merge-Mistral-Prometheus-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Merge-Mistral-Prometheus-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Merge-Mistral-Prometheus-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Merge-Mistral-Prometheus-7B
a7083581b508ce83c74f9267f07024bd462e7161
16.574054
apache-2.0
1
7
true
false
true
false
true
0.630356
0.484801
48.480144
0.42014
18.410406
0.017372
1.73716
0.263423
1.789709
0.41
9.95
0.271692
19.076906
false
2024-05-04
2024-06-26
1
vicgalle/Merge-Mistral-Prometheus-7B (Merge)
vicgalle_Merge-Mixtral-Prometheus-8x7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Merge-Mixtral-Prometheus-8x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Merge-Mixtral-Prometheus-8x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Merge-Mixtral-Prometheus-8x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Merge-Mixtral-Prometheus-8x7B
ba53ee5b52a81e56b01e919c069a0d045cfd4e83
24.794158
apache-2.0
2
46
true
false
false
false
true
3.674009
0.574403
57.440259
0.53515
34.651421
0.094411
9.441088
0.308725
7.829978
0.40975
9.585417
0.368351
29.816785
false
2024-05-04
2024-06-26
1
vicgalle/Merge-Mixtral-Prometheus-8x7B (Merge)
vicgalle_Roleplay-Llama-3-8B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Roleplay-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Roleplay-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Roleplay-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Roleplay-Llama-3-8B
57297eb57dcc2c116f061d9dda341094203da01b
24.083124
apache-2.0
36
8
true
true
true
false
true
1.126159
0.732022
73.202215
0.501232
28.554604
0.095166
9.516616
0.260906
1.454139
0.352885
1.677344
0.370844
30.093824
false
2024-04-19
2024-06-26
0
vicgalle/Roleplay-Llama-3-8B
vihangd_smart-dan-sft-v0.1_4bit
4bit
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vihangd/smart-dan-sft-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vihangd/smart-dan-sft-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vihangd__smart-dan-sft-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vihangd/smart-dan-sft-v0.1
924b4a09153d4061fa9d58f03b10cd7cde7e3084
3.783096
apache-2.0
0
0
true
true
true
false
false
0.361025
0.157646
15.764616
0.306177
3.125599
0.004532
0.453172
0.255034
0.671141
0.350188
1.106771
0.114195
1.577275
false
2024-08-09
2024-08-20
0
vihangd/smart-dan-sft-v0.1
vonjack_MobileLLM-125M-HF_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vonjack/MobileLLM-125M-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vonjack/MobileLLM-125M-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vonjack__MobileLLM-125M-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vonjack/MobileLLM-125M-HF
7664f5e1b91faa04fac545f64db84c26316c7e63
5.464647
cc-by-nc-4.0
0
0
true
true
true
false
false
0.171811
0.210728
21.072754
0.30273
3.146584
0.003021
0.302115
0.260067
1.342282
0.378187
5.106771
0.116356
1.817376
false
2024-11-15
2024-11-15
0
vonjack/MobileLLM-125M-HF
vonjack_Phi-3.5-mini-instruct-hermes-fc-json_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/vonjack/Phi-3.5-mini-instruct-hermes-fc-json" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vonjack/Phi-3.5-mini-instruct-hermes-fc-json</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vonjack__Phi-3.5-mini-instruct-hermes-fc-json-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vonjack/Phi-3.5-mini-instruct-hermes-fc-json
4cacfb35723647d408f0414886d0dfe67404a14f
4.516525
apache-2.0
1
4
true
true
true
false
true
1.285189
0.141584
14.158433
0.297476
2.390836
0
0
0.254195
0.559284
0.404135
8.45026
0.113863
1.540337
false
2024-11-05
2024-11-05
1
vonjack/Phi-3.5-mini-instruct-hermes-fc-json (Merge)
vonjack_Qwen2.5-Coder-0.5B-Merged_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/vonjack/Qwen2.5-Coder-0.5B-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vonjack/Qwen2.5-Coder-0.5B-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vonjack__Qwen2.5-Coder-0.5B-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vonjack/Qwen2.5-Coder-0.5B-Merged
38e4789c0fc5fad359de2f7bafdb65c3ae26b95b
6.350287
0
0
false
true
true
false
true
0.496779
0.309971
30.997088
0.307602
3.588738
0
0
0.253356
0.447427
0.330344
0.826302
0.12018
2.242169
false
2024-11-19
2024-11-19
1
vonjack/Qwen2.5-Coder-0.5B-Merged (Merge)
vonjack_SmolLM2-1.7B-Merged_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | https://huggingface.co/vonjack/SmolLM2-1.7B-Merged 📑 https://huggingface.co/datasets/open-llm-leaderboard/vonjack__SmolLM2-1.7B-Merged-details | vonjack/SmolLM2-1.7B-Merged | 232d54a335220b0d83d6036f6d8df3971d3e79bb | 11.944703 |  | 0 | 1 | false | true | true | false | true | 0.311327 | 0.369797 | 36.979658 | 0.358655 | 10.76653 | 0.045317 | 4.531722 | 0.279362 | 3.914989 | 0.340792 | 3.832292 | 0.204787 | 11.643026 | false | 2024-11-18 | 2024-11-18 | 1 | vonjack/SmolLM2-1.7B-Merged (Merge)
vonjack_SmolLM2-135M-Merged_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | https://huggingface.co/vonjack/SmolLM2-135M-Merged 📑 https://huggingface.co/datasets/open-llm-leaderboard/vonjack__SmolLM2-135M-Merged-details | vonjack/SmolLM2-135M-Merged | a1700ca913a87ad713edfe57a2030a9d7c088970 | 5.73396 |  | 0 | 0 | false | true | true | false | true | 0.34551 | 0.248297 | 24.829674 | 0.309993 | 4.587041 | 0.003021 | 0.302115 | 0.238255 | 0 | 0.366187 | 3.440104 | 0.111203 | 1.244829 | false | 2024-11-15 | 2024-11-15 | 1 | vonjack/SmolLM2-135M-Merged (Merge)
vonjack_SmolLM2-360M-Merged_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | https://huggingface.co/vonjack/SmolLM2-360M-Merged 📑 https://huggingface.co/datasets/open-llm-leaderboard/vonjack__SmolLM2-360M-Merged-details | vonjack/SmolLM2-360M-Merged | 32bceedf56b29a4a9fdd459a36fbc7fae5e274c8 | 7.130731 |  | 0 | 0 | false | true | true | false | true | 0.385742 | 0.320587 | 32.058715 | 0.315485 | 4.741734 | 0.007553 | 0.755287 | 0.255872 | 0.782998 | 0.352729 | 3.357813 | 0.109791 | 1.08784 | false | 2024-11-15 | 2024-11-15 | 1 | vonjack/SmolLM2-360M-Merged (Merge)
w4r10ck_SOLAR-10.7B-Instruct-v1.0-uncensored_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | https://huggingface.co/w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored 📑 https://huggingface.co/datasets/open-llm-leaderboard/w4r10ck__SOLAR-10.7B-Instruct-v1.0-uncensored-details | w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored | baa7b3899e85af4b2f02b01fd93f203872140d27 | 20.577181 | apache-2.0 | 30 | 10 | true | true | true | false | false | 0.801971 | 0.388406 | 38.84061 | 0.530153 | 33.858639 | 0.003021 | 0.302115 | 0.294463 | 5.928412 | 0.463948 | 18.49349 | 0.334358 | 26.03982 | false | 2023-12-14 | 2024-10-11 | 0 | w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored
wannaphong_KhanomTanLLM-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | https://huggingface.co/wannaphong/KhanomTanLLM-Instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/wannaphong__KhanomTanLLM-Instruct-details | wannaphong/KhanomTanLLM-Instruct | 351239c92c0ff3304d1dd98fdf4ac054a8c1acc3 | 4.617874 | apache-2.0 | 1 | 3 | true | true | true | false | true | 0.401731 | 0.162118 | 16.211763 | 0.309312 | 3.944866 | 0.001511 | 0.151057 | 0.263423 | 1.789709 | 0.370062 | 4.291146 | 0.111868 | 1.318706 | false | 2024-08-24 | 2024-08-29 | 0 | wannaphong/KhanomTanLLM-Instruct
waqasali1707_Beast-Soul-new_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | https://huggingface.co/waqasali1707/Beast-Soul-new 📑 https://huggingface.co/datasets/open-llm-leaderboard/waqasali1707__Beast-Soul-new-details | waqasali1707/Beast-Soul-new | a23d68c4556d91a129de3f8fd8b9e0ff0890f4cc | 22.108388 |  | 0 | 7 | false | true | true | false | false | 0.636888 | 0.502987 | 50.298652 | 0.522495 | 33.044262 | 0.070242 | 7.024169 | 0.282718 | 4.362416 | 0.448563 | 14.503646 | 0.310755 | 23.417184 | false | 2024-08-07 | 2024-08-07 | 1 | waqasali1707/Beast-Soul-new (Merge)
wave-on-discord_qwent-7b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | https://huggingface.co/wave-on-discord/qwent-7b 📑 https://huggingface.co/datasets/open-llm-leaderboard/wave-on-discord__qwent-7b-details | wave-on-discord/qwent-7b | 40000e76d2a4d0ad054aff9fe873c5beb0e4925e | 8.734093 |  | 0 | 7 | false | true | true | false | false | 1.323496 | 0.201485 | 20.148539 | 0.42281 | 18.066398 | 0 | 0 | 0.265101 | 2.013423 | 0.381656 | 5.473698 | 0.160322 | 6.702497 | false | 2024-09-30 | 2024-09-30 | 1 | wave-on-discord/qwent-7b (Merge)
win10_ArliAI-RPMax-v1.3-merge-13.3B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | https://huggingface.co/win10/ArliAI-RPMax-v1.3-merge-13.3B 📑 https://huggingface.co/datasets/open-llm-leaderboard/win10__ArliAI-RPMax-v1.3-merge-13.3B-details | win10/ArliAI-RPMax-v1.3-merge-13.3B | 4d3ed351827f1afc1652e13aafeb1eae79b8f562 | 16.456101 |  | 0 | 13 | false | true | true | false | true | 1.451305 | 0.303826 | 30.382607 | 0.458139 | 23.0298 | 0.034743 | 3.47432 | 0.274329 | 3.243848 | 0.43251 | 14.163802 | 0.31998 | 24.442228 | false | 2024-11-16 | 2024-11-17 | 1 | win10/ArliAI-RPMax-v1.3-merge-13.3B (Merge)
win10_Breeze-13B-32k-Instruct-v1_0_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | https://huggingface.co/win10/Breeze-13B-32k-Instruct-v1_0 📑 https://huggingface.co/datasets/open-llm-leaderboard/win10__Breeze-13B-32k-Instruct-v1_0-details | win10/Breeze-13B-32k-Instruct-v1_0 | 220c957cf5d9c534a4ef75c11a18221c461de40a | 15.411206 | apache-2.0 | 0 | 12 | true | false | true | false | true | 1.448811 | 0.358431 | 35.843118 | 0.461123 | 25.258699 | 0.009819 | 0.981873 | 0.264262 | 1.901566 | 0.420198 | 11.058073 | 0.256815 | 17.423907 | false | 2024-06-26 | 2024-06-26 | 0 | win10/Breeze-13B-32k-Instruct-v1_0
win10_EVA-Norns-Qwen2.5-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | https://huggingface.co/win10/EVA-Norns-Qwen2.5-v0.1 📑 https://huggingface.co/datasets/open-llm-leaderboard/win10__EVA-Norns-Qwen2.5-v0.1-details | win10/EVA-Norns-Qwen2.5-v0.1 | 90c3ca66e700b4a7d2c509634f9b9748a2e4c3ab | 24.657872 |  | 1 | 7 | false | true | true | false | true | 0.656661 | 0.621963 | 62.196306 | 0.507241 | 30.060942 | 0.154834 | 15.483384 | 0.285235 | 4.697987 | 0.40451 | 8.563802 | 0.342503 | 26.944814 | false | 2024-11-17 | 2024-11-18 | 1 | win10/EVA-Norns-Qwen2.5-v0.1 (Merge)
win10_Llama-3.2-3B-Instruct-24-9-29_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | https://huggingface.co/win10/Llama-3.2-3B-Instruct-24-9-29 📑 https://huggingface.co/datasets/open-llm-leaderboard/win10__Llama-3.2-3B-Instruct-24-9-29-details | win10/Llama-3.2-3B-Instruct-24-9-29 | 4defb10e2415111abb873d695dd40c387c1d6d57 | 23.929169 | llama3.2 | 0 | 3 | true | true | true | false | true | 0.713606 | 0.733221 | 73.322119 | 0.461423 | 24.196426 | 0.166163 | 16.616314 | 0.274329 | 3.243848 | 0.355521 | 1.440104 | 0.322806 | 24.756206 | false | 2024-09-29 | 2024-10-11 | 2 | meta-llama/Llama-3.2-3B-Instruct
win10_Norns-Qwen2.5-12B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | https://huggingface.co/win10/Norns-Qwen2.5-12B 📑 https://huggingface.co/datasets/open-llm-leaderboard/win10__Norns-Qwen2.5-12B-details | win10/Norns-Qwen2.5-12B | 464793295c8633a95e6faedad24dfa8f0fd35663 | 16.386375 |  | 1 | 12 | false | true | true | false | true | 1.622972 | 0.489697 | 48.969734 | 0.461892 | 23.769257 | 0.004532 | 0.453172 | 0.283557 | 4.474273 | 0.35549 | 2.202865 | 0.266041 | 18.448951 | false | 2024-11-17 | 2024-11-17 | 1 | win10/Norns-Qwen2.5-12B (Merge)
win10_Norns-Qwen2.5-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | https://huggingface.co/win10/Norns-Qwen2.5-7B 📑 https://huggingface.co/datasets/open-llm-leaderboard/win10__Norns-Qwen2.5-7B-details | win10/Norns-Qwen2.5-7B | 148d9156f734a8050812892879cf13d1ca01f137 | 24.593277 |  | 0 | 7 | false | true | true | false | true | 0.649914 | 0.612221 | 61.222113 | 0.507289 | 30.250415 | 0.155589 | 15.558912 | 0.284396 | 4.58613 | 0.408479 | 9.126563 | 0.34134 | 26.815529 | false | 2024-11-17 | 2024-11-18 | 1 | win10/Norns-Qwen2.5-7B (Merge)
win10_llama3-13.45b-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | https://huggingface.co/win10/llama3-13.45b-Instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/win10__llama3-13.45b-Instruct-details | win10/llama3-13.45b-Instruct | 94cc0f415e355c6d3d47168a6ff5239ca586904a | 17.277282 | llama3 | 1 | 13 | true | false | true | false | true | 2.136535 | 0.414435 | 41.443481 | 0.486542 | 26.67569 | 0.020393 | 2.039275 | 0.258389 | 1.118568 | 0.38476 | 6.328385 | 0.334525 | 26.058289 | false | 2024-06-09 | 2024-06-26 | 1 | win10/llama3-13.45b-Instruct (Merge)
winglian_Llama-3-8b-64k-PoSE_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | https://huggingface.co/winglian/Llama-3-8b-64k-PoSE 📑 https://huggingface.co/datasets/open-llm-leaderboard/winglian__Llama-3-8b-64k-PoSE-details | winglian/Llama-3-8b-64k-PoSE | 5481d9b74a3ec5a95789673e194c8ff86e2bc2bc | 11.004738 |  | 74 | 8 | false | true | true | false | true | 0.911021 | 0.285691 | 28.569086 | 0.370218 | 13.307317 | 0.033233 | 3.323263 | 0.260906 | 1.454139 | 0.339552 | 3.077344 | 0.246676 | 16.297281 | false | 2024-04-24 | 2024-06-26 | 0 | winglian/Llama-3-8b-64k-PoSE
winglian_llama-3-8b-256k-PoSE_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | https://huggingface.co/winglian/llama-3-8b-256k-PoSE 📑 https://huggingface.co/datasets/open-llm-leaderboard/winglian__llama-3-8b-256k-PoSE-details | winglian/llama-3-8b-256k-PoSE | 93e7b0b6433c96583ffcef3bc47203e6fdcbbe8b | 6.557715 |  | 42 | 8 | false | true | true | false | true | 1.050723 | 0.290911 | 29.091145 | 0.315658 | 5.502849 | 0.015106 | 1.510574 | 0.25755 | 1.006711 | 0.331552 | 0.94401 | 0.111619 | 1.291002 | false | 2024-04-26 | 2024-06-26 | 0 | winglian/llama-3-8b-256k-PoSE
xMaulana_FinMatcha-3B-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | https://huggingface.co/xMaulana/FinMatcha-3B-Instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/xMaulana__FinMatcha-3B-Instruct-details | xMaulana/FinMatcha-3B-Instruct | be2c0c04fc4dc3fb93631e3c663721da92fea8fc | 24.016243 | apache-2.0 | 0 | 3 | true | true | true | false | true | 6.577035 | 0.754828 | 75.48283 | 0.453555 | 23.191023 | 0.135952 | 13.595166 | 0.269295 | 2.572707 | 0.363333 | 5.016667 | 0.318152 | 24.239066 | false | 2024-09-29 | 2024-10-22 | 1 | xMaulana/FinMatcha-3B-Instruct (Merge)
xinchen9_Llama3.1_8B_Instruct_CoT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | https://huggingface.co/xinchen9/Llama3.1_8B_Instruct_CoT 📑 https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Llama3.1_8B_Instruct_CoT-details | xinchen9/Llama3.1_8B_Instruct_CoT | cab1b33ddff08de11c5daea8ae079d126d503d8b | 16.190743 | apache-2.0 | 0 | 8 | true | true | true | false | false | 1.856552 | 0.297357 | 29.735657 | 0.439821 | 21.142866 | 0.05287 | 5.287009 | 0.302013 | 6.935123 | 0.437062 | 13.166146 | 0.287899 | 20.87766 | false | 2024-09-16 | 2024-09-19 | 0 | xinchen9/Llama3.1_8B_Instruct_CoT
xinchen9_Llama3.1_CoT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | https://huggingface.co/xinchen9/Llama3.1_CoT 📑 https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Llama3.1_CoT-details | xinchen9/Llama3.1_CoT | 3cb467f51a59ff163bb942fcde3ef60573c12b79 | 13.351283 | apache-2.0 | 0 | 8 | true | true | true | false | true | 0.950099 | 0.224616 | 22.461624 | 0.434101 | 19.899124 | 0.015106 | 1.510574 | 0.288591 | 5.145414 | 0.430458 | 11.773958 | 0.273853 | 19.317007 | false | 2024-09-04 | 2024-09-06 | 0 | xinchen9/Llama3.1_CoT
xinchen9_Llama3.1_CoT_V1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | https://huggingface.co/xinchen9/Llama3.1_CoT_V1 📑 https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Llama3.1_CoT_V1-details | xinchen9/Llama3.1_CoT_V1 | c5ed4b8bfc364ebae1843af14799818551f5251f | 14.394947 | apache-2.0 | 0 | 8 | true | true | true | false | false | 1.873462 | 0.245299 | 24.529914 | 0.4376 | 20.166003 | 0.01284 | 1.283988 | 0.279362 | 3.914989 | 0.457219 | 16.41901 | 0.280502 | 20.055777 | false | 2024-09-06 | 2024-09-07 | 0 | xinchen9/Llama3.1_CoT_V1
xinchen9_Mistral-7B-CoT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | https://huggingface.co/xinchen9/Mistral-7B-CoT 📑 https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Mistral-7B-CoT-details | xinchen9/Mistral-7B-CoT | 9a3c8103dac20d5497d1b8fc041bb5125ff4dc00 | 11.202955 | apache-2.0 | 0 | 7 | true | true | true | false | false | 1.888689 | 0.279871 | 27.987074 | 0.387268 | 14.806193 | 0.019637 | 1.963746 | 0.249161 | 0 | 0.399427 | 8.195052 | 0.228391 | 14.265662 | false | 2024-09-09 | 2024-09-23 | 0 | xinchen9/Mistral-7B-CoT
xinchen9_llama3-b8-ft-dis_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | https://huggingface.co/xinchen9/llama3-b8-ft-dis 📑 https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__llama3-b8-ft-dis-details | xinchen9/llama3-b8-ft-dis | e4da730f28f79543262de37908943c35f8df81fe | 13.897963 | apache-2.0 | 0 | 8 | true | true | true | false | false | 1.062327 | 0.154599 | 15.459869 | 0.462579 | 24.727457 | 0.034743 | 3.47432 | 0.312919 | 8.389262 | 0.365375 | 6.405208 | 0.324385 | 24.931664 | false | 2024-06-28 | 2024-07-11 | 0 | xinchen9/llama3-b8-ft-dis
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table 📑 https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table-details | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table | c083d6796f54f66b4cec2261657a02801c761093 | 22.421029 |  | 0 | 8 | false | true | true | false | true | 0.624231 | 0.637475 | 63.747523 | 0.491227 | 27.422821 | 0.067976 | 6.797583 | 0.259228 | 1.230425 | 0.382 | 5.483333 | 0.3686 | 29.844489 | false | 2024-09-30 | 2024-10-01 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table 📑 https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table-details | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table | 5416d34b5243559914a377ee9d95ce4830bf8dba | 24.502405 |  | 0 | 8 | false | true | true | false | true | 0.750264 | 0.727451 | 72.745094 | 0.505686 | 29.398353 | 0.084592 | 8.459215 | 0.260067 | 1.342282 | 0.381906 | 5.104948 | 0.369681 | 29.964539 | false | 2024-09-30 | 2024-10-01 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table 📑 https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table-details | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table | 235204157d7fac0d64fa609d5aee3cebb49ccd11 | 22.236354 |  | 0 | 8 | false | true | true | false | true | 0.671741 | 0.656859 | 65.685936 | 0.495183 | 27.6952 | 0.064955 | 6.495468 | 0.259228 | 1.230425 | 0.359396 | 2.291146 | 0.37018 | 30.019947 | false | 2024-09-30 | 2024-09-30 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table 📑 https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table-details | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table | 9db00cbbba84453b18956fcc76f264f94a205955 | 22.935265 |  | 0 | 8 | false | true | true | false | true | 0.719228 | 0.66208 | 66.207995 | 0.500449 | 28.508587 | 0.077795 | 7.779456 | 0.259228 | 1.230425 | 0.380542 | 5.001042 | 0.359957 | 28.884087 | false | 2024-09-30 | 2024-09-30 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001 📑 https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001-details | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001 | 1062757826de031a4ae82277e6e737e19e82e514 | 21.845481 |  | 0 | 8 | false | true | true | false | true | 0.615003 | 0.604228 | 60.422789 | 0.493606 | 27.613714 | 0.064955 | 6.495468 | 0.259228 | 1.230425 | 0.379333 | 5.216667 | 0.370844 | 30.093824 | false | 2024-09-30 | 2024-10-01 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002 📑 https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002-details | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002 | e5d2f179b4a7bd851dcf2b7db6358b13001bf1af | 23.938825 |  | 0 | 8 | false | true | true | false | true | 0.841468 | 0.713188 | 71.318768 | 0.499638 | 28.574879 | 0.069486 | 6.94864 | 0.258389 | 1.118568 | 0.387208 | 6.067708 | 0.366439 | 29.604388 | false | 2024-09-30 | 2024-10-01 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001 📑 https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001-details | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001 | 0e319ad47ed2b2636b72d07ee9b32657e1e50412 | 21.224624 |  | 0 | 8 | false | true | true | false | true | 0.679841 | 0.594711 | 59.471092 | 0.489922 | 26.943904 | 0.073263 | 7.326284 | 0.259228 | 1.230425 | 0.358094 | 2.328385 | 0.370429 | 30.047651 | false | 2024-09-30 | 2024-09-30 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002 📑 https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002-details | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002 | 0877f2458ea667edcf9213383df41294c788190f | 22.69358 |  | 0 | 8 | false | true | true | false | true | 0.769119 | 0.645319 | 64.531887 | 0.495108 | 28.046978 | 0.067976 | 6.797583 | 0.260067 | 1.342282 | 0.393875 | 7.334375 | 0.352975 | 28.108378 | false | 2024-09-30 | 2024-10-01 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table 📑 https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table-details | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table | d2b87100e5ba3215fddbd308bb17b7bf12fe6c9e | 21.01778 |  | 0 | 8 | false | true | true | false | true | 0.98643 | 0.575602 | 57.560163 | 0.490121 | 26.866404 | 0.079305 | 7.930514 | 0.259228 | 1.230425 | 0.365969 | 2.979427 | 0.365858 | 29.539746 | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table 📑 https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table-details | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table | 19a48ccf5ea463afbbbc61d650b8fb63ff2d94c7 | 23.969226 |  | 0 | 8 | false | true | true | false | true | 0.590153 | 0.703446 | 70.344575 | 0.509187 | 29.731239 | 0.086858 | 8.685801 | 0.259228 | 1.230425 | 0.373906 | 3.904948 | 0.369265 | 29.918366 | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table 📑 https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table-details | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table | 0fe230b3432fb2b0f89942d7926291a4dbeb2820 | 21.781466 |  | 0 | 8 | false | true | true | false | true | 0.665521 | 0.602379 | 60.237946 | 0.496953 | 27.892403 | 0.086103 | 8.610272 | 0.259228 | 1.230425 | 0.367365 | 3.18724 | 0.365775 | 29.530511 | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table 📑 https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table-details | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table | d1e19da1029f2d4d45de015754bc52dcb1ea5570 | 23.059714 |  | 0 | 8 | false | true | true | false | true | 0.588419 | 0.66203 | 66.203008 | 0.499994 | 28.439824 | 0.083082 | 8.308157 | 0.259228 | 1.230425 | 0.381812 | 5.126562 | 0.361453 | 29.05031 | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table
xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001 📑 https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001-details | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001 | a478aa202c59773eba615ae37feb4cc750757695 | 20.364052 |  | 0 | 8 | false | true | true | false | true | 0.586443 | 0.533636 | 53.363631 | 0.491487 | 27.145374 | 0.06571 | 6.570997 | 0.259228 | 1.230425 | 0.377969 | 4.71276 | 0.36245 | 29.161126 | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001
xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002 📑 https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002-details | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002 | 8ef9ef7e2bf522e707a7b090af55f2ec1eafd4b9 | 23.261322 |  | 0 | 8 | false | true | true | false | true | 0.869474 | 0.685161 | 68.516093 | 0.507516 | 29.74055 | 0.054381 | 5.438066 | 0.258389 | 1.118568 | 0.383177 | 5.630469 | 0.362118 | 29.124187 | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002
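One relationship these rows make checkable: the Average column is the unweighted mean of the six normalized benchmark scores (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO). A minimal sketch verifying that against the xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table row above; the values are copied from that row, and nothing beyond the Python standard library is assumed:

from statistics import mean

# Normalized scores copied from the xukp20/...SPPO-Iter3_bt_8b-table row:
# IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO.
scores = [70.344575, 29.731239, 8.685801, 1.230425, 3.904948, 29.918366]
reported_average = 23.969226  # the row's Average cell

# The leaderboard average is the plain mean of the six normalized scores;
# it matches the reported cell to the precision stored in the table.
assert abs(mean(scores) - reported_average) < 1e-6
print(f"recomputed average: {mean(scores):.6f}")

Running this prints 23.969226, matching the stored cell; the same check applies to any other complete row.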