Column schema (name: dtype, observed range or class count):

eval_name: string (length 12–111)
Precision: string (3 classes)
Type: string (6 classes)
T: string (6 classes)
Weight type: string (2 classes)
Architecture: string (51 classes)
Model: string (length 355–689)
fullname: string (length 4–102)
Model sha: string (length 0–40)
Average ⬆️: float64 (1.41–51.2)
Hub License: string (25 classes)
Hub ❤️: int64 (0–5.85k)
#Params (B): int64 (-1–140)
Available on the hub: bool (2 classes)
Not_Merged: bool (2 classes)
MoE: bool (2 classes)
Flagged: bool (2 classes)
Chat Template: bool (2 classes)
CO₂ cost (kg): float64 (0.03–107)
IFEval Raw: float64 (0–0.87)
IFEval: float64 (0–86.7)
BBH Raw: float64 (0.28–0.75)
BBH: float64 (0.81–63.5)
MATH Lvl 5 Raw: float64 (0–0.51)
MATH Lvl 5: float64 (0–50.7)
GPQA Raw: float64 (0.22–0.44)
GPQA: float64 (0–24.9)
MUSR Raw: float64 (0.29–0.59)
MUSR: float64 (0–36.5)
MMLU-PRO Raw: float64 (0.1–0.72)
MMLU-PRO: float64 (0–68.7)
Maintainer's Highlight: bool (2 classes)
Upload To Hub Date: string (411 classes)
Submission Date: string (158 classes)
Generation: int64 (0–10)
Base Model: string (length 4–102)
nbeerbower_Mistral-Small-Drummer-22B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Small-Drummer-22B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Small-Drummer-22B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Small-Drummer-22B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/Mistral-Small-Drummer-22B
53b21ece0c64ffc8aba81f294ad19e2c06e9852c
29.74388
other
11
22
true
true
true
false
false
1.612722
0.633129
63.312899
0.57932
40.12177
0.18429
18.429003
0.343121
12.416107
0.406365
9.795573
0.409491
34.387928
false
2024-09-26
2024-10-01
1
nbeerbower/Mistral-Small-Drummer-22B (Merge)
nbeerbower_Mistral-Small-Gutenberg-Doppel-22B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/Mistral-Small-Gutenberg-Doppel-22B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Mistral-Small-Gutenberg-Doppel-22B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Mistral-Small-Gutenberg-Doppel-22B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/Mistral-Small-Gutenberg-Doppel-22B
d8091aad5f882b714321e4d51f504cc61996ee67
27.858747
other
9
22
true
true
true
false
false
1.588603
0.489323
48.932277
0.585893
40.931345
0.21148
21.148036
0.346477
12.863535
0.397063
8.566146
0.4124
34.711141
false
2024-09-25
2024-09-25
1
nbeerbower/Mistral-Small-Gutenberg-Doppel-22B (Merge)
nbeerbower_Nemo-Loony-12B-experimental_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/Nemo-Loony-12B-experimental" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Nemo-Loony-12B-experimental</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Nemo-Loony-12B-experimental-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/Nemo-Loony-12B-experimental
7b06f30502a9b58c028ac1079e1b3d2988b76866
10.431803
0
12
false
true
true
false
true
1.237582
0.373444
37.344357
0.382222
12.974588
0.01284
1.283988
0.270134
2.684564
0.334063
1.757812
0.15891
6.545508
false
2024-11-26
2024-11-26
1
nbeerbower/Nemo-Loony-12B-experimental (Merge)
nbeerbower_Qwen2.5-Gutenberg-Doppel-14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/Qwen2.5-Gutenberg-Doppel-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Qwen2.5-Gutenberg-Doppel-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Qwen2.5-Gutenberg-Doppel-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/Qwen2.5-Gutenberg-Doppel-14B
11a5060f9e7315ea07241106f086ac4694dded60
32.302115
apache-2.0
11
14
true
true
true
false
true
1.690612
0.809083
80.908323
0.638174
48.238909
0
0
0.333054
11.073826
0.410063
10.024479
0.492104
43.567154
false
2024-11-11
2024-11-11
1
nbeerbower/Qwen2.5-Gutenberg-Doppel-14B (Merge)
nbeerbower_SmolNemo-12B-FFT-experimental_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/SmolNemo-12B-FFT-experimental" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/SmolNemo-12B-FFT-experimental</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__SmolNemo-12B-FFT-experimental-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/SmolNemo-12B-FFT-experimental
d8d7a90ae9b9cb79cdc0912a685c3cb8d7a25560
8.320055
apache-2.0
0
12
true
true
true
false
true
1.225415
0.334801
33.480055
0.333609
6.542439
0.002266
0.226586
0.260067
1.342282
0.384698
5.920573
0.121676
2.408392
false
2024-11-25
2024-11-26
1
nbeerbower/SmolNemo-12B-FFT-experimental (Merge)
nbeerbower_Stella-mistral-nemo-12B-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/Stella-mistral-nemo-12B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/Stella-mistral-nemo-12B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__Stella-mistral-nemo-12B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/Stella-mistral-nemo-12B-v2
b81bab28f7dcb25a0aa0fe4dcf957f3083ee6b43
22.430369
3
12
false
true
true
false
false
1.740872
0.327431
32.743122
0.548375
35.364516
0.112538
11.253776
0.332215
10.961969
0.430396
14.432812
0.368434
29.82602
false
2024-09-07
2024-09-14
1
nbeerbower/Stella-mistral-nemo-12B-v2 (Merge)
nbeerbower_gemma2-gutenberg-27B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/gemma2-gutenberg-27B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/gemma2-gutenberg-27B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__gemma2-gutenberg-27B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/gemma2-gutenberg-27B
d4febe52e8b7b13a98126dbf1716ed1329f48922
10.108961
gemma
4
27
true
true
true
false
false
7.695458
0.294708
29.470804
0.379657
13.091525
0
0
0.272651
3.020134
0.372729
4.157813
0.198221
10.91349
false
2024-09-09
2024-09-23
1
nbeerbower/gemma2-gutenberg-27B (Merge)
nbeerbower_gemma2-gutenberg-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/gemma2-gutenberg-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/gemma2-gutenberg-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__gemma2-gutenberg-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/gemma2-gutenberg-9B
ebdab2d41f257fc9e7c858498653644d13386ce5
22.649257
gemma
12
9
true
true
true
false
false
2.809609
0.279595
27.959481
0.59509
42.355611
0.016616
1.661631
0.338087
11.744966
0.45951
16.705469
0.419215
35.468381
false
2024-07-14
2024-08-03
1
nbeerbower/gemma2-gutenberg-9B (Merge)
nbeerbower_llama-3-gutenberg-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/llama-3-gutenberg-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/llama-3-gutenberg-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__llama-3-gutenberg-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/llama-3-gutenberg-8B
4ed3aac5e30c078bee79ae193c2d301d38860b20
21.296229
other
7
8
true
true
true
false
false
0.883569
0.437191
43.71911
0.49936
27.958133
0.077795
7.779456
0.301174
6.823266
0.407302
10.046094
0.383062
31.451315
false
2024-05-05
2024-07-10
1
nbeerbower/llama-3-gutenberg-8B (Merge)
nbeerbower_llama3.1-cc-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/llama3.1-cc-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/llama3.1-cc-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__llama3.1-cc-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/llama3.1-cc-8B
5269bb26f1afe005f144564f484e7554f185239f
20.256042
llama3
1
8
true
true
true
false
false
0.937237
0.506809
50.68086
0.487119
26.483812
0.070997
7.099698
0.285235
4.697987
0.38851
6.497135
0.334691
26.076758
false
2024-08-18
2024-09-14
1
nbeerbower/llama3.1-cc-8B (Merge)
nbeerbower_mistral-nemo-bophades-12B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-bophades-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-bophades-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-bophades-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/mistral-nemo-bophades-12B
252a358e099f77a0a28125e00a57aa3a107b3910
24.847434
apache-2.0
8
12
true
true
true
false
true
2.052347
0.679441
67.944055
0.498847
29.543905
0.070242
7.024169
0.285235
4.697987
0.417781
12.089323
0.350066
27.785165
false
2024-08-13
2024-09-03
1
nbeerbower/mistral-nemo-bophades-12B (Merge)
nbeerbower_mistral-nemo-cc-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-cc-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-cc-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-cc-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/mistral-nemo-cc-12B
fc32293e0b022d6daef9bfdb0c54d57a5226bf9a
17.077529
apache-2.0
1
12
true
true
true
false
false
1.494622
0.143532
14.353249
0.539941
34.446547
0.018127
1.812689
0.315436
8.724832
0.442365
14.26224
0.359791
28.865618
false
2024-08-18
2024-09-14
1
nbeerbower/mistral-nemo-cc-12B (Merge)
nbeerbower_mistral-nemo-gutades-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-gutades-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-gutades-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-gutades-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/mistral-nemo-gutades-12B
5689f929808a6165f94ba43f872b944a4bdaaea3
21.000396
apache-2.0
2
12
true
true
true
false
false
1.82462
0.342519
34.251896
0.540719
34.574408
0.113293
11.329305
0.315436
8.724832
0.404042
8.671875
0.356051
28.450059
false
2024-09-17
2024-09-23
1
nbeerbower/mistral-nemo-gutades-12B (Merge)
nbeerbower_mistral-nemo-gutenberg-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-gutenberg-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-gutenberg-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-gutenberg-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/mistral-nemo-gutenberg-12B
6aeb6f769a53eb111839db8f439b614730e39593
20.998979
apache-2.0
6
12
true
true
true
false
false
1.574815
0.350387
35.038697
0.528136
32.433874
0.114804
11.480363
0.307047
7.606264
0.417063
10.966146
0.356217
28.468528
false
2024-08-12
2024-09-03
1
nbeerbower/mistral-nemo-gutenberg-12B (Merge)
nbeerbower_mistral-nemo-gutenberg-12B-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-gutenberg-12B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-gutenberg-12B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-gutenberg-12B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/mistral-nemo-gutenberg-12B-v2
86bf9c105ff40835132e41699ac1a76ee0e5b683
24.117039
apache-2.0
26
12
true
true
true
false
true
2.870908
0.62034
62.033959
0.53972
34.730616
0.024924
2.492447
0.277685
3.691275
0.428698
13.98724
0.3499
27.766696
false
2024-08-13
2024-09-03
1
nbeerbower/mistral-nemo-gutenberg-12B-v2 (Merge)
nbeerbower_mistral-nemo-gutenberg-12B-v3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-gutenberg-12B-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-gutenberg-12B-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-gutenberg-12B-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/mistral-nemo-gutenberg-12B-v3
3e1a716281f23280abd72e402139c578faca175a
19.177219
apache-2.0
10
12
true
true
true
false
false
1.835369
0.218271
21.827085
0.544066
34.957915
0.05287
5.287009
0.314597
8.612975
0.445031
14.995573
0.364445
29.382757
false
2024-08-15
2024-09-03
1
nbeerbower/mistral-nemo-gutenberg-12B-v3 (Merge)
nbeerbower_mistral-nemo-gutenberg-12B-v4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-gutenberg-12B-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-gutenberg-12B-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-gutenberg-12B-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/mistral-nemo-gutenberg-12B-v4
59409afe585ae6945a588c867f879a9d31e571e6
19.750865
apache-2.0
15
12
true
true
true
false
false
1.760461
0.23793
23.79298
0.526903
31.971258
0.120846
12.084592
0.316275
8.836689
0.410427
13.203385
0.357547
28.616283
false
2024-08-22
2024-09-03
1
nbeerbower/mistral-nemo-gutenberg-12B-v4 (Merge)
nbeerbower_mistral-nemo-gutenberg2-12B-test_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-gutenberg2-12B-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-gutenberg2-12B-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-gutenberg2-12B-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/mistral-nemo-gutenberg2-12B-test
10da6150b0bedf8fd59206d72c4c0335ac665df3
20.920227
apache-2.0
1
12
true
true
true
false
false
1.675027
0.338472
33.847192
0.525478
32.044759
0.113293
11.329305
0.317114
8.948546
0.415729
10.966146
0.355469
28.385417
false
2024-09-24
2024-09-25
1
nbeerbower/mistral-nemo-gutenberg2-12B-test (Merge)
nbeerbower_mistral-nemo-wissenschaft-12B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nbeerbower/mistral-nemo-wissenschaft-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/mistral-nemo-wissenschaft-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__mistral-nemo-wissenschaft-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbeerbower/mistral-nemo-wissenschaft-12B
2480f9924415c72fe00ae9391bb15a6d05c889eb
24.67911
apache-2.0
6
12
true
true
true
false
true
1.429373
0.652013
65.201332
0.504031
29.567999
0.071752
7.175227
0.292785
5.704698
0.417781
12.289323
0.353225
28.136082
false
2024-08-12
2024-08-30
1
nbeerbower/mistral-nemo-wissenschaft-12B (Merge)
nbrahme_IndusQ_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/nbrahme/IndusQ" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbrahme/IndusQ</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbrahme__IndusQ-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nbrahme/IndusQ
d4224f753e6a2d6e7476752fb927c26c55ec9467
5.623546
0
1
false
true
true
false
true
0.150617
0.243975
24.397488
0.30624
3.747096
0
0
0.265101
2.013423
0.336635
2.246094
0.112035
1.337175
false
2024-09-18
0
Removed
netcat420_MFANN-Llama3.1-Abliterated-SLERP-TIES-V2_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-SLERP-TIES-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V2
0e649dd355ad7d562f9346c96642c24eff35338e
19.213728
apache-2.0
0
8
true
false
true
false
false
0.704113
0.42098
42.097967
0.492376
26.93837
0.076284
7.628399
0.29698
6.263982
0.37276
4.328385
0.352227
28.025266
false
2024-11-08
2024-11-09
0
netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V2
netcat420_MFANN-Llama3.1-Abliterated-SLERP-TIES-V3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-SLERP-TIES-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V3
381cf003a5e28d2b273226364b568cc60b857b5b
19.22203
2
8
false
true
true
false
false
0.72091
0.423802
42.380218
0.491402
26.978851
0.075529
7.55287
0.29698
6.263982
0.374062
4.491146
0.348986
27.665115
false
2024-11-25
2024-11-26
1
netcat420/MFANN-Llama3.1-Abliterated-SLERP-TIES-V3 (Merge)
netcat420_MFANN-Llama3.1-Abliterated-SLERP-V4_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-SLERP-V4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-Llama3.1-Abliterated-SLERP-V4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-SLERP-V4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-Llama3.1-Abliterated-SLERP-V4
af160f1cf089ccbcbf00f99b951797a1f3daeb04
19.412059
apache-2.0
0
8
true
false
true
false
false
0.722467
0.416883
41.688276
0.490897
26.706074
0.068731
6.873112
0.305369
7.38255
0.382094
5.861719
0.351646
27.960624
false
2024-11-08
2024-11-09
0
netcat420/MFANN-Llama3.1-Abliterated-SLERP-V4
netcat420_MFANN-Llama3.1-Abliterated-SLERP-V5_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-SLERP-V5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-Llama3.1-Abliterated-SLERP-V5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-SLERP-V5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-Llama3.1-Abliterated-SLERP-V5
e0502b359816fe3ecd4f7206e5230398604fdfe2
19.493723
2
8
false
true
true
false
false
0.705626
0.432895
43.289472
0.495189
27.367143
0.081571
8.1571
0.293624
5.816555
0.378125
5.165625
0.344498
27.166445
false
2024-11-25
2024-11-26
1
netcat420/MFANN-Llama3.1-Abliterated-SLERP-V5 (Merge)
netcat420_MFANN-Llama3.1-Abliterated-Slerp-TIES_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-Slerp-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-Llama3.1-Abliterated-Slerp-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-Slerp-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-Llama3.1-Abliterated-Slerp-TIES
dbe0a3b69206c042de2b0a96fc156feeecaa49c7
19.134712
2
8
false
true
true
false
false
0.773314
0.429347
42.934746
0.496751
27.599829
0.059668
5.966767
0.291946
5.592841
0.368698
4.58724
0.353142
28.126847
false
2024-10-28
2024-10-29
1
netcat420/MFANN-Llama3.1-Abliterated-Slerp-TIES (Merge)
netcat420_MFANN-Llama3.1-Abliterated-Slerp-V3.2_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-Llama3.1-Abliterated-Slerp-V3.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-Llama3.1-Abliterated-Slerp-V3.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-Llama3.1-Abliterated-Slerp-V3.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-Llama3.1-Abliterated-Slerp-V3.2
56abb76e65cbf9dc49af662b09894d119d49705a
18.91525
1
8
false
true
true
false
false
0.741419
0.412811
41.281134
0.497825
27.774394
0.061934
6.193353
0.287752
5.033557
0.375427
5.128385
0.352726
28.080674
false
2024-10-28
2024-10-29
1
netcat420/MFANN-Llama3.1-Abliterated-Slerp-V3.2 (Merge)
netcat420_MFANN-llama3.1-Abliterated-SLERP_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-llama3.1-Abliterated-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-llama3.1-Abliterated-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-llama3.1-Abliterated-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-llama3.1-Abliterated-SLERP
0c7b2916727e6c28bbca2aa613b8247b66905915
13.90681
1
8
false
true
true
false
false
0.773579
0.259063
25.906262
0.45745
22.280625
0.049849
4.984894
0.27349
3.131991
0.380917
5.714583
0.292803
21.422503
false
2024-09-25
2024-10-07
1
netcat420/MFANN-llama3.1-Abliterated-SLERP (Merge)
netcat420_MFANN-llama3.1-abliterated-SLERP-v3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-llama3.1-abliterated-SLERP-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-llama3.1-abliterated-SLERP-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-llama3.1-abliterated-SLERP-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-llama3.1-abliterated-SLERP-v3
f90a20024060942826302c30860572c227dd4013
18.080026
llama3.1
1
8
true
false
true
false
false
0.79157
0.379939
37.993856
0.493058
27.18727
0.066465
6.646526
0.291107
5.480984
0.366031
3.053906
0.353059
28.117612
false
2024-10-07
2024-10-07
1
netcat420/MFANN-llama3.1-abliterated-SLERP-v3 (Merge)
netcat420_MFANN-llama3.1-abliterated-SLERP-v3.1_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-llama3.1-abliterated-SLERP-v3.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-llama3.1-abliterated-SLERP-v3.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-llama3.1-abliterated-SLERP-v3.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-llama3.1-abliterated-SLERP-v3.1
6d306eb66466cb8e1456a36f3895890a117e91e4
19.029174
llama3.1
1
8
true
false
true
false
false
1.749808
0.420155
42.015519
0.492069
27.026316
0.073263
7.326284
0.292785
5.704698
0.368635
3.846094
0.354305
28.256132
false
2024-10-08
2024-10-17
1
netcat420/MFANN-llama3.1-abliterated-SLERP-v3.1 (Merge)
netcat420_MFANN-llama3.1-abliterated-v2_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-llama3.1-abliterated-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-llama3.1-abliterated-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-llama3.1-abliterated-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-llama3.1-abliterated-v2
3d0a5d3634726e1a63ac84bee561b346960ca1d7
19.745935
0
8
false
true
true
false
false
0.824524
0.442911
44.291147
0.494083
27.353618
0.072508
7.250755
0.292785
5.704698
0.384542
6.201042
0.349069
27.67435
false
2024-10-04
2024-10-07
1
netcat420/MFANN-llama3.1-abliterated-v2 (Merge)
netcat420_MFANN-phigments-slerp-V2_float16
float16
🤝 base merges and moerges
🤝
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN-phigments-slerp-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN-phigments-slerp-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN-phigments-slerp-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN-phigments-slerp-V2
94596dab22ab78f0d2ec00b8e33c8fa98581ad0f
16.004358
0
2
false
true
true
false
false
0.40812
0.32316
32.316033
0.482728
26.927492
0.015861
1.586103
0.272651
3.020134
0.403729
13.099479
0.271692
19.076906
false
2024-10-23
2024-10-26
1
netcat420/MFANN-phigments-slerp-V2 (Merge)
netcat420_MFANN3bv0.15_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv0.15" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv0.15</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.15-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv0.15
20dbdfb9154cc2f6d43651fc8cea63a120220dc7
11.811262
mit
0
2
true
true
true
false
false
0.467138
0.201211
20.121057
0.453931
23.469347
0.019637
1.963746
0.251678
0.223714
0.395792
8.773958
0.246842
16.315751
false
2024-07-04
2024-07-05
0
netcat420/MFANN3bv0.15
netcat420_MFANN3bv0.18_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv0.18" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv0.18</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.18-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv0.18
3e792e3413217b63ea9caa0e8b8595fbeb236a69
12.574348
mit
0
2
true
true
true
false
false
0.482514
0.220645
22.064456
0.451437
23.073404
0.020393
2.039275
0.25755
1.006711
0.402365
10.595573
0.25
16.666667
false
2024-07-25
2024-07-25
0
netcat420/MFANN3bv0.18
netcat420_MFANN3bv0.19_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv0.19" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv0.19</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.19-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv0.19
073d42274686f5cb6ef6ff9f6ade24eab198e1f2
12.503372
0
2
false
true
true
false
false
0.486488
0.225815
22.581528
0.45158
22.907055
0.017372
1.73716
0.25755
1.006711
0.402396
9.899479
0.251995
16.888298
false
2024-08-04
2024-08-08
0
netcat420/MFANN3bv0.19
netcat420_MFANN3bv0.20_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv0.20" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv0.20</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.20-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv0.20
ac8ba24559cbdb5704d77b602580d911c265fdee
12.383183
mit
0
2
true
true
true
false
false
0.509418
0.219346
21.934578
0.449337
22.790711
0.015106
1.510574
0.259228
1.230425
0.407729
10.166146
0.25
16.666667
false
2024-08-29
2024-08-29
2
netcat420/MFANN3bv0.19.12 (Merge)
netcat420_MFANN3bv0.21_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv0.21" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv0.21</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.21-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv0.21
8e78416dce916b69247fa03bd587369d0dade5ed
11.703842
mit
0
2
true
true
true
false
false
1.446786
0.191519
19.15185
0.447002
22.583426
0.01284
1.283988
0.264262
1.901566
0.375948
9.826823
0.239279
15.475399
false
2024-09-23
2024-09-24
1
netcat420/MFANN3bv0.21 (Merge)
netcat420_MFANN3bv0.22_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv0.22" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv0.22</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.22-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv0.22
20c26f267ebe62ef1da037a5b840a304cb8d740b
11.916627
mit
0
2
true
true
true
false
false
0.395668
0.197938
19.793814
0.448511
22.491537
0.006042
0.60423
0.261745
1.565996
0.352135
10.183594
0.251745
16.860594
false
2024-10-25
2024-10-26
0
netcat420/MFANN3bv0.22
netcat420_MFANN3bv0.23_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv0.23" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv0.23</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.23-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv0.23
93eacd43dcb307016e22a4d9f9f8deef49cd9111
11.183676
0
2
false
true
true
false
false
0.388184
0.204808
20.480769
0.449542
22.696341
0.009063
0.906344
0.251678
0.223714
0.34274
7.042448
0.241772
15.752438
false
2024-11-06
2024-11-07
0
netcat420/MFANN3bv0.23
netcat420_MFANN3bv0.24_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANN3bv0.24" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANN3bv0.24</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANN3bv0.24-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANN3bv0.24
55813c2586488a2e7be5883f7e695396f5629d3e
11.520757
mit
0
2
true
true
true
false
false
0.384537
0.220045
22.004504
0.440735
21.545385
0.010574
1.057402
0.258389
1.118568
0.352073
8.375781
0.235206
15.022902
false
2024-11-21
2024-11-22
0
netcat420/MFANN3bv0.24
netcat420_MFANNv0.19_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANNv0.19" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANNv0.19</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.19-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANNv0.19
af26a25549b7ad291766c479bebda58f15fbff42
14.187656
llama3.1
0
8
true
true
true
false
false
0.957079
0.305674
30.56745
0.473138
24.924106
0.029456
2.945619
0.307047
7.606264
0.352698
2.720573
0.247257
16.361924
false
2024-07-27
2024-07-27
0
netcat420/MFANNv0.19
netcat420_MFANNv0.20_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANNv0.20" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANNv0.20</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.20-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANNv0.20
e612e57c933870b8990ac2bc217c434f3ffc84bd
16.524597
llama3.1
0
8
true
true
true
false
false
0.867884
0.347865
34.786478
0.457443
22.401697
0.053625
5.362538
0.290268
5.369128
0.387396
6.757813
0.320229
24.469932
false
2024-08-07
2024-08-08
0
netcat420/MFANNv0.20
netcat420_MFANNv0.21_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANNv0.21" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANNv0.21</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.21-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANNv0.21
8c71d0eb419f54c489fa1ddf55d4bd18a1fb27d8
15.898755
llama3
0
8
true
true
true
false
false
0.879411
0.32331
32.330993
0.457637
22.058432
0.058157
5.81571
0.278523
3.803132
0.399333
8.816667
0.303108
22.567598
false
2024-08-31
2024-09-02
2
netcat420/MFANNv0.20.12 (Merge)
netcat420_MFANNv0.22.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANNv0.22.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANNv0.22.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.22.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANNv0.22.1
98108142480b802a3e1bb27e3d47075a4ea3a4f1
15.71773
llama3.1
0
8
true
true
true
false
false
0.840529
0.308947
30.894693
0.466089
23.602793
0.056647
5.664653
0.276007
3.467562
0.375302
4.646094
0.334275
26.030585
false
2024-10-04
2024-10-05
1
netcat420/MFANNv0.22.1 (Merge)
netcat420_MFANNv0.23_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANNv0.23" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANNv0.23</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.23-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANNv0.23
cf7fb44a8c858602d7fcba58adcbd514c7e08ba4
16.652656
llama3.1
1
8
true
true
true
false
false
0.81038
0.312744
31.274352
0.48981
27.042345
0.049849
4.984894
0.284396
4.58613
0.376792
5.498958
0.338763
26.529255
false
2024-10-27
2024-10-29
1
netcat420/MFANNv0.23 (Merge)
netcat420_MFANNv0.24_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANNv0.24" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANNv0.24</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.24-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANNv0.24
57ce382fede1adce68bdb95a386255fa363077d7
16.398374
llama3.1
1
8
true
true
true
false
false
0.743903
0.316241
31.624091
0.479027
25.351725
0.061178
6.117825
0.284396
4.58613
0.375396
4.624479
0.334774
26.085993
false
2024-11-07
2024-11-09
1
netcat420/MFANNv0.24 (Merge)
netcat420_MFANNv0.25_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/netcat420/MFANNv0.25" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">netcat420/MFANNv0.25</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/netcat420__MFANNv0.25-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
netcat420/MFANNv0.25
cff1e1772fc7f4f3e68ad53d8589df3f52556e38
16.559201
llama3.1
2
8
true
true
true
false
false
0.712288
0.346666
34.666574
0.479407
25.409784
0.055891
5.589124
0.280201
4.026846
0.368792
3.632292
0.334275
26.030585
false
2024-11-25
2024-11-26
1
netcat420/MFANNv0.25 (Merge)
newsbang_Homer-7B-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-7B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
newsbang/Homer-7B-v0.1
c953cc313ef5e5029efd057c0d3809a3b8d1cf9f
31.333866
apache-2.0
0
7
true
true
true
false
false
0.690734
0.610872
61.087249
0.560139
37.309227
0.282477
28.247734
0.324664
9.955257
0.435698
12.795573
0.447473
38.608156
false
2024-11-14
2024-11-14
0
newsbang/Homer-7B-v0.1
newsbang_Homer-7B-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-7B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-7B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-7B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
newsbang/Homer-7B-v0.2
50b4ca941657ed362f5660aed8274a59a6b3fe2d
33.114663
0
7
false
true
true
false
true
0.674598
0.749383
74.938275
0.551733
36.403486
0.253776
25.377644
0.332215
10.961969
0.42975
13.11875
0.440991
37.887855
false
2024-11-15
0
Removed
newsbang_Homer-v0.3-Qwen2.5-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-v0.3-Qwen2.5-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-v0.3-Qwen2.5-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-v0.3-Qwen2.5-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
newsbang/Homer-v0.3-Qwen2.5-7B
4fa38c6c590d8e9bbf2075b2fa9cc37e75cde5d4
31.088203
0
7
false
true
true
false
true
0.585603
0.515401
51.540136
0.548059
36.413677
0.295317
29.531722
0.333893
11.185682
0.474365
19.46224
0.445562
38.395759
false
2024-11-18
0
Removed
newsbang_Homer-v0.4-Qwen2.5-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-v0.4-Qwen2.5-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-v0.4-Qwen2.5-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-v0.4-Qwen2.5-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
newsbang/Homer-v0.4-Qwen2.5-7B
e5b73b06e63de7f77845463f8a11c93e82befd15
33.918837
0
7
false
true
true
false
true
0.63972
0.799941
79.994082
0.55331
36.603703
0.276435
27.643505
0.315436
8.724832
0.431083
13.185417
0.436253
37.36148
false
2024-11-18
0
Removed
newsbang_Homer-v0.5-Qwen2.5-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/newsbang/Homer-v0.5-Qwen2.5-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">newsbang/Homer-v0.5-Qwen2.5-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/newsbang__Homer-v0.5-Qwen2.5-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> has been flagged! <a target="_blank" href="https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard/discussions/1022" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">See discussion #1022</a>
newsbang/Homer-v0.5-Qwen2.5-7B
9dc7090b2226f9a2217f593518f734e3246001f9
34.600199
0
7
false
false
true
true
true
0.672584
0.788076
78.807564
0.554018
36.678089
0.362538
36.253776
0.302852
7.04698
0.419302
11.379427
0.436918
37.435358
false
2024-11-20
0
Removed
nguyentd_FinancialAdvice-Qwen2.5-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nguyentd/FinancialAdvice-Qwen2.5-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nguyentd/FinancialAdvice-Qwen2.5-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nguyentd__FinancialAdvice-Qwen2.5-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nguyentd/FinancialAdvice-Qwen2.5-7B
5c3421d5a980d0b2365b0d704ead30c9e534a019
20.935465
apache-2.0
1
7
true
true
true
false
false
0.654445
0.449606
44.960593
0.473093
25.630436
0.093656
9.365559
0.294463
5.928412
0.40249
9.144531
0.375249
30.583259
false
2024-10-21
2024-11-18
1
nguyentd/FinancialAdvice-Qwen2.5-7B (Merge)
nhyha_N3N_Delirium-v1_1030_0227_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/nhyha/N3N_Delirium-v1_1030_0227" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nhyha/N3N_Delirium-v1_1030_0227</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nhyha__N3N_Delirium-v1_1030_0227-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nhyha/N3N_Delirium-v1_1030_0227
41eabc719bd611e2bd0094b0842df84916a57a46
31.14332
apache-2.0
0
10
true
true
true
false
true
2.131856
0.802289
80.228904
0.589069
40.77504
0.093656
9.365559
0.337248
11.63311
0.409812
9.859896
0.414977
34.997414
false
2024-10-30
2024-11-04
2
unsloth/gemma-2-9b-it
nhyha_N3N_Llama-3.1-8B-Instruct_1028_0216_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/nhyha/N3N_Llama-3.1-8B-Instruct_1028_0216" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nhyha/N3N_Llama-3.1-8B-Instruct_1028_0216</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nhyha__N3N_Llama-3.1-8B-Instruct_1028_0216-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nhyha/N3N_Llama-3.1-8B-Instruct_1028_0216
d0715a631898112c9c3b729d0334588a2ff636d8
23.403604
apache-2.0
0
8
true
true
true
false
false
0.749506
0.478083
47.80826
0.505374
28.980464
0.167674
16.767372
0.306208
7.494407
0.405031
10.06224
0.36378
29.30888
false
2024-10-28
2024-11-04
2
meta-llama/Meta-Llama-3.1-8B
nhyha_N3N_gemma-2-9b-it_20241029_1532_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/nhyha/N3N_gemma-2-9b-it_20241029_1532" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nhyha/N3N_gemma-2-9b-it_20241029_1532</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nhyha__N3N_gemma-2-9b-it_20241029_1532-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nhyha/N3N_gemma-2-9b-it_20241029_1532
6cfc55a717961ef206978b577bd74df97efe1455
32.022249
apache-2.0
2
10
true
true
true
false
false
2.394044
0.675194
67.519404
0.586312
40.986668
0.204683
20.468278
0.340604
12.080537
0.459354
16.385938
0.412234
34.692671
false
2024-10-29
2024-11-04
1
unsloth/gemma-2-9b-it
nhyha_N3N_gemma-2-9b-it_20241110_2026_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/nhyha/N3N_gemma-2-9b-it_20241110_2026" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nhyha/N3N_gemma-2-9b-it_20241110_2026</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nhyha__N3N_gemma-2-9b-it_20241110_2026-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nhyha/N3N_gemma-2-9b-it_20241110_2026
2d4c24278ed9d8b42a4035da16a5aea745797441
28.741941
apache-2.0
0
10
true
true
true
false
true
2.54055
0.628283
62.828296
0.586715
40.944106
0.138218
13.821752
0.336409
11.521253
0.407302
9.779427
0.402011
33.556811
false
2024-11-12
2024-11-12
1
unsloth/gemma-2-9b-it
nhyha_merge_Qwen2.5-7B-Instruct_20241023_0314_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nhyha/merge_Qwen2.5-7B-Instruct_20241023_0314" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nhyha/merge_Qwen2.5-7B-Instruct_20241023_0314</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nhyha__merge_Qwen2.5-7B-Instruct_20241023_0314-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nhyha/merge_Qwen2.5-7B-Instruct_20241023_0314
4d93f65c1f870556f05c77a1ef4f26819d49daf7
29.208865
apache-2.0
0
7
true
true
true
false
false
0.695701
0.569457
56.945682
0.555853
36.365185
0.219789
21.978852
0.321309
9.50783
0.425062
11.099479
0.454205
39.356161
false
2024-10-23
2024-11-04
3
Qwen/Qwen2.5-7B
nidum_Nidum-Limitless-Gemma-2B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/nidum/Nidum-Limitless-Gemma-2B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nidum/Nidum-Limitless-Gemma-2B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nidum__Nidum-Limitless-Gemma-2B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nidum/Nidum-Limitless-Gemma-2B
e209e3513d2b34c0e6c433ede26e17604c25cb1a
5.939422
apache-2.0
4
2
true
true
true
false
true
0.396814
0.242351
24.235141
0.30788
3.45106
0
0
0.264262
1.901566
0.374031
4.120573
0.117354
1.928191
false
2024-08-02
2024-08-07
0
nidum/Nidum-Limitless-Gemma-2B
nisten_franqwenstein-35b_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nisten/franqwenstein-35b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nisten/franqwenstein-35b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nisten__franqwenstein-35b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nisten/franqwenstein-35b
7180aa73e82945a1d2ae0eb304508e21d57e4c27
35.941926
mit
7
34
true
true
true
false
false
5.01777
0.379863
37.986321
0.664658
52.227468
0.30287
30.287009
0.403523
20.469799
0.494021
22.119271
0.573055
52.561687
false
2024-10-03
2024-10-03
1
nisten/franqwenstein-35b (Merge)
nisten_franqwenstein-35b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/nisten/franqwenstein-35b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nisten/franqwenstein-35b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nisten__franqwenstein-35b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nisten/franqwenstein-35b
901351a987d664a1cd7f483115a167d3ae5694ec
34.451117
mit
7
34
true
true
true
false
true
6.328604
0.391354
39.135383
0.659113
51.680277
0.304381
30.438066
0.35906
14.541387
0.468104
19.679688
0.561087
51.2319
false
2024-10-03
2024-10-03
1
nisten/franqwenstein-35b (Merge)
nlpguy_Mistral-NeMo-Minitron-Upscale-v1_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/Mistral-NeMo-Minitron-Upscale-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/Mistral-NeMo-Minitron-Upscale-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__Mistral-NeMo-Minitron-Upscale-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/Mistral-NeMo-Minitron-Upscale-v1
9e6d747cbb81e1f25915a0f42802cbeb85b61c3e
10.87693
other
0
12
true
false
true
false
false
2.934306
0.16484
16.48404
0.4468
22.06891
0.007553
0.755287
0.280201
4.026846
0.380354
4.844271
0.25374
17.082225
false
2024-09-29
2024-09-29
1
nlpguy/Mistral-NeMo-Minitron-Upscale-v1 (Merge)
nlpguy_Mistral-NeMo-Minitron-Upscale-v2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/Mistral-NeMo-Minitron-Upscale-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/Mistral-NeMo-Minitron-Upscale-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__Mistral-NeMo-Minitron-Upscale-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/Mistral-NeMo-Minitron-Upscale-v2
4ac077e496705687fdcbe51f3b915be42e91bf79
8.232151
other
0
12
true
false
true
false
false
2.924619
0.157272
15.727159
0.394967
14.382673
0.006042
0.60423
0.27349
3.131991
0.379083
5.252083
0.192653
10.29477
false
2024-09-29
2024-09-29
1
nlpguy/Mistral-NeMo-Minitron-Upscale-v2 (Merge)
nlpguy_Mistral-NeMo-Minitron-Upscale-v3_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/Mistral-NeMo-Minitron-Upscale-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/Mistral-NeMo-Minitron-Upscale-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__Mistral-NeMo-Minitron-Upscale-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/Mistral-NeMo-Minitron-Upscale-v3
6703b09d3d78cc020448ee93c53dc727312bcbaf
5.013437
other
1
12
true
false
true
false
false
6.044669
0.14121
14.120977
0.305245
3.398266
0
0
0.259228
1.230425
0.409844
9.430469
0.117104
1.900488
false
2024-10-04
2024-10-04
1
nlpguy/Mistral-NeMo-Minitron-Upscale-v3 (Merge)
nlpguy_StableProse_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/StableProse" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/StableProse</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__StableProse-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/StableProse
4937dc747684705e4b87df27b47eab5429f3a9c1
16.422495
1
12
false
true
true
false
false
1.794363
0.197239
19.723888
0.511656
30.180203
0.05287
5.287009
0.302852
7.04698
0.406708
8.871875
0.346825
27.425015
false
2024-08-16
2024-08-17
1
nlpguy/StableProse (Merge)
nlpguy_StarFusion-alpha1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nlpguy/StarFusion-alpha1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nlpguy/StarFusion-alpha1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nlpguy__StarFusion-alpha1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nlpguy/StarFusion-alpha1
dccad965a710d7bee001b6387c8307e7c320291e
20.840912
apache-2.0
0
7
true
false
true
false
true
1.174406
0.566009
56.60093
0.442869
21.933182
0.072508
7.250755
0.295302
6.040268
0.408104
8.879688
0.319066
24.340647
false
2024-04-13
2024-06-26
1
nlpguy/StarFusion-alpha1 (Merge)
nothingiisreal_MN-12B-Starcannon-v2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nothingiisreal/MN-12B-Starcannon-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nothingiisreal/MN-12B-Starcannon-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nothingiisreal__MN-12B-Starcannon-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nothingiisreal/MN-12B-Starcannon-v2
f2ff756e8c32d9107d4f6a3c18c730e3fe0cae88
18.030393
apache-2.0
5
12
true
false
true
false
true
1.722663
0.392527
39.252738
0.50045
28.424783
0.050604
5.060423
0.278523
3.803132
0.397812
7.993229
0.312832
23.64805
false
2024-08-13
2024-09-03
1
nothingiisreal/MN-12B-Starcannon-v2 (Merge)
nothingiisreal_MN-12B-Starcannon-v3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nothingiisreal/MN-12B-Starcannon-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nothingiisreal/MN-12B-Starcannon-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nothingiisreal__MN-12B-Starcannon-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nothingiisreal/MN-12B-Starcannon-v3
169480b62121c4f070e93a05158545c679712644
18.993414
10
12
false
true
true
false
true
1.745671
0.380738
38.073755
0.517055
30.873002
0.068731
6.873112
0.27349
3.131991
0.404635
9.846094
0.326463
25.16253
false
2024-08-13
2024-09-03
1
nothingiisreal/MN-12B-Starcannon-v3 (Merge)
nvidia_Llama-3.1-Minitron-4B-Depth-Base_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/Llama-3.1-Minitron-4B-Depth-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/Llama-3.1-Minitron-4B-Depth-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Llama-3.1-Minitron-4B-Depth-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/Llama-3.1-Minitron-4B-Depth-Base
40d82bc951b4f39e9c9e11176334250c30975098
11.53217
other
19
4
true
true
true
false
false
0.467691
0.160694
16.069363
0.41707
19.44411
0.012085
1.208459
0.263423
1.789709
0.401063
10.699479
0.279837
19.9819
true
2024-08-13
2024-09-25
0
nvidia/Llama-3.1-Minitron-4B-Depth-Base
nvidia_Llama-3.1-Nemotron-70B-Instruct-HF_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/Llama-3.1-Nemotron-70B-Instruct-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/Llama-3.1-Nemotron-70B-Instruct-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Llama-3.1-Nemotron-70B-Instruct-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/Llama-3.1-Nemotron-70B-Instruct-HF
250db5cf2323e04a6d2025a2ca2b94a95c439e88
34.578372
llama3.1
1731
70
true
true
true
false
true
13.628748
0.738067
73.806722
0.6316
47.10953
0.287009
28.700906
0.258389
1.118568
0.43276
13.195052
0.491855
43.53945
true
2024-10-12
2024-10-16
2
meta-llama/Meta-Llama-3.1-70B
nvidia_Minitron-4B-Base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
NemotronForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/Minitron-4B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/Minitron-4B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Minitron-4B-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/Minitron-4B-Base
d6321f64412982046a32d761701167e752fedc02
11.939973
other
127
4
true
true
true
false
false
1.189267
0.221794
22.179373
0.408388
17.215601
0.017372
1.73716
0.269295
2.572707
0.413375
9.938542
0.261968
17.996454
true
2024-07-19
2024-09-25
0
nvidia/Minitron-4B-Base
nvidia_Minitron-8B-Base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
NemotronForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/Minitron-8B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/Minitron-8B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Minitron-8B-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/Minitron-8B-Base
70fa5997afc42807f41eebd5d481f040556fdf97
14.178726
other
63
7
true
true
true
false
false
1.412521
0.242427
24.242676
0.439506
22.040793
0.023414
2.34139
0.27349
3.131991
0.402552
9.085677
0.318068
24.229832
true
2024-07-19
2024-09-25
0
nvidia/Minitron-8B-Base
nvidia_Mistral-NeMo-Minitron-8B-Base_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/Mistral-NeMo-Minitron-8B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/Mistral-NeMo-Minitron-8B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Mistral-NeMo-Minitron-8B-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/Mistral-NeMo-Minitron-8B-Base
cc94637b669b62c4829b1e0c3b9074fecd883b74
17.660162
other
160
7
true
true
true
false
false
3.404028
0.194566
19.456597
0.52191
31.822015
0.046073
4.607251
0.325503
10.067114
0.409156
8.944531
0.379571
31.06346
true
2024-08-19
2024-08-22
0
nvidia/Mistral-NeMo-Minitron-8B-Base
nvidia_Mistral-NeMo-Minitron-8B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/Mistral-NeMo-Minitron-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/Mistral-NeMo-Minitron-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Mistral-NeMo-Minitron-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/Mistral-NeMo-Minitron-8B-Instruct
27964e305f862f9947f577332a943d7013abc30f
21.722143
other
65
8
true
true
true
false
true
1.993898
0.500389
50.038897
0.532092
34.126491
0.005287
0.528701
0.287752
5.033557
0.388573
7.371615
0.399102
33.233599
true
2024-10-02
2024-10-04
1
nvidia/Mistral-NeMo-Minitron-8B-Instruct (Merge)
nvidia_Nemotron-Mini-4B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
NemotronForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/Nemotron-Mini-4B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/Nemotron-Mini-4B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__Nemotron-Mini-4B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/Nemotron-Mini-4B-Instruct
6a417790c444fd65a3da6a5c8821de6afc9654a6
17.935515
other
129
4
true
true
true
false
true
1.117314
0.666876
66.687611
0.386484
14.203825
0
0
0.280201
4.026846
0.376729
4.624479
0.262633
18.070331
true
2024-09-10
2024-09-25
1
nvidia/Minitron-4B-Base
nvidia_OpenMath2-Llama3.1-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/nvidia/OpenMath2-Llama3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nvidia/OpenMath2-Llama3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nvidia__OpenMath2-Llama3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nvidia/OpenMath2-Llama3.1-8B
4187cd28e77e76367261992b3274c77ffcbfd3d5
8.987818
llama3.1
24
8
true
true
true
false
false
0.701198
0.233059
23.305939
0.409552
16.29437
0.041541
4.154079
0.265101
2.013423
0.343552
2.010677
0.155336
6.148419
true
2024-09-30
2024-11-23
1
nvidia/OpenMath2-Llama3.1-8B (Merge)
nxmwxm_Beast-Soul-new_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/nxmwxm/Beast-Soul-new" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nxmwxm/Beast-Soul-new</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nxmwxm__Beast-Soul-new-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
nxmwxm/Beast-Soul-new
dd2ae8a96b7d088eb94a1cfa6ff84c3489e8c010
21.830262
0
7
false
true
true
false
false
0.657023
0.486875
48.687483
0.522714
33.072759
0.074773
7.477341
0.281879
4.250559
0.445927
14.140885
0.310173
23.352541
false
2024-08-07
2024-08-07
1
nxmwxm/Beast-Soul-new (Merge)
occiglot_occiglot-7b-es-en-instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/occiglot/occiglot-7b-es-en-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">occiglot/occiglot-7b-es-en-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/occiglot__occiglot-7b-es-en-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
occiglot/occiglot-7b-es-en-instruct
5858f6ee118eef70896f1870fd61052348ff571e
12.394963
apache-2.0
2
7
true
true
true
false
true
0.688738
0.348514
34.851416
0.411097
17.23541
0.020393
2.039275
0.259228
1.230425
0.37375
4.452083
0.231051
14.56117
false
2024-03-05
2024-09-02
0
occiglot/occiglot-7b-es-en-instruct
olabs-ai_reflection_model_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/olabs-ai/reflection_model" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">olabs-ai/reflection_model</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/olabs-ai__reflection_model-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
olabs-ai/reflection_model
a8b0fc584b10e0110e04f9d21c7f10d24391c1d5
14.016225
apache-2.0
1
9
true
true
true
false
false
2.407543
0.159869
15.986915
0.471251
25.206882
0.047583
4.758308
0.300336
6.711409
0.350833
5.754167
0.331117
25.679669
false
2024-09-08
2024-09-08
0
olabs-ai/reflection_model
oobabooga_CodeBooga-34B-v0.1_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/oobabooga/CodeBooga-34B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">oobabooga/CodeBooga-34B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/oobabooga__CodeBooga-34B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
oobabooga/CodeBooga-34B-v0.1
8a4e1e16ac46333cbd0c17d733d3d70a956071a6
15.095241
llama2
144
33
true
true
true
false
true
2.087004
0.525018
52.501806
0.342744
8.562466
0.005287
0.528701
0.256711
0.894855
0.431021
12.977604
0.235954
15.106014
false
2023-10-19
2024-07-29
0
oobabooga/CodeBooga-34B-v0.1
oopere_pruned20-llama-1b_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/oopere/pruned20-llama-1b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">oopere/pruned20-llama-1b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/oopere__pruned20-llama-1b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
oopere/pruned20-llama-1b
3351c9a062055ce6c16dd2c9f0c229fb5dd7396b
4.863638
llama3.2
0
1
true
true
true
false
false
0.401478
0.199362
19.936214
0.303136
3.185394
0.003021
0.302115
0.25
0
0.363146
4.393229
0.112284
1.364879
false
2024-11-16
2024-11-16
1
oopere/pruned20-llama-1b (Merge)
oopere_pruned40-llama-1b_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/oopere/pruned40-llama-1b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">oopere/pruned40-llama-1b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/oopere__pruned40-llama-1b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
oopere/pruned40-llama-1b
3de470d9c61cb57cea821e93b43fb250aa14b975
6.495064
llama3.2
0
0
true
true
true
false
false
0.376621
0.228438
22.843832
0.296916
2.655309
0.000755
0.075529
0.243289
0
0.428667
12.483333
0.108211
0.912382
false
2024-11-16
2024-11-26
1
oopere/pruned40-llama-1b (Merge)
oopere_pruned60-llama-1b_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/oopere/pruned60-llama-1b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">oopere/pruned60-llama-1b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/oopere__pruned60-llama-1b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
oopere/pruned60-llama-1b
86b157256928b50ee07cc3cf5b3884b70062f2fe
5.429802
llama3.2
0
0
true
true
true
false
false
0.382488
0.18285
18.285039
0.301619
2.942526
0
0
0.249161
0
0.408792
9.432292
0.117271
1.918957
false
2024-11-16
2024-11-25
1
oopere/pruned60-llama-1b (Merge)
openai-community_gpt2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/openai-community/gpt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openai-community/gpt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openai-community__gpt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
openai-community/gpt2
607a30d783dfa663caf39e06633721c8d4cfcd7e
6.510807
mit
2,382
0
true
true
true
false
false
0.085941
0.179253
17.925327
0.303571
2.674981
0.002266
0.226586
0.258389
1.118568
0.447052
15.348177
0.115941
1.771203
true
2022-03-02
2024-06-12
0
openai-community/gpt2
openai-community_gpt2_float16
float16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/openai-community/gpt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openai-community/gpt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openai-community__gpt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
openai-community/gpt2
607a30d783dfa663caf39e06633721c8d4cfcd7e
6.296471
mit
2,382
0
true
true
true
false
false
0.117387
0.177954
17.795449
0.301658
2.815911
0.003021
0.302115
0.258389
1.118568
0.439021
13.910938
0.116523
1.835845
true
2022-03-02
2024-08-12
0
openai-community/gpt2
openai-community_gpt2-large_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/openai-community/gpt2-large" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openai-community/gpt2-large</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openai-community__gpt2-large-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
openai-community/gpt2-large
32b71b12589c2f8d625668d2335a01cac3249519
5.47959
mit
275
0
true
true
true
false
false
0.180462
0.204782
20.47822
0.306884
3.253791
0.006798
0.679758
0.259228
1.230425
0.378865
5.658073
0.114195
1.577275
true
2022-03-02
2024-06-12
0
openai-community/gpt2-large
openai-community_gpt2-medium_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/openai-community/gpt2-medium" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openai-community/gpt2-medium</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openai-community__gpt2-medium-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
openai-community/gpt2-medium
6dcaa7a952f72f9298047fd5137cd6e4f05f41da
5.826812
mit
157
0
true
true
true
false
false
0.121062
0.220844
22.084403
0.305028
2.719972
0.003021
0.302115
0.262584
1.677852
0.388448
6.15599
0.118185
2.020538
true
2022-03-02
2024-06-12
0
openai-community/gpt2-medium
openai-community_gpt2-xl_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/openai-community/gpt2-xl" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openai-community/gpt2-xl</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openai-community__gpt2-xl-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
openai-community/gpt2-xl
15ea56dee5df4983c59b2538573817e1667135e2
4.980188
mit
312
1
true
true
true
false
false
0.215314
0.203858
20.385799
0.300858
2.580961
0.003021
0.302115
0.258389
1.118568
0.370958
4.036458
0.113115
1.457225
true
2022-03-02
2024-06-12
0
openai-community/gpt2-xl
openbmb_MiniCPM-S-1B-sft-llama-format_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/openbmb/MiniCPM-S-1B-sft-llama-format" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openbmb/MiniCPM-S-1B-sft-llama-format</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openbmb__MiniCPM-S-1B-sft-llama-format-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
openbmb/MiniCPM-S-1B-sft-llama-format
7de07f8895c168a7ee01f624f50c44f6966c9735
8.870185
apache-2.0
4
1
true
true
true
false
true
0.540037
0.332877
33.287677
0.304931
3.898455
0.023414
2.34139
0.270973
2.796421
0.331677
1.359635
0.185838
9.53753
false
2024-06-14
2024-11-19
0
openbmb/MiniCPM-S-1B-sft-llama-format
openchat_openchat-3.5-0106_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/openchat/openchat-3.5-0106" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openchat/openchat-3.5-0106</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openchat__openchat-3.5-0106-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
openchat/openchat-3.5-0106
ff058fda49726ecf4ea53dc1635f917cdb8ba36b
22.658683
apache-2.0
347
7
true
true
true
false
true
2.354959
0.595135
59.513535
0.461698
24.038711
0.074773
7.477341
0.307886
7.718121
0.425437
11.746354
0.329122
25.458038
true
2024-01-07
2024-06-27
1
mistralai/Mistral-7B-v0.1
openchat_openchat-3.5-1210_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/openchat/openchat-3.5-1210" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openchat/openchat-3.5-1210</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openchat__openchat-3.5-1210-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
openchat/openchat-3.5-1210
801f5459b7577241500785f11c2b026912badd6e
22.690085
apache-2.0
276
7
true
true
true
false
true
0.516451
0.603678
60.367824
0.453536
23.236297
0.076284
7.628399
0.301174
6.823266
0.441438
14.279688
0.314245
23.805038
true
2023-12-12
2024-06-12
1
mistralai/Mistral-7B-v0.1
openchat_openchat-3.6-8b-20240522_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/openchat/openchat-3.6-8b-20240522" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openchat/openchat-3.6-8b-20240522</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openchat__openchat-3.6-8b-20240522-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
openchat/openchat-3.6-8b-20240522
2264eb98558978f708e88ae52afb78e43b832801
22.830377
llama3
151
8
true
true
true
false
true
3.267332
0.534336
53.433556
0.533841
33.232937
0.083082
8.308157
0.317953
9.060403
0.399854
8.181771
0.322889
24.76544
true
2024-05-07
2024-06-26
1
meta-llama/Meta-Llama-3-8B
openchat_openchat_3.5_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/openchat/openchat_3.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openchat/openchat_3.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openchat__openchat_3.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
openchat/openchat_3.5
0fc98e324280bc4bf5d2c30ecf7b97b84fb8a19b
21.648415
apache-2.0
1,117
7
true
true
true
false
true
0.501211
0.593112
59.311183
0.442632
21.582167
0.073263
7.326284
0.298658
6.487696
0.422865
11.258073
0.315326
23.925089
true
2023-10-30
2024-06-12
0
openchat/openchat_3.5
openchat_openchat_v3.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/openchat/openchat_v3.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openchat/openchat_v3.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openchat__openchat_v3.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
openchat/openchat_v3.2
acc7ce92558681e749678648189812f15c1465fe
13.845734
llama2
42
13
true
true
true
false
false
5.302455
0.298056
29.805583
0.433056
20.323003
0.013595
1.359517
0.270134
2.684564
0.433625
13.103125
0.242188
15.798611
true
2023-07-30
2024-06-12
0
openchat/openchat_v3.2
openchat_openchat_v3.2_super_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/openchat/openchat_v3.2_super" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openchat/openchat_v3.2_super</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openchat__openchat_v3.2_super-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
openchat/openchat_v3.2_super
9479cc37d43234a57a33628637d1aca0293d745a
12.848046
llama2
36
13
true
true
true
false
false
5.027694
0.286191
28.619064
0.422121
19.15354
0.016616
1.661631
0.264262
1.901566
0.416135
9.916927
0.24252
15.83555
true
2023-09-04
2024-06-12
0
openchat/openchat_v3.2_super
orai-nlp_Llama-eus-8B_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/orai-nlp/Llama-eus-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">orai-nlp/Llama-eus-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/orai-nlp__Llama-eus-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
orai-nlp/Llama-eus-8B
75b5645d222047b517a7a9190922ea1b5382c71f
13.90599
5
8
false
true
true
false
false
0.869258
0.216123
21.612322
0.441825
20.961371
0.044562
4.456193
0.28943
5.257271
0.391885
8.285677
0.305768
22.863106
false
2024-09-04
2024-09-30
1
meta-llama/Meta-Llama-3.1-8B
paloalma_ECE-TW3-JRGL-V1_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/paloalma/ECE-TW3-JRGL-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">paloalma/ECE-TW3-JRGL-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/paloalma__ECE-TW3-JRGL-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
paloalma/ECE-TW3-JRGL-V1
2f08c7ab9db03b1b9f455c7beee6a41e99aa910e
30.223413
apache-2.0
1
68
true
false
true
false
false
6.191694
0.553495
55.349473
0.628367
46.697139
0.130665
13.066465
0.347315
12.975391
0.462083
17.460417
0.422124
35.791593
false
2024-04-03
2024-08-04
0
paloalma/ECE-TW3-JRGL-V1
paloalma_ECE-TW3-JRGL-V2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/paloalma/ECE-TW3-JRGL-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">paloalma/ECE-TW3-JRGL-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/paloalma__ECE-TW3-JRGL-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
paloalma/ECE-TW3-JRGL-V2
f2c15045f1a7a7a34540ab18abcee8a566a74ca6
25.679422
apache-2.0
0
72
true
false
true
false
false
12.546249
0.225489
22.548948
0.603099
43.173268
0.178248
17.824773
0.331376
10.850112
0.479323
19.815365
0.458777
39.864066
false
2024-04-04
2024-09-19
0
paloalma/ECE-TW3-JRGL-V2
paloalma_ECE-TW3-JRGL-V5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/paloalma/ECE-TW3-JRGL-V5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">paloalma/ECE-TW3-JRGL-V5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/paloalma__ECE-TW3-JRGL-V5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
paloalma/ECE-TW3-JRGL-V5
4061fa10de22945790cad825f7f4dec96d55b204
29.454282
apache-2.0
0
72
true
false
true
false
false
23.031124
0.455251
45.525096
0.602471
43.462514
0.181269
18.126888
0.341443
12.192394
0.462052
16.889844
0.464761
40.52896
false
2024-04-11
2024-08-30
0
paloalma/ECE-TW3-JRGL-V5
paloalma_Le_Triomphant-ECE-TW3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/paloalma/Le_Triomphant-ECE-TW3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">paloalma/Le_Triomphant-ECE-TW3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/paloalma__Le_Triomphant-ECE-TW3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
paloalma/Le_Triomphant-ECE-TW3
f72399253bb3e65c0f55e50461488c098f658a49
31.933354
apache-2.0
3
72
true
false
true
false
false
10.418391
0.540206
54.020554
0.611206
44.963294
0.191088
19.108761
0.348993
13.199105
0.4725
18.495833
0.476313
41.812574
false
2024-04-01
2024-07-25
0
paloalma/Le_Triomphant-ECE-TW3
paloalma_TW3-JRGL-v2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/paloalma/TW3-JRGL-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">paloalma/TW3-JRGL-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/paloalma__TW3-JRGL-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
paloalma/TW3-JRGL-v2
aca3f0ba2bfb90038a9e2cd5b486821d4c181b46
32.399598
apache-2.0
0
72
true
false
true
false
false
20.896294
0.531613
53.161279
0.613753
45.61111
0.175227
17.522659
0.35906
14.541387
0.485833
20.695833
0.485788
42.865322
false
2024-04-01
2024-08-29
0
paloalma/TW3-JRGL-v2