eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | Not_Merged | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Maintainer's Highlight | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
spow12_ChatWaifu_22B_v2.0_preview_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/spow12/ChatWaifu_22B_v2.0_preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">spow12/ChatWaifu_22B_v2.0_preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/spow12__ChatWaifu_22B_v2.0_preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | spow12/ChatWaifu_22B_v2.0_preview | 36af7ec06bc85405e8641986ad45c6d21353b114 | 29.4075 | cc-by-nc-4.0 | 6 | 22 | true | false | true | false | true | 1.494204 | 0.674495 | 67.449478 | 0.617015 | 45.488294 | 0.180514 | 18.05136 | 0.315436 | 8.724832 | 0.368542 | 3.534375 | 0.39877 | 33.196661 | false | 2024-09-23 | 2024-09-24 | 1 | spow12/ChatWaifu_22B_v2.0_preview (Merge) |
spow12_ChatWaifu_v1.4_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/spow12/ChatWaifu_v1.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">spow12/ChatWaifu_v1.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/spow12__ChatWaifu_v1.4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | spow12/ChatWaifu_v1.4 | c5b2b30a8e9fa23722b6e30aa2ca1dab7fe1c2b5 | 25.379443 | cc-by-nc-4.0 | 14 | 12 | true | false | true | false | true | 1.442143 | 0.569057 | 56.905677 | 0.517625 | 31.630554 | 0.086103 | 8.610272 | 0.307047 | 7.606264 | 0.474333 | 20.025 | 0.34749 | 27.498892 | false | 2024-09-03 | 2024-09-05 | 1 | spow12/ChatWaifu_v1.4 (Merge) |
spow12_ChatWaifu_v2.0_22B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/spow12/ChatWaifu_v2.0_22B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">spow12/ChatWaifu_v2.0_22B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/spow12__ChatWaifu_v2.0_22B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | spow12/ChatWaifu_v2.0_22B | 54771319920ed791ba3f0262b036f37a92b880f2 | 28.838098 | cc-by-nc-4.0 | 6 | 22 | true | false | true | false | true | 2.739835 | 0.651089 | 65.108911 | 0.59263 | 42.286228 | 0.185801 | 18.58006 | 0.324664 | 9.955257 | 0.384198 | 5.591406 | 0.383561 | 31.506723 | false | 2024-10-11 | 2024-10-11 | 1 | spow12/ChatWaifu_v2.0_22B (Merge) |
spow12_ChatWaifu_v2.0_22B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/spow12/ChatWaifu_v2.0_22B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">spow12/ChatWaifu_v2.0_22B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/spow12__ChatWaifu_v2.0_22B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | spow12/ChatWaifu_v2.0_22B | a6e7c206d9af77d3f85faf0ce4a711d62815b2ab | 28.868659 | cc-by-nc-4.0 | 6 | 22 | true | false | true | false | true | 1.39586 | 0.651738 | 65.17385 | 0.590805 | 42.019798 | 0.193353 | 19.335347 | 0.323826 | 9.8434 | 0.384198 | 5.591406 | 0.381233 | 31.248153 | false | 2024-10-11 | 2024-10-14 | 1 | spow12/ChatWaifu_v2.0_22B (Merge) |
ssmits_Qwen2.5-95B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/ssmits/Qwen2.5-95B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ssmits/Qwen2.5-95B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ssmits__Qwen2.5-95B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ssmits/Qwen2.5-95B-Instruct | 9c0e7df57a4fcf4d364efd916a0fc0abdd2d20a3 | 37.440125 | other | 3 | 94 | true | true | true | false | true | 19.233495 | 0.843105 | 84.310518 | 0.70378 | 58.530351 | 0.061178 | 6.117825 | 0.364094 | 15.212528 | 0.428385 | 13.614844 | 0.521692 | 46.854684 | false | 2024-09-24 | 2024-09-26 | 1 | ssmits/Qwen2.5-95B-Instruct (Merge) |
stabilityai_StableBeluga2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/stabilityai/StableBeluga2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stabilityai/StableBeluga2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__StableBeluga2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | stabilityai/StableBeluga2 | cb47d3db70ea3ddc2cabdeb358c303b328f65900 | 22.682842 | 884 | 68 | true | true | true | false | false | 6.254674 | 0.378714 | 37.871403 | 0.582413 | 41.263261 | 0.036254 | 3.625378 | 0.316275 | 8.836689 | 0.472969 | 18.654427 | 0.332613 | 25.845892 | true | 2023-07-20 | 2024-06-13 | 0 | stabilityai/StableBeluga2 |
stabilityai_stablelm-2-12b_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | StableLmForCausalLM | <a target="_blank" href="https://huggingface.co/stabilityai/stablelm-2-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stabilityai/stablelm-2-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-2-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | stabilityai/stablelm-2-12b | fead13ddbf4492970666650c3cd6f85f485411ec | 13.935722 | other | 116 | 12 | true | true | true | false | false | 1.473279 | 0.156921 | 15.692141 | 0.450865 | 22.685797 | 0.039275 | 3.927492 | 0.278523 | 3.803132 | 0.447885 | 14.485677 | 0.307181 | 23.020095 | true | 2024-03-21 | 2024-06-12 | 0 | stabilityai/stablelm-2-12b |
stabilityai_stablelm-2-12b-chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | StableLmForCausalLM | <a target="_blank" href="https://huggingface.co/stabilityai/stablelm-2-12b-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stabilityai/stablelm-2-12b-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-2-12b-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | stabilityai/stablelm-2-12b-chat | b6b62cd451b84e848514c00fafa66d9ead9297c5 | 16.249477 | other | 86 | 12 | true | true | true | false | true | 1.088097 | 0.408165 | 40.816478 | 0.467202 | 25.253697 | 0.021903 | 2.190332 | 0.266779 | 2.237136 | 0.391427 | 7.728385 | 0.273438 | 19.270833 | true | 2024-04-04 | 2024-06-12 | 0 | stabilityai/stablelm-2-12b-chat |
stabilityai_stablelm-2-1_6b_float16 | float16 | 🟢 pretrained | 🟢 | Original | StableLmForCausalLM | <a target="_blank" href="https://huggingface.co/stabilityai/stablelm-2-1_6b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stabilityai/stablelm-2-1_6b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-2-1_6b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | stabilityai/stablelm-2-1_6b | 8879812cccd176fbbe9ceb747b815bcc7d6499f8 | 5.216127 | other | 186 | 1 | true | true | true | false | false | 0.549872 | 0.115705 | 11.570522 | 0.338458 | 8.632695 | 0.001511 | 0.151057 | 0.248322 | 0 | 0.388198 | 5.791406 | 0.14636 | 5.151079 | true | 2024-01-18 | 2024-06-12 | 0 | stabilityai/stablelm-2-1_6b |
stabilityai_stablelm-2-1_6b-chat_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | StableLmForCausalLM | <a target="_blank" href="https://huggingface.co/stabilityai/stablelm-2-1_6b-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stabilityai/stablelm-2-1_6b-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-2-1_6b-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | stabilityai/stablelm-2-1_6b-chat | f3fe67057c2789ae1bb1fe42b038da99840d4f13 | 8.640775 | other | 32 | 1 | true | true | true | false | true | 0.495427 | 0.305999 | 30.599919 | 0.339017 | 7.493378 | 0.011329 | 1.132931 | 0.247483 | 0 | 0.357969 | 5.71276 | 0.162151 | 6.905659 | true | 2024-04-08 | 2024-06-12 | 0 | stabilityai/stablelm-2-1_6b-chat |
stabilityai_stablelm-2-zephyr-1_6b_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | StableLmForCausalLM | <a target="_blank" href="https://huggingface.co/stabilityai/stablelm-2-zephyr-1_6b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stabilityai/stablelm-2-zephyr-1_6b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-2-zephyr-1_6b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | stabilityai/stablelm-2-zephyr-1_6b | 2f275b1127d59fc31e4f7c7426d528768ada9ea4 | 9.281934 | other | 181 | 1 | true | true | true | false | true | 0.473089 | 0.327931 | 32.7931 | 0.335161 | 6.70871 | 0.022659 | 2.265861 | 0.243289 | 0 | 0.351146 | 5.993229 | 0.171376 | 7.930703 | true | 2024-01-19 | 2024-06-12 | 0 | stabilityai/stablelm-2-zephyr-1_6b |
stabilityai_stablelm-3b-4e1t_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | StableLmForCausalLM | <a target="_blank" href="https://huggingface.co/stabilityai/stablelm-3b-4e1t" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stabilityai/stablelm-3b-4e1t</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-3b-4e1t-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | stabilityai/stablelm-3b-4e1t | fa4a6a92fca83c3b4223a3c9bf792887090ebfba | 7.263251 | cc-by-sa-4.0 | 309 | 2 | true | true | true | false | false | 0.434265 | 0.22032 | 22.031986 | 0.350421 | 9.01307 | 0.006798 | 0.679758 | 0.237416 | 0 | 0.377781 | 4.422656 | 0.166888 | 7.432033 | true | 2023-09-29 | 2024-08-10 | 0 | stabilityai/stablelm-3b-4e1t |
stabilityai_stablelm-zephyr-3b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | StableLmForCausalLM | <a target="_blank" href="https://huggingface.co/stabilityai/stablelm-zephyr-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">stabilityai/stablelm-zephyr-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-zephyr-3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | stabilityai/stablelm-zephyr-3b | a14f62d95754d96aea2be6e24c0f6966636797b9 | 12.356619 | other | 248 | 2 | true | true | true | false | true | 0.384024 | 0.368323 | 36.832272 | 0.386636 | 14.759119 | 0.042296 | 4.229607 | 0.239094 | 0 | 0.418302 | 9.78776 | 0.176779 | 8.530954 | true | 2023-11-21 | 2024-06-12 | 0 | stabilityai/stablelm-zephyr-3b |
sthenno-com_miscii-14b-1028_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/sthenno-com/miscii-14b-1028" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sthenno-com/miscii-14b-1028</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sthenno-com__miscii-14b-1028-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sthenno-com/miscii-14b-1028 | a60c866621ee35d04e84cf366e972f2466d617b1 | 35.054416 | apache-2.0 | 11 | 14 | true | true | true | false | true | 1.533728 | 0.823671 | 82.367119 | 0.644833 | 49.262668 | 0.063444 | 6.344411 | 0.356544 | 14.205817 | 0.418156 | 12.002865 | 0.515293 | 46.143617 | false | 2024-11-12 | 2024-11-17 | 1 | sthenno-com/miscii-14b-1028 (Merge) |
suayptalha_HomerCreativeAnvita-Mix-Qw7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/suayptalha/HomerCreativeAnvita-Mix-Qw7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/HomerCreativeAnvita-Mix-Qw7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__HomerCreativeAnvita-Mix-Qw7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | suayptalha/HomerCreativeAnvita-Mix-Qw7B | 5be9b48b59652687d3e5b88f9e935b51869756ad | 34.620978 | 3 | 7 | false | true | true | false | true | 0.649881 | 0.780782 | 78.078166 | 0.556465 | 36.984168 | 0.310423 | 31.042296 | 0.314597 | 8.612975 | 0.441594 | 14.732552 | 0.444481 | 38.275709 | false | 2024-11-22 | 2024-11-24 | 1 | suayptalha/HomerCreativeAnvita-Mix-Qw7B (Merge) |
suayptalha_Komodo-Llama-3.2-3B-v2-fp16_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/suayptalha/Komodo-Llama-3.2-3B-v2-fp16" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/Komodo-Llama-3.2-3B-v2-fp16</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__Komodo-Llama-3.2-3B-v2-fp16-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | suayptalha/Komodo-Llama-3.2-3B-v2-fp16 | 1ff4b55d952597429c249ca71dc08b823eba17c0 | 19.587262 | apache-2.0 | 1 | 3 | true | true | true | false | true | 0.598065 | 0.634053 | 63.40532 | 0.4355 | 20.204329 | 0.062689 | 6.268882 | 0.277685 | 3.691275 | 0.340573 | 3.371615 | 0.285239 | 20.582151 | false | 2024-11-19 | 2024-11-19 | 1 | suayptalha/Komodo-Llama-3.2-3B-v2-fp16 (Merge) |
suayptalha_Rombos-2.5-T.E-8.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/suayptalha/Rombos-2.5-T.E-8.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/Rombos-2.5-T.E-8.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__Rombos-2.5-T.E-8.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | suayptalha/Rombos-2.5-T.E-8.1 | c0ee2950b07377e1d0e01fc013a0f200b0306ea2 | 27.335179 | cc-by-nc-sa-4.0 | 3 | 7 | true | false | true | false | true | 0.686016 | 0.692505 | 69.250478 | 0.551464 | 36.499861 | 0.008308 | 0.830816 | 0.311242 | 8.165548 | 0.416635 | 10.979427 | 0.444564 | 38.284944 | false | 2024-11-16 | 2024-11-16 | 1 | suayptalha/Rombos-2.5-T.E-8.1 (Merge) |
sumink_ftgpt_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GPT2LMHeadModel | <a target="_blank" href="https://huggingface.co/sumink/ftgpt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/ftgpt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__ftgpt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sumink/ftgpt | fea7c59fff2443a73a7fd11a78b1d80eb5f0c4e6 | 3.951784 | mit | 0 | 0 | true | true | true | false | false | 0.052818 | 0.07871 | 7.871004 | 0.291909 | 1.931277 | 0 | 0 | 0.264262 | 1.901566 | 0.413844 | 10.097135 | 0.117188 | 1.909722 | false | 2024-11-06 | 2024-11-20 | 0 | sumink/ftgpt |
sunbaby_BrainCog-8B-0.1-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/sunbaby/BrainCog-8B-0.1-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sunbaby/BrainCog-8B-0.1-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sunbaby__BrainCog-8B-0.1-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | sunbaby/BrainCog-8B-0.1-Instruct | 6c03cb7af723c7f7785df9eee5d5838247619bee | 18.040754 | apache-2.0 | 0 | 8 | true | true | true | false | true | 0.834554 | 0.4253 | 42.530043 | 0.461822 | 24.283468 | 0.076284 | 7.628399 | 0.301174 | 6.823266 | 0.365594 | 6.332552 | 0.285821 | 20.646794 | false | 2024-07-31 | 2024-08-27 | 1 | meta-llama/Meta-Llama-3-8B |
swap-uniba_LLaMAntino-3-ANITA-8B-Inst-DPO-ITA_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/swap-uniba/LLaMAntino-3-ANITA-8B-Inst-DPO-ITA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">swap-uniba/LLaMAntino-3-ANITA-8B-Inst-DPO-ITA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/swap-uniba__LLaMAntino-3-ANITA-8B-Inst-DPO-ITA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | swap-uniba/LLaMAntino-3-ANITA-8B-Inst-DPO-ITA | 2b6e46e4c9d341dc8bf8350a167492c880116b66 | 21.752024 | llama3 | 22 | 8 | true | true | true | false | false | 0.816639 | 0.481505 | 48.150463 | 0.49357 | 27.990828 | 0.043807 | 4.380665 | 0.298658 | 6.487696 | 0.43874 | 13.242448 | 0.37234 | 30.260047 | false | 2024-04-29 | 2024-10-25 | 1 | meta-llama/Meta-Llama-3-8B-Instruct |
talha2001_Beast-Soul-new_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/talha2001/Beast-Soul-new" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">talha2001/Beast-Soul-new</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/talha2001__Beast-Soul-new-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | talha2001/Beast-Soul-new | e6cf8caa60264a3005df2ff4b9d967f684519d4b | 21.804866 | 0 | 7 | false | true | true | false | false | 0.642883 | 0.485351 | 48.535109 | 0.522714 | 33.072759 | 0.074773 | 7.477341 | 0.281879 | 4.250559 | 0.445927 | 14.140885 | 0.310173 | 23.352541 | false | 2024-08-07 | 2024-08-07 | 1 | talha2001/Beast-Soul-new (Merge) |
tangledgroup_tangled-llama-pints-1.5b-v0.1-instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/tangledgroup/tangled-llama-pints-1.5b-v0.1-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tangledgroup/tangled-llama-pints-1.5b-v0.1-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tangledgroup__tangled-llama-pints-1.5b-v0.1-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tangledgroup/tangled-llama-pints-1.5b-v0.1-instruct | 3e1429f20007740877c51e44ed63b870a57a2e17 | 4.190264 | apache-2.0 | 0 | 1 | true | true | true | false | true | 0.295434 | 0.150902 | 15.090183 | 0.314344 | 3.842195 | 0.001511 | 0.151057 | 0.239933 | 0 | 0.376135 | 4.85026 | 0.110871 | 1.20789 | false | 2024-08-27 | 2024-08-29 | 1 | pints-ai/1.5-Pints-16K-v0.1 |
tangledgroup_tangled-llama-pints-1.5b-v0.2-instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/tangledgroup/tangled-llama-pints-1.5b-v0.2-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tangledgroup/tangled-llama-pints-1.5b-v0.2-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tangledgroup__tangled-llama-pints-1.5b-v0.2-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tangledgroup/tangled-llama-pints-1.5b-v0.2-instruct | 5c229e26f3ab3d0f0f613ed242f3f0f57c930155 | 4.65774 | apache-2.0 | 0 | 1 | true | true | true | false | true | 0.297811 | 0.172409 | 17.240921 | 0.315835 | 4.080205 | 0.007553 | 0.755287 | 0.241611 | 0 | 0.364292 | 4.569792 | 0.111702 | 1.300236 | false | 2024-09-14 | 2024-09-15 | 1 | pints-ai/1.5-Pints-16K-v0.1 |
tanliboy_lambda-gemma-2-9b-dpo_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/tanliboy/lambda-gemma-2-9b-dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tanliboy/lambda-gemma-2-9b-dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tanliboy__lambda-gemma-2-9b-dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tanliboy/lambda-gemma-2-9b-dpo | b141471308bc41ffe15180a6668c735396c3949b | 21.33689 | gemma | 1 | 9 | true | true | true | false | true | 2.241587 | 0.45008 | 45.008023 | 0.547172 | 35.554545 | 0 | 0 | 0.313758 | 8.501119 | 0.401656 | 7.940365 | 0.379156 | 31.017287 | false | 2024-07-24 | 2024-09-18 | 2 | google/gemma-2-9b |
tanliboy_lambda-gemma-2-9b-dpo_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/tanliboy/lambda-gemma-2-9b-dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tanliboy/lambda-gemma-2-9b-dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tanliboy__lambda-gemma-2-9b-dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tanliboy/lambda-gemma-2-9b-dpo | b141471308bc41ffe15180a6668c735396c3949b | 16.970109 | gemma | 1 | 9 | true | true | true | false | true | 2.903576 | 0.182925 | 18.292464 | 0.548791 | 35.739663 | 0 | 0 | 0.310403 | 8.053691 | 0.405625 | 8.569792 | 0.380485 | 31.165041 | false | 2024-07-24 | 2024-09-18 | 2 | google/gemma-2-9b |
tanliboy_lambda-qwen2.5-14b-dpo-test_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/tanliboy/lambda-qwen2.5-14b-dpo-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tanliboy/lambda-qwen2.5-14b-dpo-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tanliboy__lambda-qwen2.5-14b-dpo-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tanliboy/lambda-qwen2.5-14b-dpo-test | 96607eea3c67f14f73e576580610dba7530c5dd9 | 33.516192 | apache-2.0 | 7 | 14 | true | true | true | false | true | 1.800743 | 0.823122 | 82.312154 | 0.639351 | 48.45444 | 0 | 0 | 0.362416 | 14.988814 | 0.426031 | 12.58724 | 0.484791 | 42.754507 | false | 2024-09-20 | 2024-09-20 | 2 | Qwen/Qwen2.5-14B |
tanliboy_lambda-qwen2.5-32b-dpo-test_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/tanliboy/lambda-qwen2.5-32b-dpo-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tanliboy/lambda-qwen2.5-32b-dpo-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tanliboy__lambda-qwen2.5-32b-dpo-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tanliboy/lambda-qwen2.5-32b-dpo-test | 675b60d6e859455a6139e6e284bbe1844b8ddf46 | 35.753394 | apache-2.0 | 4 | 32 | true | true | true | false | true | 5.499303 | 0.808384 | 80.838398 | 0.67639 | 54.407961 | 0 | 0 | 0.356544 | 14.205817 | 0.427427 | 13.328385 | 0.565658 | 51.739805 | false | 2024-09-22 | 2024-09-30 | 2 | Qwen/Qwen2.5-32B |
teknium_CollectiveCognition-v1.1-Mistral-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/teknium/CollectiveCognition-v1.1-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">teknium/CollectiveCognition-v1.1-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/teknium__CollectiveCognition-v1.1-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | teknium/CollectiveCognition-v1.1-Mistral-7B | 5f57f70ec99450c70da2540e94dd7fd67be4b23c | 14.268985 | apache-2.0 | 78 | 7 | true | true | true | false | false | 0.429318 | 0.279046 | 27.904626 | 0.449343 | 23.476134 | 0.031722 | 3.172205 | 0.286913 | 4.9217 | 0.386927 | 5.732552 | 0.28366 | 20.406693 | true | 2023-10-04 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
teknium_OpenHermes-13B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/teknium/OpenHermes-13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">teknium/OpenHermes-13B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/teknium__OpenHermes-13B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | teknium/OpenHermes-13B | bcad6fff9f8591e091d2d57356a3f102197e8c5f | 12.169676 | mit | 54 | 13 | true | true | true | false | false | 31.119117 | 0.266807 | 26.680652 | 0.420644 | 18.213328 | 0.011329 | 1.132931 | 0.272651 | 3.020134 | 0.40426 | 8.532552 | 0.238946 | 15.43846 | true | 2023-09-06 | 2024-06-12 | 1 | NousResearch/Llama-2-13b-hf |
teknium_OpenHermes-2-Mistral-7B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/teknium/OpenHermes-2-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">teknium/OpenHermes-2-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/teknium__OpenHermes-2-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | teknium/OpenHermes-2-Mistral-7B | 4c6e34123b140ce773a8433cae5410949289102c | 21.4153 | apache-2.0 | 255 | 7 | true | true | true | false | true | 0.47503 | 0.528615 | 52.861519 | 0.494752 | 29.251839 | 0.043807 | 4.380665 | 0.283557 | 4.474273 | 0.451979 | 16.064062 | 0.293135 | 21.459441 | true | 2023-10-12 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
teknium_OpenHermes-2.5-Mistral-7B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">teknium/OpenHermes-2.5-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/teknium__OpenHermes-2.5-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | teknium/OpenHermes-2.5-Mistral-7B | 24c0bea14d53e6f67f1fbe2eca5bfe7cae389b33 | 21.266837 | apache-2.0 | 815 | 7 | true | true | true | false | true | 0.472783 | 0.557142 | 55.714172 | 0.487001 | 27.770026 | 0.047583 | 4.758308 | 0.283557 | 4.474273 | 0.424198 | 12.058073 | 0.305436 | 22.826167 | true | 2023-10-29 | 2024-06-12 | 1 | mistralai/Mistral-7B-v0.1 |
teknium_OpenHermes-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/teknium/OpenHermes-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">teknium/OpenHermes-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/teknium__OpenHermes-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | teknium/OpenHermes-7B | 9f55d6eb15f1edd52ee1fd863a220aa682e78a00 | 9.481132 | mit | 13 | 7 | true | true | true | false | false | 2.48309 | 0.181251 | 18.12513 | 0.362034 | 12.081395 | 0.010574 | 1.057402 | 0.269295 | 2.572707 | 0.432385 | 12.68151 | 0.193318 | 10.368647 | true | 2023-09-14 | 2024-06-12 | 1 | NousResearch/Llama-2-7b-hf |
tensoropera_Fox-1-1.6B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/tensoropera/Fox-1-1.6B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensoropera/Fox-1-1.6B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensoropera__Fox-1-1.6B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tensoropera/Fox-1-1.6B | 6389dde4d7e52aa1200ad954c565f03c7fdcf8db | 7.739189 | apache-2.0 | 30 | 1 | true | true | true | false | false | 1.34282 | 0.276598 | 27.659831 | 0.330737 | 7.399761 | 0.015861 | 1.586103 | 0.263423 | 1.789709 | 0.35499 | 3.873698 | 0.137134 | 4.126034 | false | 2024-06-13 | 2024-06-29 | 0 | tensoropera/Fox-1-1.6B |
tenyx_Llama3-TenyxChat-70B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/tenyx/Llama3-TenyxChat-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tenyx/Llama3-TenyxChat-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tenyx__Llama3-TenyxChat-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tenyx/Llama3-TenyxChat-70B | a85d31e3af8fcc847cc9169f1144cf02f5351fab | 36.872248 | llama3 | 63 | 70 | true | true | true | false | true | 9.367007 | 0.808709 | 80.870867 | 0.651149 | 49.61562 | 0.246224 | 24.622356 | 0.301174 | 6.823266 | 0.426031 | 12.520573 | 0.521027 | 46.780807 | false | 2024-04-26 | 2024-08-04 | 0 | tenyx/Llama3-TenyxChat-70B |
theprint_Boptruth-Agatha-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/theprint/Boptruth-Agatha-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/Boptruth-Agatha-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__Boptruth-Agatha-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/Boptruth-Agatha-7B | ef7c7570be29a58f4a8358a6d4c75f59a5282191 | 17.44944 | 0 | 7 | false | true | true | false | false | 0.388105 | 0.312419 | 31.241883 | 0.498394 | 29.286422 | 0.05136 | 5.135952 | 0.299497 | 6.599553 | 0.427667 | 11.758333 | 0.28607 | 20.674498 | false | 2024-09-11 | 2024-09-30 | 0 | theprint/Boptruth-Agatha-7B |
theprint_CleverBoi-7B-v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/CleverBoi-7B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/CleverBoi-7B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__CleverBoi-7B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/CleverBoi-7B-v2 | 1d82629c1e6778cf8568b532a3c09b668805b15a | 15.032974 | apache-2.0 | 0 | 7 | true | true | true | false | false | 1.522398 | 0.216998 | 21.699757 | 0.453173 | 23.444181 | 0.022659 | 2.265861 | 0.288591 | 5.145414 | 0.469531 | 18.658073 | 0.270861 | 18.98456 | false | 2024-09-12 | 2024-09-13 | 2 | mistralai/Mistral-7B-v0.3 |
theprint_CleverBoi-7B-v3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/CleverBoi-7B-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/CleverBoi-7B-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__CleverBoi-7B-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/CleverBoi-7B-v3 | 1d82629c1e6778cf8568b532a3c09b668805b15a | 13.589762 | apache-2.0 | 0 | 7 | true | true | true | false | false | 1.60289 | 0.23823 | 23.823012 | 0.441443 | 21.936747 | 0.033988 | 3.398792 | 0.26594 | 2.12528 | 0.407177 | 9.497135 | 0.286818 | 20.757609 | false | 2024-09-14 | 2024-09-22 | 2 | mistralai/Mistral-7B-v0.3 |
theprint_CleverBoi-Llama-3.1-8B-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/CleverBoi-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/CleverBoi-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__CleverBoi-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/CleverBoi-Llama-3.1-8B-Instruct | 3514c510ea4ba4d650522f467d4d0cef7de4a43c | 13.605339 | apache-2.0 | 1 | 16 | true | true | true | false | false | 1.870223 | 0.168163 | 16.81627 | 0.455962 | 24.048603 | 0.02719 | 2.719033 | 0.300336 | 6.711409 | 0.401438 | 8.279688 | 0.307513 | 23.057033 | false | 2024-08-27 | 2024-09-13 | 3 | meta-llama/Meta-Llama-3.1-8B |
theprint_CleverBoi-Llama-3.1-8B-v2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/CleverBoi-Llama-3.1-8B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/CleverBoi-Llama-3.1-8B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__CleverBoi-Llama-3.1-8B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/CleverBoi-Llama-3.1-8B-v2 | a8b0fc584b10e0110e04f9d21c7f10d24391c1d5 | 14.095235 | apache-2.0 | 0 | 9 | true | true | true | false | false | 2.521379 | 0.19614 | 19.613958 | 0.466782 | 24.132845 | 0.049849 | 4.984894 | 0.286074 | 4.809843 | 0.373469 | 6.716927 | 0.318816 | 24.312943 | false | 2024-09-15 | 2024-09-22 | 2 | meta-llama/Meta-Llama-3.1-8B |
theprint_CleverBoi-Nemo-12B-v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/CleverBoi-Nemo-12B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/CleverBoi-Nemo-12B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__CleverBoi-Nemo-12B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/CleverBoi-Nemo-12B-v2 | cd1f9ee1c484f857bb0e5ae6aac37dc434911f10 | 17.68216 | apache-2.0 | 3 | 13 | true | true | true | false | false | 3.505513 | 0.204583 | 20.458273 | 0.524109 | 31.652695 | 0.0929 | 9.29003 | 0.313758 | 8.501119 | 0.418677 | 11.434635 | 0.322806 | 24.756206 | false | 2024-09-16 | 2024-09-24 | 1 | unsloth/Mistral-Nemo-Instruct-2407-bnb-4bit |
theprint_Code-Llama-Bagel-8B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/theprint/Code-Llama-Bagel-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/Code-Llama-Bagel-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__Code-Llama-Bagel-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/Code-Llama-Bagel-8B | 7fa415f3f758ab7930d7e1df27b2d16207513125 | 14.526782 | llama3 | 1 | 8 | true | false | true | false | false | 0.818047 | 0.252968 | 25.296768 | 0.469742 | 25.338155 | 0.05287 | 5.287009 | 0.276007 | 3.467562 | 0.367979 | 7.530729 | 0.282164 | 20.24047 | false | 2024-06-21 | 2024-09-13 | 1 | theprint/Code-Llama-Bagel-8B (Merge) |
theprint_Llama-3.2-3B-VanRossum_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/Llama-3.2-3B-VanRossum" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/Llama-3.2-3B-VanRossum</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__Llama-3.2-3B-VanRossum-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/Llama-3.2-3B-VanRossum | 7048abecd492a1f5d53981cb175431ec01bbced0 | 17.521868 | apache-2.0 | 0 | 3 | true | true | true | false | false | 1.854588 | 0.478282 | 47.828207 | 0.427874 | 19.366362 | 0.093656 | 9.365559 | 0.267617 | 2.348993 | 0.344167 | 6.554167 | 0.277011 | 19.667923 | false | 2024-11-14 | 2024-11-14 | 2 | meta-llama/Llama-3.2-3B-Instruct |
theprint_ReWiz-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/ReWiz-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/ReWiz-7B | d9f28e67d52181d1478e7788e3edf252f5bf32a8 | 17.598219 | apache-2.0 | 0 | 7 | true | true | true | false | false | 1.445406 | 0.404793 | 40.479262 | 0.456422 | 23.50443 | 0.029456 | 2.945619 | 0.275168 | 3.355705 | 0.461156 | 16.744531 | 0.267038 | 18.559767 | false | 2024-10-08 | 2024-10-08 | 3 | mistralai/Mistral-7B-v0.3 |
theprint_ReWiz-Llama-3.1-8B-v2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/ReWiz-Llama-3.1-8B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-Llama-3.1-8B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-Llama-3.1-8B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/ReWiz-Llama-3.1-8B-v2 | a8b0fc584b10e0110e04f9d21c7f10d24391c1d5 | 15.681926 | apache-2.0 | 1 | 9 | true | true | true | false | false | 2.327395 | 0.237306 | 23.73059 | 0.463243 | 23.773287 | 0.045317 | 4.531722 | 0.302852 | 7.04698 | 0.381375 | 9.338542 | 0.331034 | 25.670434 | false | 2024-11-02 | 2024-11-03 | 2 | meta-llama/Meta-Llama-3.1-8B |
theprint_ReWiz-Llama-3.2-3B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/theprint/ReWiz-Llama-3.2-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-Llama-3.2-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-Llama-3.2-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/ReWiz-Llama-3.2-3B | e6aed95ad8f104f105b8423cd5f87c75705a828c | 17.984844 | apache-2.0 | 2 | 3 | true | true | true | false | false | 1.30972 | 0.464893 | 46.489315 | 0.434326 | 19.293728 | 0.097432 | 9.743202 | 0.283557 | 4.474273 | 0.361375 | 6.938542 | 0.28873 | 20.970006 | false | 2024-10-18 | 2024-10-28 | 1 | theprint/ReWiz-Llama-3.2-3B (Merge) |
theprint_ReWiz-Nemo-12B-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/theprint/ReWiz-Nemo-12B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-Nemo-12B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-Nemo-12B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/ReWiz-Nemo-12B-Instruct | 6f8ea24f8d19b48850d68bef1b5c50837d37761b | 15.631853 | apache-2.0 | 2 | 12 | true | true | true | false | false | 1.17003 | 0.106238 | 10.623811 | 0.509241 | 29.926389 | 0.071752 | 7.175227 | 0.323826 | 9.8434 | 0.409563 | 10.228646 | 0.333943 | 25.993647 | false | 2024-10-31 | 2024-11-02 | 1 | unsloth/Mistral-Nemo-Instruct-2407-bnb-4bit |
theprint_ReWiz-Qwen-2.5-14B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/ReWiz-Qwen-2.5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-Qwen-2.5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-Qwen-2.5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/ReWiz-Qwen-2.5-14B | e5524628f15c30d7542427c53a565e6e2d3ff760 | 29.641502 | apache-2.0 | 5 | 16 | true | true | true | false | false | 5.928266 | 0.278546 | 27.854648 | 0.617949 | 44.861873 | 0.268882 | 26.888218 | 0.380034 | 17.337808 | 0.453896 | 15.436979 | 0.509225 | 45.469489 | false | 2024-11-05 | 2024-11-10 | 2 | Qwen/Qwen2.5-14B |
theprint_ReWiz-Worldbuilder-7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/theprint/ReWiz-Worldbuilder-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-Worldbuilder-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-Worldbuilder-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/ReWiz-Worldbuilder-7B | e88c715097d824f115f59a97e612d662ffb1031f | 15.664819 | 0 | 7 | false | true | true | false | false | 0.610867 | 0.25102 | 25.101952 | 0.463616 | 25.076347 | 0.029456 | 2.945619 | 0.269295 | 2.572707 | 0.45725 | 16.389583 | 0.297124 | 21.902704 | false | 2024-10-28 | 2024-10-28 | 1 | theprint/ReWiz-Worldbuilder-7B (Merge) |
theprint_RuDolph-Hermes-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/theprint/RuDolph-Hermes-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/RuDolph-Hermes-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__RuDolph-Hermes-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/RuDolph-Hermes-7B | e07aea56963bbfe5c6753d1056566a56acc30d4a | 19.024425 | 0 | 7 | false | true | true | false | false | 0.502067 | 0.360429 | 36.042922 | 0.505293 | 30.709648 | 0.050604 | 5.060423 | 0.312081 | 8.277405 | 0.422615 | 11.026823 | 0.307264 | 23.029329 | false | 2024-11-10 | 2024-11-10 | 1 | theprint/RuDolph-Hermes-7B (Merge) |
theprint_WorldBuilder-12B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/WorldBuilder-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/WorldBuilder-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__WorldBuilder-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/WorldBuilder-12B | 20cfd0e98fb2628b00867147b2c6f423d27f3561 | 14.377937 | apache-2.0 | 0 | 13 | true | true | true | false | false | 2.831275 | 0.137438 | 13.743755 | 0.50101 | 29.277996 | 0.036254 | 3.625378 | 0.29698 | 6.263982 | 0.406646 | 8.997396 | 0.319232 | 24.359116 | false | 2024-10-27 | 2024-11-18 | 1 | unsloth/mistral-nemo-base-2407-bnb-4bit |
theprint_phi-3-mini-4k-python_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/theprint/phi-3-mini-4k-python" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/phi-3-mini-4k-python</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__phi-3-mini-4k-python-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | theprint/phi-3-mini-4k-python | 81453e5718775630581ab9950e6c0ccf0d7a4177 | 17.564493 | apache-2.0 | 0 | 4 | true | true | true | false | false | 1.375551 | 0.240878 | 24.087754 | 0.493759 | 28.446016 | 0.095166 | 9.516616 | 0.291107 | 5.480984 | 0.392167 | 9.220833 | 0.357713 | 28.634752 | false | 2024-06-03 | 2024-09-13 | 1 | unsloth/Phi-3-mini-4k-instruct-bnb-4bit |
thomas-yanxin_XinYuan-Qwen2-1_5B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/thomas-yanxin/XinYuan-Qwen2-1_5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">thomas-yanxin/XinYuan-Qwen2-1_5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/thomas-yanxin__XinYuan-Qwen2-1_5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | thomas-yanxin/XinYuan-Qwen2-1_5B | a01b362887832bea08d686737861ac3d5b437a32 | 11.515091 | other | 1 | 1 | true | true | true | false | true | 1.352364 | 0.298556 | 29.855561 | 0.363549 | 12.12558 | 0.067221 | 6.722054 | 0.270134 | 2.684564 | 0.363396 | 2.624479 | 0.235705 | 15.07831 | false | 2024-08-25 | 2024-09-04 | 1 | Removed |
thomas-yanxin_XinYuan-Qwen2-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/thomas-yanxin/XinYuan-Qwen2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">thomas-yanxin/XinYuan-Qwen2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/thomas-yanxin__XinYuan-Qwen2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | thomas-yanxin/XinYuan-Qwen2-7B | c62d83eee2f4812ac17fc17d307f4aa1a77c5359 | 22.217714 | other | 1 | 7 | true | true | true | false | true | 3.276154 | 0.44376 | 44.376033 | 0.493663 | 28.401489 | 0.132931 | 13.293051 | 0.291107 | 5.480984 | 0.405812 | 9.259896 | 0.392453 | 32.494829 | false | 2024-08-21 | 2024-09-03 | 0 | thomas-yanxin/XinYuan-Qwen2-7B |
thomas-yanxin_XinYuan-Qwen2-7B-0917_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/thomas-yanxin/XinYuan-Qwen2-7B-0917" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">thomas-yanxin/XinYuan-Qwen2-7B-0917</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/thomas-yanxin__XinYuan-Qwen2-7B-0917-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | thomas-yanxin/XinYuan-Qwen2-7B-0917 | 6cee1b155fca9ae1f558f434953dfdadb9596af0 | 22.721617 | other | 4 | 7 | true | true | true | false | true | 1.485564 | 0.37192 | 37.191984 | 0.516922 | 32.619938 | 0.088369 | 8.836858 | 0.309564 | 7.941834 | 0.440104 | 13.679688 | 0.424535 | 36.059397 | false | 2024-09-17 | 2024-09-17 | 0 | thomas-yanxin/XinYuan-Qwen2-7B-0917 |
thomas-yanxin_XinYuan-Qwen2.5-7B-0917_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/thomas-yanxin/XinYuan-Qwen2.5-7B-0917" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">thomas-yanxin/XinYuan-Qwen2.5-7B-0917</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/thomas-yanxin__XinYuan-Qwen2.5-7B-0917-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | thomas-yanxin/XinYuan-Qwen2.5-7B-0917 | bbbeafd1003c4d5e13f09b7223671957384b961a | 18.175037 | other | 4 | 7 | true | true | true | false | true | 0.971225 | 0.357706 | 35.770644 | 0.518411 | 33.439669 | 0 | 0 | 0.28104 | 4.138702 | 0.367552 | 3.677344 | 0.388215 | 32.023862 | false | 2024-09-17 | 2024-09-24 | 0 | thomas-yanxin/XinYuan-Qwen2.5-7B-0917 |
tiiuae_falcon-11B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | FalconForCausalLM | <a target="_blank" href="https://huggingface.co/tiiuae/falcon-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tiiuae/falcon-11B | 066e3bf4e2d9aaeefa129af0a6d39727d27816b3 | 13.814138 | unknown | 212 | 11 | true | true | true | false | false | 1.082871 | 0.326132 | 32.613244 | 0.439164 | 21.937999 | 0.02568 | 2.567976 | 0.270973 | 2.796421 | 0.398646 | 7.530729 | 0.238946 | 15.43846 | true | 2024-05-09 | 2024-06-09 | 0 | tiiuae/falcon-11B |
tiiuae_falcon-40b_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | FalconForCausalLM | <a target="_blank" href="https://huggingface.co/tiiuae/falcon-40b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-40b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-40b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tiiuae/falcon-40b | 4a70170c215b36a3cce4b4253f6d0612bb7d4146 | 11.36354 | apache-2.0 | 2,417 | 40 | true | true | true | false | false | 21.793584 | 0.249645 | 24.964539 | 0.401853 | 16.583305 | 0.015861 | 1.586103 | 0.27349 | 3.131991 | 0.363146 | 5.193229 | 0.250499 | 16.722074 | true | 2023-05-24 | 2024-06-09 | 0 | tiiuae/falcon-40b |
tiiuae_falcon-40b-instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | FalconForCausalLM | <a target="_blank" href="https://huggingface.co/tiiuae/falcon-40b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-40b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-40b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tiiuae/falcon-40b-instruct | ecb78d97ac356d098e79f0db222c9ce7c5d9ee5f | 10.434154 | apache-2.0 | 1,172 | 40 | true | true | true | false | false | 19.733245 | 0.245449 | 24.544874 | 0.405387 | 17.220114 | 0.016616 | 1.661631 | 0.25 | 0 | 0.376229 | 5.161979 | 0.226147 | 14.016327 | true | 2023-05-25 | 2024-06-09 | 0 | tiiuae/falcon-40b-instruct |
tiiuae_falcon-7b_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | FalconForCausalLM | <a target="_blank" href="https://huggingface.co/tiiuae/falcon-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tiiuae/falcon-7b | 898df1396f35e447d5fe44e0a3ccaaaa69f30d36 | 5.110504 | apache-2.0 | 1,078 | 7 | true | true | true | false | false | 0.785841 | 0.182051 | 18.20514 | 0.328524 | 5.963937 | 0.006042 | 0.60423 | 0.244966 | 0 | 0.377844 | 4.497135 | 0.112533 | 1.392583 | true | 2023-04-24 | 2024-06-09 | 0 | tiiuae/falcon-7b |
tiiuae_falcon-7b-instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | FalconForCausalLM | <a target="_blank" href="https://huggingface.co/tiiuae/falcon-7b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-7b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-7b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tiiuae/falcon-7b-instruct | cf4b3c42ce2fdfe24f753f0f0d179202fea59c99 | 5.015869 | apache-2.0 | 922 | 7 | true | true | true | false | false | 0.766215 | 0.196889 | 19.68887 | 0.320342 | 4.823178 | 0.006042 | 0.60423 | 0.247483 | 0 | 0.363365 | 3.253906 | 0.115525 | 1.72503 | true | 2023-04-25 | 2024-06-09 | 0 | tiiuae/falcon-7b-instruct |
tiiuae_falcon-mamba-7b_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | FalconMambaForCausalLM | <a target="_blank" href="https://huggingface.co/tiiuae/falcon-mamba-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-mamba-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-mamba-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tiiuae/falcon-mamba-7b | 5337fd73f19847e111ba2291f3f0e1617b90c37d | 15.116297 | other | 217 | 7 | true | true | true | false | false | 3.610408 | 0.333576 | 33.357602 | 0.428485 | 19.876878 | 0.040785 | 4.07855 | 0.310403 | 8.053691 | 0.421031 | 10.86224 | 0.230219 | 14.468824 | true | 2024-07-17 | 2024-07-23 | 0 | tiiuae/falcon-mamba-7b |
tklohj_WindyFloLLM_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/tklohj/WindyFloLLM" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tklohj/WindyFloLLM</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tklohj__WindyFloLLM-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tklohj/WindyFloLLM | 21f4241ab3f091d1d309e9076a8d8e3f014908a8 | 14.205891 | 0 | 13 | false | true | true | false | false | 1.098512 | 0.266856 | 26.685639 | 0.463662 | 24.398763 | 0.013595 | 1.359517 | 0.275168 | 3.355705 | 0.425313 | 11.864063 | 0.258145 | 17.571661 | false | 2024-06-30 | 2024-07-10 | 1 | tklohj/WindyFloLLM (Merge) |
togethercomputer_GPT-JT-6B-v1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GPTJForCausalLM | <a target="_blank" href="https://huggingface.co/togethercomputer/GPT-JT-6B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/GPT-JT-6B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__GPT-JT-6B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | togethercomputer/GPT-JT-6B-v1 | f34aa35f906895602c1f86f5685e598afdea8051 | 6.827354 | apache-2.0 | 301 | 6 | true | true | true | false | false | 37.958811 | 0.206106 | 20.610646 | 0.330266 | 7.318524 | 0.007553 | 0.755287 | 0.260906 | 1.454139 | 0.373656 | 3.873698 | 0.162566 | 6.951832 | true | 2022-11-24 | 2024-06-12 | 0 | togethercomputer/GPT-JT-6B-v1 |
togethercomputer_GPT-NeoXT-Chat-Base-20B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GPTNeoXForCausalLM | <a target="_blank" href="https://huggingface.co/togethercomputer/GPT-NeoXT-Chat-Base-20B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/GPT-NeoXT-Chat-Base-20B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__GPT-NeoXT-Chat-Base-20B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | togethercomputer/GPT-NeoXT-Chat-Base-20B | d386708e84d862a65f7d2b4989f64750cb657227 | 4.964062 | apache-2.0 | 695 | 20 | true | true | true | false | false | 2.983588 | 0.182976 | 18.297562 | 0.332097 | 6.830795 | 0.01284 | 1.283988 | 0.25 | 0 | 0.346063 | 1.757812 | 0.114528 | 1.614214 | true | 2023-03-03 | 2024-06-12 | 0 | togethercomputer/GPT-NeoXT-Chat-Base-20B |
togethercomputer_LLaMA-2-7B-32K_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/togethercomputer/LLaMA-2-7B-32K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/LLaMA-2-7B-32K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__LLaMA-2-7B-32K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | togethercomputer/LLaMA-2-7B-32K | 46c24bb5aef59722fa7aa6d75e832afd1d64b980 | 6.737011 | llama2 | 533 | 7 | true | true | true | false | false | 0.584573 | 0.186497 | 18.649738 | 0.339952 | 8.089984 | 0.008308 | 0.830816 | 0.25 | 0 | 0.375365 | 4.320573 | 0.176779 | 8.530954 | true | 2023-07-26 | 2024-06-12 | 0 | togethercomputer/LLaMA-2-7B-32K |
togethercomputer_Llama-2-7B-32K-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/Llama-2-7B-32K-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__Llama-2-7B-32K-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | togethercomputer/Llama-2-7B-32K-Instruct | d27380af003252f5eb0d218e104938b4e673e3f3 | 8.20819 | llama2 | 159 | 7 | true | true | true | false | false | 0.589909 | 0.213 | 21.300039 | 0.344347 | 8.56347 | 0.01284 | 1.283988 | 0.251678 | 0.223714 | 0.405594 | 9.199219 | 0.178108 | 8.678709 | true | 2023-08-08 | 2024-06-12 | 0 | togethercomputer/Llama-2-7B-32K-Instruct |
togethercomputer_RedPajama-INCITE-7B-Base_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPTNeoXForCausalLM | <a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-7B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-7B-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | togethercomputer/RedPajama-INCITE-7B-Base | 78f7e482443971f4873ba3239f0ac810a367833b | 5.486286 | apache-2.0 | 94 | 7 | true | true | true | false | false | 1.220607 | 0.20823 | 20.822972 | 0.319489 | 5.087242 | 0.011329 | 1.132931 | 0.255034 | 0.671141 | 0.362 | 3.016667 | 0.119681 | 2.186761 | true | 2023-05-04 | 2024-06-12 | 0 | togethercomputer/RedPajama-INCITE-7B-Base |
togethercomputer_RedPajama-INCITE-7B-Chat_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GPTNeoXForCausalLM | <a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-7B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-7B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | togethercomputer/RedPajama-INCITE-7B-Chat | 47b94a739e2f3164b438501c8684acc5d5acc146 | 3.962784 | apache-2.0 | 92 | 7 | true | true | true | false | false | 1.219336 | 0.155798 | 15.579773 | 0.317545 | 4.502174 | 0.001511 | 0.151057 | 0.252517 | 0.33557 | 0.34476 | 1.861719 | 0.112118 | 1.34641 | true | 2023-05-04 | 2024-06-13 | 0 | togethercomputer/RedPajama-INCITE-7B-Chat |
togethercomputer_RedPajama-INCITE-7B-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GPTNeoXForCausalLM | <a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | togethercomputer/RedPajama-INCITE-7B-Instruct | 7f36397b9985a3f981cdb618f8fec1c565ca5927 | 6.356021 | apache-2.0 | 104 | 7 | true | true | true | false | false | 1.181119 | 0.205507 | 20.550694 | 0.337744 | 7.905416 | 0.015106 | 1.510574 | 0.250839 | 0.111857 | 0.36851 | 5.030469 | 0.127244 | 3.027113 | true | 2023-05-05 | 2024-06-12 | 0 | togethercomputer/RedPajama-INCITE-7B-Instruct |
togethercomputer_RedPajama-INCITE-Base-3B-v1_float16 | float16 | 🟢 pretrained | 🟢 | Original | GPTNeoXForCausalLM | <a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-3B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-Base-3B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-Base-3B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | togethercomputer/RedPajama-INCITE-Base-3B-v1 | 094fbdd0c911feb485ce55de1952ab2e75277e1e | 5.445562 | apache-2.0 | 90 | 3 | true | true | true | false | false | 0.776102 | 0.229363 | 22.936254 | 0.30604 | 3.518608 | 0.009819 | 0.981873 | 0.243289 | 0 | 0.373875 | 4.001042 | 0.11112 | 1.235594 | true | 2023-05-04 | 2024-06-12 | 0 | togethercomputer/RedPajama-INCITE-Base-3B-v1 |
togethercomputer_RedPajama-INCITE-Chat-3B-v1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GPTNeoXForCausalLM | <a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-Chat-3B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-Chat-3B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-Chat-3B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | togethercomputer/RedPajama-INCITE-Chat-3B-v1 | f0e0995eba801096ed04cb87931d96a8316871af | 4.748119 | apache-2.0 | 152 | 3 | true | true | true | false | false | 0.774909 | 0.165215 | 16.521496 | 0.321669 | 5.164728 | 0.003021 | 0.302115 | 0.244128 | 0 | 0.368448 | 5.089323 | 0.112699 | 1.411052 | true | 2023-05-05 | 2024-06-13 | 0 | togethercomputer/RedPajama-INCITE-Chat-3B-v1 |
togethercomputer_RedPajama-INCITE-Instruct-3B-v1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GPTNeoXForCausalLM | <a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-Instruct-3B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-Instruct-3B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-Instruct-3B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | togethercomputer/RedPajama-INCITE-Instruct-3B-v1 | 0c66778ee09a036886741707733620b91057909a | 5.676527 | apache-2.0 | 93 | 3 | true | true | true | false | false | 0.760671 | 0.212426 | 21.242636 | 0.314602 | 4.510786 | 0.006798 | 0.679758 | 0.247483 | 0 | 0.388604 | 6.408854 | 0.110954 | 1.217125 | true | 2023-05-05 | 2024-06-12 | 0 | togethercomputer/RedPajama-INCITE-Instruct-3B-v1 |
tokyotech-llm_Llama-3-Swallow-8B-Instruct-v0.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/tokyotech-llm/Llama-3-Swallow-8B-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tokyotech-llm/Llama-3-Swallow-8B-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tokyotech-llm__Llama-3-Swallow-8B-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | tokyotech-llm/Llama-3-Swallow-8B-Instruct-v0.1 | 1fae784584dd03680b72dd4de7eefbc5b7cabcd5 | 22.307385 | llama3 | 16 | 8 | true | true | true | false | true | 0.85811 | 0.550772 | 55.077195 | 0.500939 | 29.267966 | 0.072508 | 7.250755 | 0.28943 | 5.257271 | 0.435698 | 13.795573 | 0.30876 | 23.195553 | false | 2024-06-26 | 2024-09-12 | 0 | tokyotech-llm/Llama-3-Swallow-8B-Instruct-v0.1 |
upstage_SOLAR-10.7B-Instruct-v1.0_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">upstage/SOLAR-10.7B-Instruct-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/upstage__SOLAR-10.7B-Instruct-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | upstage/SOLAR-10.7B-Instruct-v1.0 | c08c25ed66414a878fe0401a3596d536c083606c | 19.628255 | cc-by-nc-4.0 | 615 | 10 | true | true | true | false | true | 0.782776 | 0.473661 | 47.3661 | 0.516249 | 31.872402 | 0 | 0 | 0.308725 | 7.829978 | 0.389938 | 6.942188 | 0.31383 | 23.758865 | true | 2023-12-12 | 2024-06-12 | 1 | upstage/SOLAR-10.7B-Instruct-v1.0 (Merge) |
upstage_SOLAR-10.7B-v1.0_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/upstage/SOLAR-10.7B-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">upstage/SOLAR-10.7B-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/upstage__SOLAR-10.7B-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | upstage/SOLAR-10.7B-v1.0 | a45090b8e56bdc2b8e32e46b3cd782fc0bea1fa5 | 4.916448 | apache-2.0 | 291 | 10 | true | true | true | false | false | 1.519194 | 0.171585 | 17.158473 | 0.299835 | 2.147163 | 0.023414 | 2.34139 | 0.260906 | 1.454139 | 0.368198 | 4.52474 | 0.116855 | 1.872784 | true | 2023-12-12 | 2024-06-12 | 0 | upstage/SOLAR-10.7B-v1.0 |
upstage_solar-pro-preview-instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | SolarForCausalLM | <a target="_blank" href="https://huggingface.co/upstage/solar-pro-preview-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">upstage/solar-pro-preview-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/upstage__solar-pro-preview-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | upstage/solar-pro-preview-instruct | b4db141b5fb08b23f8bc323bc34e2cff3e9675f8 | 39.900891 | mit | 428 | 22 | true | true | true | false | true | 1.741763 | 0.841581 | 84.158145 | 0.681684 | 54.822351 | 0.218278 | 21.827795 | 0.370805 | 16.107383 | 0.441656 | 15.007031 | 0.527344 | 47.482639 | true | 2024-09-09 | 2024-09-11 | 0 | upstage/solar-pro-preview-instruct |
uukuguy_speechless-code-mistral-7b-v1.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/uukuguy/speechless-code-mistral-7b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-code-mistral-7b-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-code-mistral-7b-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | uukuguy/speechless-code-mistral-7b-v1.0 | 1862e0a712efc6002112e9c1235a197d58419b37 | 18.091887 | apache-2.0 | 18 | 7 | true | true | true | false | false | 0.646398 | 0.366524 | 36.652416 | 0.457171 | 24.091412 | 0.046073 | 4.607251 | 0.284396 | 4.58613 | 0.450177 | 14.772135 | 0.314578 | 23.841977 | false | 2023-10-10 | 2024-06-26 | 0 | uukuguy/speechless-code-mistral-7b-v1.0 |
uukuguy_speechless-codellama-34b-v2.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/uukuguy/speechless-codellama-34b-v2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-codellama-34b-v2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-codellama-34b-v2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | uukuguy/speechless-codellama-34b-v2.0 | 419bc42a254102d6a5486a1a854068e912c4047c | 17.209358 | llama2 | 17 | 34 | true | true | true | false | false | 1.991254 | 0.460422 | 46.042168 | 0.481313 | 25.993293 | 0.043051 | 4.305136 | 0.269295 | 2.572707 | 0.378708 | 7.205208 | 0.254239 | 17.137633 | false | 2023-10-04 | 2024-06-26 | 0 | uukuguy/speechless-codellama-34b-v2.0 |
uukuguy_speechless-coder-ds-6.7b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/uukuguy/speechless-coder-ds-6.7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-coder-ds-6.7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-coder-ds-6.7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | uukuguy/speechless-coder-ds-6.7b | c813a5268c6dfe267a720ad3b51773f1ab0feb59 | 9.639323 | apache-2.0 | 5 | 6 | true | true | true | false | false | 0.788604 | 0.25047 | 25.046986 | 0.403637 | 15.897457 | 0.016616 | 1.661631 | 0.264262 | 1.901566 | 0.381938 | 5.342188 | 0.171875 | 7.986111 | false | 2023-12-30 | 2024-06-26 | 0 | uukuguy/speechless-coder-ds-6.7b |
uukuguy_speechless-instruct-mistral-7b-v0.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/uukuguy/speechless-instruct-mistral-7b-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-instruct-mistral-7b-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-instruct-mistral-7b-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | uukuguy/speechless-instruct-mistral-7b-v0.2 | 87a4d214f7d028d61c3dc013a7410b3c34a24072 | 18.018597 | apache-2.0 | 0 | 7 | true | true | true | false | false | 0.61762 | 0.326132 | 32.613244 | 0.460667 | 24.558747 | 0.043807 | 4.380665 | 0.281879 | 4.250559 | 0.490177 | 21.172135 | 0.290226 | 21.136229 | false | 2024-05-22 | 2024-06-26 | 0 | uukuguy/speechless-instruct-mistral-7b-v0.2 |
uukuguy_speechless-llama2-hermes-orca-platypus-wizardlm-13b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-llama2-hermes-orca-platypus-wizardlm-13b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b | 954cc87b0ed5fa280126de546daf648861031512 | 18.600891 | | 32 | 13 | false | true | true | false | false | 0.979524 | 0.456175 | 45.617517 | 0.484554 | 26.791727 | 0.01435 | 1.435045 | 0.270134 | 2.684564 | 0.4655 | 17.754167 | 0.255901 | 17.322326 | false | 2023-09-01 | 2024-06-26 | 0 | uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b |
uukuguy_speechless-mistral-dolphin-orca-platypus-samantha-7b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-mistral-dolphin-orca-platypus-samantha-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b | b1de043468a15198b55a6509293a4ee585139043 | 18.340089 | llama2 | 17 | 7 | true | true | true | false | false | 0.655719 | 0.370022 | 37.002154 | 0.498277 | 29.653129 | 0.029456 | 2.945619 | 0.283557 | 4.474273 | 0.436135 | 13.85026 | 0.299036 | 22.1151 | false | 2023-10-13 | 2024-06-26 | 0 | uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b |
uukuguy_speechless-zephyr-code-functionary-7b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/uukuguy/speechless-zephyr-code-functionary-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-zephyr-code-functionary-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-zephyr-code-functionary-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | uukuguy/speechless-zephyr-code-functionary-7b | d66fc775ece679966e352195c42444e9c70af7fa | 16.360129 | apache-2.0 | 2 | 7 | true | true | true | false | false | 0.634 | 0.269579 | 26.957916 | 0.466428 | 25.983623 | 0.036254 | 3.625378 | 0.300336 | 6.711409 | 0.426771 | 11.613021 | 0.309425 | 23.26943 | false | 2024-01-23 | 2024-06-26 | 0 | uukuguy/speechless-zephyr-code-functionary-7b |
v000000_L3.1-Niitorm-8B-DPO-t0.0001_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/v000000/L3.1-Niitorm-8B-DPO-t0.0001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/L3.1-Niitorm-8B-DPO-t0.0001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__L3.1-Niitorm-8B-DPO-t0.0001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | v000000/L3.1-Niitorm-8B-DPO-t0.0001 | a34150b5f63de4bc83d79b1de127faff3750289f | 28.100642 | | 7 | 8 | false | true | true | false | true | 0.878109 | 0.768867 | 76.886661 | 0.513423 | 30.513173 | 0.161631 | 16.163142 | 0.294463 | 5.928412 | 0.387979 | 7.264063 | 0.386636 | 31.848404 | false | 2024-09-19 | 2024-09-19 | 1 | v000000/L3.1-Niitorm-8B-DPO-t0.0001 (Merge) |
v000000_L3.1-Storniitova-8B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/v000000/L3.1-Storniitova-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/L3.1-Storniitova-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__L3.1-Storniitova-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | v000000/L3.1-Storniitova-8B | 05b126857f43d1b1383e50f8c97d214ceb199723 | 28.281707 | | 7 | 8 | false | true | true | false | true | 0.81354 | 0.781656 | 78.165601 | 0.515145 | 30.810993 | 0.146526 | 14.652568 | 0.28943 | 5.257271 | 0.402896 | 9.961979 | 0.377576 | 30.841829 | false | 2024-09-12 | 2024-09-18 | 1 | v000000/L3.1-Storniitova-8B (Merge) |
v000000_Qwen2.5-14B-Gutenberg-1e-Delta_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/v000000/Qwen2.5-14B-Gutenberg-1e-Delta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/Qwen2.5-14B-Gutenberg-1e-Delta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__Qwen2.5-14B-Gutenberg-1e-Delta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | v000000/Qwen2.5-14B-Gutenberg-1e-Delta | f624854b4380e01322e752ce4daadd49ac86580f | 32.105096 | apache-2.0 | 4 | 14 | true | true | true | false | true | 1.802387 | 0.804512 | 80.451203 | 0.63985 | 48.616672 | 0 | 0 | 0.328859 | 10.514541 | 0.407302 | 9.379427 | 0.493019 | 43.668735 | false | 2024-09-20 | 2024-09-28 | 1 | v000000/Qwen2.5-14B-Gutenberg-1e-Delta (Merge) |
v000000_Qwen2.5-14B-Gutenberg-Instruct-Slerpeno_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__Qwen2.5-14B-Gutenberg-Instruct-Slerpeno-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno | 1069abb4c25855e67ffaefa08a0befbb376e7ca7 | 33.665023 | apache-2.0 | 4 | 14 | true | false | true | false | false | 2.23179 | 0.485476 | 48.547631 | 0.651079 | 49.7394 | 0.213746 | 21.374622 | 0.364094 | 15.212528 | 0.469063 | 18.432812 | 0.538148 | 48.683141 | false | 2024-09-20 | 2024-09-28 | 1 | v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno (Merge) |
v000000_Qwen2.5-Lumen-14B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/v000000/Qwen2.5-Lumen-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/Qwen2.5-Lumen-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__Qwen2.5-Lumen-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | v000000/Qwen2.5-Lumen-14B | fbb1d184ed01dac52d307737893ebb6b0ace444c | 32.200288 | apache-2.0 | 15 | 14 | true | true | true | false | true | 1.836693 | 0.80636 | 80.636046 | 0.639081 | 48.507861 | 0 | 0 | 0.32802 | 10.402685 | 0.411396 | 10.291146 | 0.490276 | 43.363992 | false | 2024-09-20 | 2024-09-20 | 1 | v000000/Qwen2.5-Lumen-14B (Merge) |
vhab10_Llama-3.1-8B-Base-Instruct-SLERP_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/vhab10/Llama-3.1-8B-Base-Instruct-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vhab10/Llama-3.1-8B-Base-Instruct-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vhab10__Llama-3.1-8B-Base-Instruct-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vhab10/Llama-3.1-8B-Base-Instruct-SLERP | eccb4bde0dc91f586954109ecdce7c94f47e2625 | 19.249617 | mit | 1 | 8 | true | false | true | false | false | 0.806721 | 0.290712 | 29.071198 | 0.505744 | 29.926042 | 0.11858 | 11.858006 | 0.296141 | 6.152125 | 0.401063 | 9.366146 | 0.362118 | 29.124187 | false | 2024-09-16 | 2024-09-29 | 1 | vhab10/Llama-3.1-8B-Base-Instruct-SLERP (Merge) |
vhab10_Llama-3.2-Instruct-3B-TIES_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/vhab10/Llama-3.2-Instruct-3B-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vhab10/Llama-3.2-Instruct-3B-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vhab10__Llama-3.2-Instruct-3B-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vhab10/Llama-3.2-Instruct-3B-TIES | 0e8661730f40a6a279bd273cfe9fe46bbd0507dd | 17.296562 | mit | 0 | 1 | true | false | true | false | false | 1.122926 | 0.472737 | 47.273678 | 0.433236 | 19.183159 | 0.095921 | 9.592145 | 0.269295 | 2.572707 | 0.349656 | 3.873698 | 0.291556 | 21.283983 | false | 2024-10-06 | 2024-11-23 | 1 | vhab10/Llama-3.2-Instruct-3B-TIES (Merge) |
vhab10_llama-3-8b-merged-linear_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/vhab10/llama-3-8b-merged-linear" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vhab10/llama-3-8b-merged-linear</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vhab10__llama-3-8b-merged-linear-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vhab10/llama-3-8b-merged-linear | c37e7671b5ccfadbf3065fa5b48af05cd4f13292 | 23.911368 | mit | 0 | 4 | true | true | true | false | true | 1.304943 | 0.591663 | 59.166345 | 0.493709 | 27.816051 | 0.081571 | 8.1571 | 0.299497 | 6.599553 | 0.419052 | 11.68151 | 0.370429 | 30.047651 | false | 2024-09-26 | 2024-09-26 | 1 | vhab10/llama-3-8b-merged-linear (Merge) |
vicgalle_CarbonBeagle-11B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/CarbonBeagle-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/CarbonBeagle-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__CarbonBeagle-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/CarbonBeagle-11B | 3fe9bf5327606d013b182fed17a472f5f043759b | 22.470186 | apache-2.0 | 9 | 10 | true | false | true | false | true | 0.915379 | 0.54153 | 54.152981 | 0.529365 | 33.060604 | 0.061934 | 6.193353 | 0.302013 | 6.935123 | 0.402031 | 9.18724 | 0.327626 | 25.291814 | false | 2024-01-21 | 2024-06-26 | 1 | vicgalle/CarbonBeagle-11B (Merge) |
vicgalle_CarbonBeagle-11B-truthy_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/CarbonBeagle-11B-truthy" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/CarbonBeagle-11B-truthy</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__CarbonBeagle-11B-truthy-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/CarbonBeagle-11B-truthy | 476cd2a6d938bddb38dfbeb4cb21e3e34303413d | 21.357727 | apache-2.0 | 9 | 10 | true | true | true | false | true | 0.907273 | 0.521221 | 52.122147 | 0.534842 | 33.988376 | 0.05136 | 5.135952 | 0.299497 | 6.599553 | 0.373969 | 4.11276 | 0.335688 | 26.187574 | false | 2024-02-10 | 2024-07-13 | 0 | vicgalle/CarbonBeagle-11B-truthy |
vicgalle_Configurable-Hermes-2-Pro-Llama-3-8B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/Configurable-Hermes-2-Pro-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Configurable-Hermes-2-Pro-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Configurable-Hermes-2-Pro-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/Configurable-Hermes-2-Pro-Llama-3-8B | 3cb5792509966a963645be24fdbeb2e7dc6cac15 | 22.351954 | apache-2.0 | 6 | 8 | true | true | true | false | true | 0.748927 | 0.576251 | 57.625101 | 0.505484 | 30.509625 | 0.063444 | 6.344411 | 0.29698 | 6.263982 | 0.418365 | 10.06224 | 0.309757 | 23.306368 | false | 2024-05-02 | 2024-07-24 | 2 | NousResearch/Meta-Llama-3-8B |
vicgalle_Configurable-Llama-3.1-8B-Instruct_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/Configurable-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Configurable-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Configurable-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/Configurable-Llama-3.1-8B-Instruct | 133b3ab1a5385ff9b3d17da2addfe3fc1fd6f733 | 28.010111 | apache-2.0 | 12 | 8 | true | true | true | false | true | 0.79661 | 0.83124 | 83.124 | 0.504476 | 29.661398 | 0.172961 | 17.296073 | 0.274329 | 3.243848 | 0.384542 | 5.934375 | 0.359209 | 28.800975 | false | 2024-07-24 | 2024-08-05 | 0 | vicgalle/Configurable-Llama-3.1-8B-Instruct |
vicgalle_Configurable-Yi-1.5-9B-Chat_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/Configurable-Yi-1.5-9B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Configurable-Yi-1.5-9B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Configurable-Yi-1.5-9B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/Configurable-Yi-1.5-9B-Chat | 992cb2232caae78eff6a836b2e0642f7cbf6018e | 23.972567 | apache-2.0 | 2 | 8 | true | true | true | false | true | 0.941909 | 0.432345 | 43.234507 | 0.54522 | 35.334445 | 0.073263 | 7.326284 | 0.343121 | 12.416107 | 0.427115 | 12.022656 | 0.401513 | 33.501404 | false | 2024-05-12 | 2024-06-26 | 0 | vicgalle/Configurable-Yi-1.5-9B-Chat |
vicgalle_ConfigurableBeagle-11B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/ConfigurableBeagle-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/ConfigurableBeagle-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__ConfigurableBeagle-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/ConfigurableBeagle-11B | bbc16dbf94b8e8a99bb3e2ada6755faf9c2990dd | 22.635544 | apache-2.0 | 3 | 10 | true | true | true | false | true | 0.879857 | 0.583445 | 58.344526 | 0.528659 | 32.392023 | 0.043807 | 4.380665 | 0.302013 | 6.935123 | 0.395302 | 7.379427 | 0.337434 | 26.381501 | false | 2024-02-17 | 2024-06-26 | 0 | vicgalle/ConfigurableBeagle-11B |
vicgalle_ConfigurableHermes-7B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/ConfigurableHermes-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/ConfigurableHermes-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__ConfigurableHermes-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/ConfigurableHermes-7B | 1333a88eaf6591836b2d9825d1eaec7260f336c9 | 19.536295 | apache-2.0 | 3 | 7 | true | true | true | false | true | 0.617282 | 0.54108 | 54.107989 | 0.457297 | 23.158164 | 0.047583 | 4.758308 | 0.276846 | 3.579418 | 0.405688 | 9.110938 | 0.302527 | 22.502955 | false | 2024-02-17 | 2024-06-26 | 0 | vicgalle/ConfigurableHermes-7B |
vicgalle_ConfigurableSOLAR-10.7B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/ConfigurableSOLAR-10.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/ConfigurableSOLAR-10.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__ConfigurableSOLAR-10.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/ConfigurableSOLAR-10.7B | 9d9baad88ea9dbaa61881f15e4f0d16e931033b4 | 19.045696 | apache-2.0 | 2 | 10 | true | true | true | false | true | 0.677681 | 0.509956 | 50.995581 | 0.486681 | 27.45095 | 0 | 0 | 0.298658 | 6.487696 | 0.380479 | 5.193229 | 0.31732 | 24.14672 | false | 2024-03-10 | 2024-06-26 | 0 | vicgalle/ConfigurableSOLAR-10.7B |
vicgalle_Humanish-RP-Llama-3.1-8B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/Humanish-RP-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Humanish-RP-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Humanish-RP-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/Humanish-RP-Llama-3.1-8B | d27aa731db1d390a8d17b0a4565c9231ee5ae8b9 | 25.347671 | apache-2.0 | 6 | 8 | true | true | true | false | true | 0.753451 | 0.666926 | 66.692598 | 0.510039 | 29.95856 | 0.147281 | 14.728097 | 0.286913 | 4.9217 | 0.395208 | 8.267708 | 0.347656 | 27.517361 | false | 2024-08-03 | 2024-08-03 | 0 | vicgalle/Humanish-RP-Llama-3.1-8B |