eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | Not_Merged | MoE | Flagged | Chat Template | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Maintainer's Highlight | Upload To Hub Date | Submission Date
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
meta-llama_Meta-Llama-3-8B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | meta-llama/Meta-Llama-3-8B | 62bd457b6fe961a42a631306577e622c83876cb6 | 13.412859 | llama3 | 5,382 | 8 | true | true | true | false | false | 0.145506 | 14.550615 | 0.459791 | 24.500764 | 0.032477 | 3.247734 | 0.305369 | 7.38255 | 0.361406 | 6.242448 | 0.320977 | 24.553044 | true | 2024-04-17 | 2024-06-12 |
meta-llama_Meta-Llama-3.1-8B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | Unknown | <a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | meta-llama/Meta-Llama-3.1-8B | e5c39e551424c763dbc3e58e32ef2999d33a6d8d | 13.780949 | 0 | 0 | true | true | true | false | false | 0.126996 | 12.699637 | 0.466614 | 25.29478 | 0.046073 | 4.607251 | 0.296141 | 6.152125 | 0.382521 | 8.981771 | 0.324551 | 24.950133 | true | |||
meta-llama_Meta-Llama-3-70B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | meta-llama/Meta-Llama-3-70B | b4d08b7db49d488da3ac49adf25a6b9ac01ae338 | 26.365471 | llama3 | 774 | 70 | true | true | true | false | false | 0.160319 | 16.031906 | 0.646107 | 48.709813 | 0.165408 | 16.540785 | 0.397651 | 19.686801 | 0.451823 | 16.011198 | 0.470911 | 41.212323 | true | 2024-04-17 | 2024-06-12 |
meta-llama_Meta-Llama-3.1-70B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | Unknown | <a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3.1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3.1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3.1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | meta-llama/Meta-Llama-3.1-70B | f7d3cc45ed4ff669a354baf2e0f05e65799a0bee | 25.910689 | 0 | 0 | true | true | true | false | false | 0.168438 | 16.843752 | 0.626007 | 46.399413 | 0.166918 | 16.691843 | 0.387584 | 18.344519 | 0.457187 | 16.581771 | 0.465426 | 40.602837 | true | |||
meta-llama_Meta-Llama-3-8B-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | meta-llama/Meta-Llama-3-8B-Instruct | e1945c40cd546c78e41f1151f4db032b271faeaa | 20.483278 | llama3 | 3,178 | 8 | true | true | true | false | false | 0.478232 | 47.82322 | 0.491026 | 26.795284 | 0.083837 | 8.383686 | 0.292785 | 5.704698 | 0.380542 | 5.401042 | 0.359126 | 28.791741 | true | 2024-04-17 | 2024-07-08 |
meta-llama_Meta-Llama-3-8B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | meta-llama/Meta-Llama-3-8B-Instruct | e1945c40cd546c78e41f1151f4db032b271faeaa | 23.908736 | llama3 | 3,178 | 8 | true | true | true | false | true | 0.74084 | 74.083986 | 0.498871 | 28.24495 | 0.086858 | 8.685801 | 0.259228 | 1.230425 | 0.356823 | 1.602865 | 0.366439 | 29.604388 | true | 2024-04-17 | 2024-06-12 |
meta-llama_Meta-Llama-3-70B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3-70B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3-70B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | meta-llama/Meta-Llama-3-70B-Instruct | 7129260dd854a80eb10ace5f61c20324b472b31c | 36.183402 | llama3 | 1,343 | 70 | true | true | true | false | true | 0.809908 | 80.990771 | 0.65467 | 50.185133 | 0.233384 | 23.338369 | 0.286913 | 4.9217 | 0.415365 | 10.920573 | 0.520695 | 46.743868 | true | 2024-04-17 | 2024-06-12 |
meta-llama_Meta-Llama-3.1-8B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Unknown | <a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | meta-llama/Meta-Llama-3.1-8B-Instruct | df34336b42332c6d360959e259cd6271c6a09fd4 | 26.590138 | 0 | 0 | true | true | true | false | false | 0.774037 | 77.403733 | 0.501997 | 28.846616 | 0.1571 | 15.70997 | 0.268456 | 2.46085 | 0.369875 | 4.601042 | 0.374668 | 30.518617 | true | |||
meta-llama_Meta-Llama-3.1-70B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Unknown | <a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3.1-70B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3.1-70B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3.1-70B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | meta-llama/Meta-Llama-3.1-70B-Instruct | b9461463b511ed3c0762467538ea32cf7c9669f2 | 35.967547 | 0 | 0 | true | true | true | false | false | 0.84283 | 84.283036 | 0.680974 | 54.452244 | 0.02719 | 2.719033 | 0.316275 | 8.836689 | 0.455365 | 17.320573 | 0.533743 | 48.193706 | true | |||
meta-llama_Llama-2-7b-hf_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-7b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-7b-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-7b-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | meta-llama/Llama-2-7b-hf | 01c7f73d771dfac7d292323805ebc428287df4f9 | 8.718241 | llama2 | 1,589 | 6 | true | true | true | false | false | 0.251894 | 25.189386 | 0.34962 | 10.351417 | 0.012085 | 1.208459 | 0.266779 | 2.237136 | 0.370062 | 3.757813 | 0.186087 | 9.565233 | true | 2023-07-13 | 2024-06-12 |
meta-llama_Llama-2-70b-hf_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-70b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-70b-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-70b-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | meta-llama/Llama-2-70b-hf | 3aba440b59558f995867ba6e1f58f21d0336b5bb | 18.246717 | llama2 | 821 | 68 | true | true | true | false | false | 0.240678 | 24.067807 | 0.547259 | 35.900062 | 0.024924 | 2.492447 | 0.302852 | 7.04698 | 0.412354 | 9.777604 | 0.371759 | 30.195405 | true | 2023-07-11 | 2024-06-12 |
meta-llama_Llama-2-13b-hf_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-13b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-13b-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-13b-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | meta-llama/Llama-2-13b-hf | 5c31dfb671ce7cfe2d7bb7c04375e44c55e815b1 | 10.989657 | llama2 | 563 | 13 | true | true | true | false | false | 0.248247 | 24.824687 | 0.412562 | 17.22256 | 0.010574 | 1.057402 | 0.28104 | 4.138702 | 0.35375 | 3.385417 | 0.237783 | 15.309176 | true | 2023-07-13 | 2024-06-12 |
meta-llama_Llama-2-70b-chat-hf_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-70b-chat-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-70b-chat-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-70b-chat-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | meta-llama/Llama-2-70b-chat-hf | e9149a12809580e8602995856f8098ce973d1080 | 12.733817 | llama2 | 2,124 | 68 | true | true | true | false | true | 0.495792 | 49.579228 | 0.304247 | 4.613767 | 0.009063 | 0.906344 | 0.264262 | 1.901566 | 0.368667 | 3.483333 | 0.243268 | 15.918661 | true | 2023-07-14 | 2024-06-12 |
meta-llama_Llama-2-7b-chat-hf_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-7b-chat-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-7b-chat-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-7b-chat-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | meta-llama/Llama-2-7b-chat-hf | f5db02db724555f92da89c216ac04704f23d4590 | 9.396624 | llama2 | 3,708 | 6 | true | true | true | false | true | 0.396525 | 39.652455 | 0.311188 | 4.486525 | 0.006798 | 0.679758 | 0.254195 | 0.559284 | 0.368885 | 3.477344 | 0.167719 | 7.524379 | true | 2023-07-13 | 2024-06-12 |
meta-llama_Llama-2-13b-chat-hf_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-13b-chat-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-13b-chat-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-13b-chat-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | meta-llama/Llama-2-13b-chat-hf | a2cb7a712bb6e5e736ca7f8cd98167f81a0b5bd8 | 11.003754 | llama2 | 998 | 13 | true | true | true | false | true | 0.398473 | 39.847272 | 0.334274 | 7.15538 | 0.006042 | 0.60423 | 0.231544 | 0 | 0.400729 | 8.157813 | 0.19232 | 10.257831 | true | 2023-07-13 | 2024-06-12 |
pszemraj_Llama-3-6.3b-v0.1_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pszemraj/Llama-3-6.3b-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pszemraj/Llama-3-6.3b-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pszemraj__Llama-3-6.3b-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pszemraj/Llama-3-6.3b-v0.1 | 7000b39346162f95f19aa4ca3975242db61902d7 | 10.283602 | llama3 | 6 | 6 | true | true | true | false | false | 0.10439 | 10.438969 | 0.419681 | 18.679996 | 0.015106 | 1.510574 | 0.283557 | 4.474273 | 0.390833 | 6.154167 | 0.283993 | 20.443632 | false | 2024-05-17 | 2024-06-26 |
togethercomputer_LLaMA-2-7B-32K_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/togethercomputer/LLaMA-2-7B-32K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/LLaMA-2-7B-32K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__LLaMA-2-7B-32K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | togethercomputer/LLaMA-2-7B-32K | 46c24bb5aef59722fa7aa6d75e832afd1d64b980 | 6.711835 | llama2 | 526 | 7 | true | true | true | false | false | 0.186497 | 18.649738 | 0.339952 | 8.089984 | 0.006798 | 0.679758 | 0.25 | 0 | 0.375365 | 4.320573 | 0.176779 | 8.530954 | true | 2023-07-26 | 2024-06-12 |
HiroseKoichi_Llama-Salad-4x8B-V3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/HiroseKoichi/Llama-Salad-4x8B-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HiroseKoichi/Llama-Salad-4x8B-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HiroseKoichi__Llama-Salad-4x8B-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | HiroseKoichi/Llama-Salad-4x8B-V3 | a343915429779efbd1478f01ba1f7fd9d8d226c0 | 24.746468 | llama3 | 4 | 24 | true | false | false | false | true | 0.665352 | 66.535238 | 0.524465 | 31.928849 | 0.085347 | 8.534743 | 0.302852 | 7.04698 | 0.374031 | 6.453906 | 0.351812 | 27.979093 | false | 2024-06-17 | 2024-06-26 |
togethercomputer_Llama-2-7B-32K-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/Llama-2-7B-32K-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__Llama-2-7B-32K-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | togethercomputer/Llama-2-7B-32K-Instruct | d27380af003252f5eb0d218e104938b4e673e3f3 | 8.170425 | llama2 | 160 | 7 | true | true | true | false | false | 0.213 | 21.300039 | 0.344347 | 8.56347 | 0.010574 | 1.057402 | 0.251678 | 0.223714 | 0.405594 | 9.199219 | 0.178108 | 8.678709 | true | 2023-08-08 | 2024-06-12 |
win10_llama3-13.45b-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/win10/llama3-13.45b-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">win10/llama3-13.45b-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/win10__llama3-13.45b-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | win10/llama3-13.45b-Instruct | 94cc0f415e355c6d3d47168a6ff5239ca586904a | 17.264693 | llama3 | 1 | 13 | true | false | true | false | true | 0.414435 | 41.443481 | 0.486542 | 26.67569 | 0.019637 | 1.963746 | 0.258389 | 1.118568 | 0.38476 | 6.328385 | 0.334525 | 26.058289 | false | 2024-06-09 | 2024-06-26 |
TencentARC_LLaMA-Pro-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/TencentARC/LLaMA-Pro-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TencentARC/LLaMA-Pro-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TencentARC__LLaMA-Pro-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | TencentARC/LLaMA-Pro-8B | 7115e7179060e0623d1ee9ff4476faed7e478d8c | 8.778934 | llama2 | 170 | 8 | true | true | true | false | false | 0.227714 | 22.771358 | 0.34842 | 9.29395 | 0.016616 | 1.661631 | 0.260067 | 1.342282 | 0.401812 | 8.593229 | 0.1811 | 9.011155 | true | 2024-01-05 | 2024-06-12 |
migtissera_Llama-3-70B-Synthia-v3.5_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/migtissera/Llama-3-70B-Synthia-v3.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Llama-3-70B-Synthia-v3.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Llama-3-70B-Synthia-v3.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | migtissera/Llama-3-70B-Synthia-v3.5 | 8744db0bccfc18f1847633da9d29fc89b35b4190 | 35.204299 | llama3 | 5 | 70 | true | true | true | false | true | 0.60765 | 60.764992 | 0.648864 | 49.11816 | 0.189577 | 18.957704 | 0.387584 | 18.344519 | 0.492198 | 23.391406 | 0.465841 | 40.64901 | false | 2024-05-26 | 2024-06-26 |
PJMixers_LLaMa-3-CursedStock-v2.0-8B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/PJMixers/LLaMa-3-CursedStock-v2.0-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">PJMixers/LLaMa-3-CursedStock-v2.0-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/PJMixers__LLaMa-3-CursedStock-v2.0-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | PJMixers/LLaMa-3-CursedStock-v2.0-8B | d47cc29df363f71ffaf6cd21ac4bdeefa27359db | 24.027665 | llama3 | 8 | 8 | true | false | true | false | true | 0.633079 | 63.307912 | 0.527116 | 32.563612 | 0.086103 | 8.610272 | 0.274329 | 3.243848 | 0.385625 | 8.036458 | 0.355635 | 28.403886 | false | 2024-06-26 | 2024-06-27 |
gradientai_Llama-3-8B-Instruct-Gradient-1048k_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/gradientai/Llama-3-8B-Instruct-Gradient-1048k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gradientai/Llama-3-8B-Instruct-Gradient-1048k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gradientai__Llama-3-8B-Instruct-Gradient-1048k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | gradientai/Llama-3-8B-Instruct-Gradient-1048k | 8697fb25cb77c852311e03b4464b8467471d56a4 | 18.119688 | llama3 | 657 | 8 | true | true | true | false | true | 0.445559 | 44.555889 | 0.43459 | 21.010529 | 0.043807 | 4.380665 | 0.277685 | 3.691275 | 0.42975 | 13.51875 | 0.294049 | 21.561022 | true | 2024-04-29 | 2024-06-12 |
TencentARC_LLaMA-Pro-8B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/TencentARC/LLaMA-Pro-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TencentARC/LLaMA-Pro-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TencentARC__LLaMA-Pro-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | TencentARC/LLaMA-Pro-8B-Instruct | 9850c8afce19a69d8fc4a1603a82441157514016 | 15.144991 | llama2 | 58 | 8 | true | true | true | false | true | 0.448606 | 44.860636 | 0.422421 | 19.485726 | 0.016616 | 1.661631 | 0.274329 | 3.243848 | 0.419021 | 11.110938 | 0.194564 | 10.507166 | true | 2024-01-06 | 2024-06-12 |
NousResearch_Hermes-2-Pro-Llama-3-8B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/NousResearch/Hermes-2-Pro-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">NousResearch/Hermes-2-Pro-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Hermes-2-Pro-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | NousResearch/Hermes-2-Pro-Llama-3-8B | bc265d1781299ed2045214289c927c207439a729 | 21.629392 | llama3 | 385 | 8 | true | true | true | false | true | 0.536184 | 53.618399 | 0.507113 | 30.667993 | 0.057402 | 5.740181 | 0.292785 | 5.704698 | 0.42624 | 11.246615 | 0.305186 | 22.798463 | true | 2024-04-30 | 2024-06-13 |
MaziyarPanahi_Llama-3-70B-Instruct-DPO-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/MaziyarPanahi/Llama-3-70B-Instruct-DPO-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Llama-3-70B-Instruct-DPO-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Llama-3-70B-Instruct-DPO-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | MaziyarPanahi/Llama-3-70B-Instruct-DPO-v0.2 | 95366b974baedee4d95c1e841bc3d15e94753804 | 37.976418 | llama3 | 13 | 70 | true | true | true | false | true | 0.820849 | 82.084868 | 0.643543 | 48.571706 | 0.229607 | 22.960725 | 0.341443 | 12.192394 | 0.444573 | 15.304948 | 0.520695 | 46.743868 | false | 2024-04-27 | 2024-06-26 |
maldv_badger-kappa-llama-3-8b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/maldv/badger-kappa-llama-3-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">maldv/badger-kappa-llama-3-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/maldv__badger-kappa-llama-3-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | maldv/badger-kappa-llama-3-8b | aa6863eb816ca6ad29453b8aaf846962c4328998 | 21.003043 | llama3 | 1 | 8 | true | true | true | false | true | 0.469464 | 46.946435 | 0.508493 | 30.153239 | 0.076284 | 7.628399 | 0.302852 | 7.04698 | 0.37651 | 4.297135 | 0.369515 | 29.94607 | false | 2024-06-02 | 2024-06-27 |
MaziyarPanahi_Llama-3-70B-Instruct-DPO-v0.4_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/MaziyarPanahi/Llama-3-70B-Instruct-DPO-v0.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Llama-3-70B-Instruct-DPO-v0.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Llama-3-70B-Instruct-DPO-v0.4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | MaziyarPanahi/Llama-3-70B-Instruct-DPO-v0.4 | cb03e4d810b82d86e7cb01ab146bade09a5d06d1 | 32.18411 | llama3 | 10 | 70 | true | true | true | false | true | 0.502737 | 50.273718 | 0.641819 | 48.397766 | 0.226586 | 22.65861 | 0.339765 | 11.96868 | 0.428792 | 13.098958 | 0.520362 | 46.70693 | false | 2024-04-28 | 2024-06-26 |
AI-Sweden-Models_Llama-3-8B-instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/AI-Sweden-Models/Llama-3-8B-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">AI-Sweden-Models/Llama-3-8B-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/AI-Sweden-Models__Llama-3-8B-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | AI-Sweden-Models/Llama-3-8B-instruct | 4e1c955228bdb4d69c1c4560e8d5872312a8f033 | 13.777204 | llama3 | 7 | 8 | true | true | true | false | true | 0.240128 | 24.012841 | 0.417346 | 18.388096 | 0.004532 | 0.453172 | 0.26594 | 2.12528 | 0.477094 | 19.936719 | 0.259724 | 17.747119 | false | 2024-06-01 | 2024-06-27 |
RLHFlow_LLaMA3-iterative-DPO-final_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/RLHFlow/LLaMA3-iterative-DPO-final" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">RLHFlow/LLaMA3-iterative-DPO-final</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/RLHFlow__LLaMA3-iterative-DPO-final-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | RLHFlow/LLaMA3-iterative-DPO-final | 40b73bd07a019795837f80579fe95470484ca82b | 19.626351 | llama3 | 37 | 8 | true | true | true | false | true | 0.533411 | 53.341135 | 0.505826 | 29.78776 | 0 | 0 | 0.283557 | 4.474273 | 0.367271 | 5.075521 | 0.325715 | 25.079418 | false | 2024-05-17 | 2024-06-26 |
chujiezheng_Llama-3-Instruct-8B-SimPO-ExPO_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/chujiezheng/Llama-3-Instruct-8B-SimPO-ExPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">chujiezheng/Llama-3-Instruct-8B-SimPO-ExPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/chujiezheng__Llama-3-Instruct-8B-SimPO-ExPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | chujiezheng/Llama-3-Instruct-8B-SimPO-ExPO | 3fcaa9fe99691659eb197487e9a343f601bf63f2 | 21.972344 | llama3 | 14 | 8 | true | true | true | false | true | 0.643371 | 64.33707 | 0.476452 | 25.868282 | 0.005287 | 0.528701 | 0.286913 | 4.9217 | 0.39201 | 9.501302 | 0.340093 | 26.677009 | false | 2024-05-26 | 2024-06-26 |
NousResearch_Yarn-Llama-2-7b-64k_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/NousResearch/Yarn-Llama-2-7b-64k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">NousResearch/Yarn-Llama-2-7b-64k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Yarn-Llama-2-7b-64k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | NousResearch/Yarn-Llama-2-7b-64k | 08491431ac3b50add7443f5d4c02850801d877be | 7.122178 | 23 | 7 | true | true | true | false | false | 0.169986 | 16.998564 | 0.332628 | 7.044055 | 0.009819 | 0.981873 | 0.264262 | 1.901566 | 0.393875 | 6.934375 | 0.179854 | 8.872636 | true | 2023-08-30 | 2024-06-13 |
NousResearch_Yarn-Llama-2-7b-128k_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/NousResearch/Yarn-Llama-2-7b-128k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">NousResearch/Yarn-Llama-2-7b-128k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Yarn-Llama-2-7b-128k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | NousResearch/Yarn-Llama-2-7b-128k | e1ceedbbf2ed28b88086794441a6c05606d15437 | 6.68892 | 38 | 7 | true | true | true | false | false | 0.148478 | 14.847826 | 0.324803 | 6.144692 | 0.007553 | 0.755287 | 0.260067 | 1.342282 | 0.396698 | 8.253906 | 0.179106 | 8.789524 | true | 2023-08-31 | 2024-06-13 |
Magpie-Align_Llama-3-8B-Magpie-Align-v0.1_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Magpie-Align/Llama-3-8B-Magpie-Align-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Magpie-Align/Llama-3-8B-Magpie-Align-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Magpie-Align__Llama-3-8B-Magpie-Align-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Magpie-Align/Llama-3-8B-Magpie-Align-v0.1 | a83ddac146fb2da1dd1bfa4069e336074d1439a8 | 16.257419 | llama3 | 9 | 8 | true | true | true | false | true | 0.402719 | 40.271923 | 0.478941 | 26.289712 | 0.032477 | 3.247734 | 0.276846 | 3.579418 | 0.308698 | 1.920573 | 0.300116 | 22.235151 | false | 2024-06-29 | 2024-07-03 |
Magpie-Align_Llama-3-8B-Magpie-Align-v0.1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Magpie-Align/Llama-3-8B-Magpie-Align-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Magpie-Align/Llama-3-8B-Magpie-Align-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Magpie-Align__Llama-3-8B-Magpie-Align-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Magpie-Align/Llama-3-8B-Magpie-Align-v0.1 | a83ddac146fb2da1dd1bfa4069e336074d1439a8 | 16.473094 | llama3 | 9 | 8 | true | true | true | false | true | 0.411812 | 41.181177 | 0.481144 | 26.691761 | 0.033988 | 3.398792 | 0.275168 | 3.355705 | 0.304698 | 1.920573 | 0.300615 | 22.290559 | false | 2024-06-29 | 2024-07-03 |
NousResearch_Yarn-Llama-2-13b-128k_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/NousResearch/Yarn-Llama-2-13b-128k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">NousResearch/Yarn-Llama-2-13b-128k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Yarn-Llama-2-13b-128k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | NousResearch/Yarn-Llama-2-13b-128k | 4e3e87a067f64f8814c83dd5e3bad92dcf8a2391 | 8.393442 | 113 | 13 | true | true | true | false | false | 0.165464 | 16.54643 | 0.382682 | 13.505319 | 0.011329 | 1.132931 | 0.258389 | 1.118568 | 0.34575 | 3.385417 | 0.232048 | 14.671986 | true | 2023-08-30 | 2024-06-13 |
Salesforce_LLaMA-3-8B-SFR-Iterative-DPO-R_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Salesforce/LLaMA-3-8B-SFR-Iterative-DPO-R" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Salesforce/LLaMA-3-8B-SFR-Iterative-DPO-R</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Salesforce__LLaMA-3-8B-SFR-Iterative-DPO-R-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Salesforce/LLaMA-3-8B-SFR-Iterative-DPO-R | ad7d1aed82eb6d8ca4b3aad627ff76f72ab34f70 | 17.029765 | llama3 | 73 | 8 | true | true | true | false | true | 0.381562 | 38.156203 | 0.501195 | 29.150289 | 0.001511 | 0.151057 | 0.287752 | 5.033557 | 0.363333 | 5.55 | 0.317237 | 24.137485 | true | 2024-05-09 | 2024-07-02 |
winglian_llama-3-8b-256k-PoSE_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/winglian/llama-3-8b-256k-PoSE" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">winglian/llama-3-8b-256k-PoSE</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/winglian__llama-3-8b-256k-PoSE-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | winglian/llama-3-8b-256k-PoSE | 93e7b0b6433c96583ffcef3bc47203e6fdcbbe8b | 6.545127 | 41 | 8 | false | true | true | false | true | 0.290911 | 29.091145 | 0.315658 | 5.502849 | 0.01435 | 1.435045 | 0.25755 | 1.006711 | 0.331552 | 0.94401 | 0.111619 | 1.291002 | false | 2024-04-26 | 2024-06-26 |
cognitivecomputations_dolphin-2.9-llama3-8b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9-llama3-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9-llama3-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9-llama3-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | cognitivecomputations/dolphin-2.9-llama3-8b | 5aeb036f9215c558b483a654a8c6e1cc22e841bf | 18.302168 | other | 377 | 8 | true | true | true | false | true | 0.385034 | 38.503393 | 0.494992 | 27.858929 | 0.050604 | 5.060423 | 0.286913 | 4.9217 | 0.437531 | 13.791406 | 0.277094 | 19.677157 | true | 2024-04-20 | 2024-06-12 |
Weyaxi_Einstein-v6.1-Llama3-8B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Weyaxi/Einstein-v6.1-Llama3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Einstein-v6.1-Llama3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Einstein-v6.1-Llama3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Weyaxi/Einstein-v6.1-Llama3-8B | 5cab6d54666b6024d0f745d61abf1842edb934e0 | 19.993258 | other | 60 | 8 | true | true | true | false | true | 0.456825 | 45.682456 | 0.50083 | 29.383773 | 0.057402 | 5.740181 | 0.281879 | 4.250559 | 0.421281 | 11.226823 | 0.313082 | 23.675754 | false | 2024-04-19 | 2024-06-26 |
MaziyarPanahi_Llama-3-8B-Instruct-v0.10_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/MaziyarPanahi/Llama-3-8B-Instruct-v0.10" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Llama-3-8B-Instruct-v0.10</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Llama-3-8B-Instruct-v0.10-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | MaziyarPanahi/Llama-3-8B-Instruct-v0.10 | 4411eb9f6f5e4c462a6bdbc64c26dcc123100b66 | 26.658934 | other | 3 | 8 | true | true | true | false | true | 0.766743 | 76.674335 | 0.492431 | 27.924674 | 0.049094 | 4.909366 | 0.308725 | 7.829978 | 0.421437 | 10.813021 | 0.38622 | 31.802231 | false | 2024-06-04 | 2024-06-26 |
nbeerbower_llama-3-gutenberg-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/nbeerbower/llama-3-gutenberg-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">nbeerbower/llama-3-gutenberg-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/nbeerbower__llama-3-gutenberg-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | nbeerbower/llama-3-gutenberg-8B | 4ed3aac5e30c078bee79ae193c2d301d38860b20 | 21.170348 | other | 7 | 8 | true | true | true | false | false | 0.437191 | 43.71911 | 0.49936 | 27.958133 | 0.070242 | 7.024169 | 0.301174 | 6.823266 | 0.407302 | 10.046094 | 0.383062 | 31.451315 | false | 2024-05-05 | 2024-07-10 |
chargoddard_prometheus-2-llama-3-8b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/chargoddard/prometheus-2-llama-3-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">chargoddard/prometheus-2-llama-3-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/chargoddard__prometheus-2-llama-3-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | chargoddard/prometheus-2-llama-3-8b | 90a728ac98e5b4169f88ae4945e357cf45477568 | 19.155216 | apache-2.0 | 2 | 8 | true | false | true | false | true | 0.52889 | 52.889001 | 0.493114 | 27.803839 | 0.072508 | 7.250755 | 0.272651 | 3.020134 | 0.339583 | 0.78125 | 0.308677 | 23.186318 | false | 2024-05-26 | 2024-06-26 |
vicgalle_Roleplay-Llama-3-8B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/vicgalle/Roleplay-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Roleplay-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Roleplay-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | vicgalle/Roleplay-Llama-3-8B | 57297eb57dcc2c116f061d9dda341094203da01b | 23.944654 | apache-2.0 | 32 | 8 | true | true | true | false | true | 0.732022 | 73.202215 | 0.501232 | 28.554604 | 0.086858 | 8.685801 | 0.260906 | 1.454139 | 0.352885 | 1.677344 | 0.370844 | 30.093824 | false | 2024-04-19 | 2024-06-26 |
MaziyarPanahi_Llama-3-8B-Instruct-v0.8_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/MaziyarPanahi/Llama-3-8B-Instruct-v0.8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Llama-3-8B-Instruct-v0.8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Llama-3-8B-Instruct-v0.8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | MaziyarPanahi/Llama-3-8B-Instruct-v0.8 | bd80951b7ae97f633ed48b80334af8df96b49f36 | 26.747118 | other | 3 | 8 | true | true | true | false | true | 0.751231 | 75.123118 | 0.496278 | 28.270419 | 0.070997 | 7.099698 | 0.305369 | 7.38255 | 0.420198 | 10.92474 | 0.38514 | 31.682181 | false | 2024-05-01 | 2024-07-11 |
skymizer_Llama2-7b-sft-chat-custom-template-dpo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/skymizer/Llama2-7b-sft-chat-custom-template-dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">skymizer/Llama2-7b-sft-chat-custom-template-dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/skymizer__Llama2-7b-sft-chat-custom-template-dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | skymizer/Llama2-7b-sft-chat-custom-template-dpo | 22302ebd8c551a5f302fcb8366cc61fdeedf0e00 | 10.065019 | llama2 | 0 | 6 | true | true | true | false | false | 0.235282 | 23.528238 | 0.368847 | 11.238865 | 0.009819 | 0.981873 | 0.239094 | 0 | 0.442865 | 14.12474 | 0.194648 | 10.516401 | false | 2024-06-11 | 2024-07-01 |
refuelai_Llama-3-Refueled_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/refuelai/Llama-3-Refueled" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">refuelai/Llama-3-Refueled</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/refuelai__Llama-3-Refueled-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | refuelai/Llama-3-Refueled | ff6d1c3ba37b31d4af421951c2300f2256fb3691 | 22.728276 | cc-by-nc-4.0 | 185 | 8 | true | true | true | false | true | 0.461995 | 46.199528 | 0.587077 | 41.721971 | 0.039275 | 3.927492 | 0.299497 | 6.599553 | 0.445406 | 14.642448 | 0.309508 | 23.278664 | true | 2024-05-03 | 2024-06-12 |
VAGOsolutions_Llama-3-SauerkrautLM-70b-Instruct_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/VAGOsolutions/Llama-3-SauerkrautLM-70b-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/Llama-3-SauerkrautLM-70b-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__Llama-3-SauerkrautLM-70b-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | VAGOsolutions/Llama-3-SauerkrautLM-70b-Instruct | 707cfd1a93875247c0223e0c7e3d86d58c432318 | 37.816766 | other | 15 | 70 | true | true | true | false | true | 0.804462 | 80.446216 | 0.666325 | 52.02958 | 0.216767 | 21.676737 | 0.32802 | 10.402685 | 0.433938 | 13.542188 | 0.539229 | 48.803191 | false | 2024-04-24 | 2024-06-26 |
collaiborateorg_Collaiborator-MEDLLM-Llama-3-8B-v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/collaiborateorg/Collaiborator-MEDLLM-Llama-3-8B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">collaiborateorg/Collaiborator-MEDLLM-Llama-3-8B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/collaiborateorg__Collaiborator-MEDLLM-Llama-3-8B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | collaiborateorg/Collaiborator-MEDLLM-Llama-3-8B-v2 | 2560556d655d0ecaefec10f579c92292d65fb28b | 17.888695 | 0 | 8 | false | true | true | false | false | 0.380887 | 38.088716 | 0.464803 | 23.648503 | 0.053625 | 5.362538 | 0.333054 | 11.073826 | 0.343427 | 1.595052 | 0.348072 | 27.563534 | false | 2024-06-27 | |
maldv_badger-mu-llama-3-8b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/maldv/badger-mu-llama-3-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">maldv/badger-mu-llama-3-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/maldv__badger-mu-llama-3-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | maldv/badger-mu-llama-3-8b | 952a269bb1e6c18ee772c6d088e74d305df4425d | 19.768293 | cc-by-nc-4.0 | 0 | 8 | true | true | true | false | true | 0.491946 | 49.194581 | 0.514288 | 30.513965 | 0.022659 | 2.265861 | 0.259228 | 1.230425 | 0.355458 | 5.698958 | 0.367354 | 29.705969 | false | 2024-06-27 | 2024-06-27 |
maldv_badger-lambda-llama-3-8b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/maldv/badger-lambda-llama-3-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">maldv/badger-lambda-llama-3-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/maldv__badger-lambda-llama-3-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | maldv/badger-lambda-llama-3-8b | 8ef157d0d3c12212ca5e70d354869aed90e03f22 | 20.755028 | cc-by-nc-4.0 | 8 | 8 | true | true | true | false | true | 0.486076 | 48.607583 | 0.496349 | 28.10305 | 0.083082 | 8.308157 | 0.281879 | 4.250559 | 0.375365 | 4.520573 | 0.376662 | 30.740248 | false | 2024-06-10 | 2024-06-26 |
NousResearch_Hermes-2-Theta-Llama-3-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/NousResearch/Hermes-2-Theta-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">NousResearch/Hermes-2-Theta-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Hermes-2-Theta-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | NousResearch/Hermes-2-Theta-Llama-3-8B | 885173e97ab8572b444f7db1290d5d0386e26816 | 24.624731 | apache-2.0 | 183 | 8 | true | true | true | false | true | 0.651788 | 65.178837 | 0.520667 | 32.046074 | 0.086858 | 8.685801 | 0.303691 | 7.158837 | 0.394896 | 8.361979 | 0.336852 | 26.316859 | true | 2024-05-05 | 2024-07-11 |
NousResearch_Nous-Hermes-llama-2-7b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/NousResearch/Nous-Hermes-llama-2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">NousResearch/Nous-Hermes-llama-2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/NousResearch__Nous-Hermes-llama-2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | NousResearch/Nous-Hermes-llama-2-7b | b7c3ec54b754175e006ef75696a2ba3802697078 | 9.278952 | mit | 67 | 6 | true | true | true | false | false | 0.172908 | 17.290788 | 0.382394 | 13.78942 | 0.006798 | 0.679758 | 0.263423 | 1.789709 | 0.425719 | 11.68151 | 0.193983 | 10.442524 | true | 2023-07-25 | 2024-06-12 |
IDEA-CCNL_Ziya-LLaMA-13B-v1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">IDEA-CCNL/Ziya-LLaMA-13B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/IDEA-CCNL__Ziya-LLaMA-13B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | IDEA-CCNL/Ziya-LLaMA-13B-v1 | 64d931f346e1a49ea3bbca07a83137075bab1c66 | 3.906425 | gpl-3.0 | 273 | 13 | true | true | true | false | false | 0.169686 | 16.968643 | 0.287703 | 1.463617 | 0 | 0 | 0.249161 | 0 | 0.375052 | 3.88151 | 0.110123 | 1.124778 | true | 2023-05-16 | 2024-06-12 |
xinchen9_llama3-b8-ft-dis_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xinchen9/llama3-b8-ft-dis" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xinchen9/llama3-b8-ft-dis</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__llama3-b8-ft-dis-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xinchen9/llama3-b8-ft-dis | e4da730f28f79543262de37908943c35f8df81fe | 13.847611 | apache-2.0 | 0 | 8 | true | true | true | false | false | 0.154599 | 15.459869 | 0.462579 | 24.727457 | 0.031722 | 3.172205 | 0.312919 | 8.389262 | 0.365375 | 6.405208 | 0.324385 | 24.931664 | false | 2024-06-28 | 2024-07-11 |
maldv_badger-writer-llama-3-8b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/maldv/badger-writer-llama-3-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">maldv/badger-writer-llama-3-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/maldv__badger-writer-llama-3-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | maldv/badger-writer-llama-3-8b | 1d8134d01af87e994571ae16ccd7b31cce42418f | 20.933242 | cc-by-nc-4.0 | 3 | 8 | true | false | true | false | true | 0.530314 | 53.031401 | 0.486389 | 26.878361 | 0.06571 | 6.570997 | 0.28943 | 5.257271 | 0.358094 | 3.195052 | 0.375997 | 30.666371 | false | 2024-06-17 | 2024-06-26 |
Weyaxi_Einstein-v6.1-developed-by-Weyaxi-Llama3-8B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Weyaxi/Einstein-v6.1-developed-by-Weyaxi-Llama3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Einstein-v6.1-developed-by-Weyaxi-Llama3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Einstein-v6.1-developed-by-Weyaxi-Llama3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Weyaxi/Einstein-v6.1-developed-by-Weyaxi-Llama3-8B | b7507e94146c0832c26609e9ab8115934d3e25b3 | 19.054392 | other | 1 | 8 | true | true | true | false | true | 0.392702 | 39.270247 | 0.504384 | 29.694447 | 0.055891 | 5.589124 | 0.27349 | 3.131991 | 0.43325 | 13.389583 | 0.309259 | 23.25096 | false | 2024-06-23 | 2024-06-26 |
johnsutor_Llama-3-8B-Instruct_ties-density-0.3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_ties-density-0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_ties-density-0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_ties-density-0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | johnsutor/Llama-3-8B-Instruct_ties-density-0.3 | 8d051f3eec3fc93a4521073c2d290c4ff9144fc1 | 18.691324 | apache-2.0 | 0 | 8 | true | false | true | false | false | 0.362628 | 36.262783 | 0.490611 | 27.724507 | 0.057402 | 5.740181 | 0.296141 | 6.152125 | 0.40249 | 10.477865 | 0.332114 | 25.790485 | false | 2024-06-07 | 2024-06-26 |
Replete-AI_Replete-Coder-Llama3-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Replete-AI/Replete-Coder-Llama3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Replete-AI/Replete-Coder-Llama3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Replete-AI__Replete-Coder-Llama3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Replete-AI/Replete-Coder-Llama3-8B | 2aca75c53e7eb2f523889ab1a279e349b8f1b0e8 | 11.655859 | other | 33 | 8 | true | true | true | false | true | 0.472936 | 47.293625 | 0.327128 | 7.055476 | 0.029456 | 2.945619 | 0.260906 | 1.454139 | 0.395302 | 7.51276 | 0.133062 | 3.673537 | false | 2024-06-24 | 2024-06-26 |
Enno-Ai_EnnoAi-Pro-Llama-3-8B-v0.3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3-8B-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3 | cf29b8b484a909132e3a1f85ce891d28347c0d13 | 17.498882 | creativeml-openrail-m | 0 | 8 | true | true | true | false | true | 0.508257 | 50.825698 | 0.410058 | 16.668386 | 0.010574 | 1.057402 | 0.265101 | 2.013423 | 0.423573 | 12.313281 | 0.299036 | 22.1151 | false | 2024-06-26 | 2024-06-26 |
Enno-Ai_EnnoAi-Pro-Llama-3-8B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Enno-Ai/EnnoAi-Pro-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Enno-Ai/EnnoAi-Pro-Llama-3-8B | 6a5d745bdd304753244fe601e2a958d37d13cd71 | 12.174667 | creativeml-openrail-m | 0 | 8 | true | true | true | false | true | 0.319538 | 31.953772 | 0.415158 | 17.507545 | 0.001511 | 0.151057 | 0.261745 | 1.565996 | 0.407052 | 9.08151 | 0.215093 | 12.788121 | false | 2024-07-01 | 2024-07-08 |
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.01_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.01 | c88c6b65f751156e7bc04c738947387eb55747e9 | 18.483613 | apache-2.0 | 0 | 8 | true | false | true | false | false | 0.278996 | 27.89964 | 0.486115 | 27.224869 | 0 | 0 | 0.294463 | 5.928412 | 0.51501 | 24.242969 | 0.330452 | 25.605792 | false | 2024-06-08 | 2024-06-26 |
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.1 | 139a9bccd0ffb284e670a181a5986a01b1420c6c | 21.734982 | apache-2.0 | 0 | 8 | true | false | true | false | false | 0.430222 | 43.022181 | 0.51571 | 31.163508 | 0.060423 | 6.042296 | 0.307886 | 7.718121 | 0.433156 | 12.877865 | 0.366273 | 29.585919 | false | 2024-06-08 | 2024-06-26 |
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.3-gamma-0.01_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.3-gamma-0.01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.3-gamma-0.01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.3-gamma-0.01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.3-gamma-0.01 | 4a432be239528ffc654955338982f1f32eb12901 | 20.103542 | apache-2.0 | 0 | 8 | true | false | true | false | false | 0.337748 | 33.774829 | 0.491714 | 28.135682 | 0 | 0 | 0.312081 | 8.277405 | 0.501771 | 22.288021 | 0.353308 | 28.145316 | false | 2024-06-07 | 2024-06-26 |
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.1-gamma-0.01_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.1-gamma-0.01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.1-gamma-0.01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.1-gamma-0.01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.1-gamma-0.01 | f4ebbf27d586e94c63f0a7293f565cbd947b824f | 22.303658 | apache-2.0 | 0 | 8 | true | false | true | false | false | 0.427124 | 42.712447 | 0.503552 | 29.550014 | 0.037009 | 3.700906 | 0.322148 | 9.619687 | 0.46376 | 17.803385 | 0.37392 | 30.435505 | false | 2024-06-07 | 2024-06-26 |
BEE-spoke-data_smol_llama-220M-GQA_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-220M-GQA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-220M-GQA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-220M-GQA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BEE-spoke-data/smol_llama-220M-GQA | 8845b1d3c0bc73522ef2700aab467183cbdca9f7 | 6.401567 | apache-2.0 | 11 | 0 | true | true | true | false | false | 0.238605 | 23.860468 | 0.303167 | 3.037843 | 0 | 0 | 0.255872 | 0.782998 | 0.405875 | 9.067708 | 0.114943 | 1.660387 | false | 2023-12-22 | 2024-06-26 |
johnsutor_Llama-3-8B-Instruct_dare_ties-density-0.7_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_dare_ties-density-0.7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.7 | b14b5cd07feb749e42b0567b1e387b390bed033e | 16.721678 | apache-2.0 | 0 | 8 | true | false | true | false | false | 0.203384 | 20.338369 | 0.472286 | 25.253546 | 0 | 0 | 0.303691 | 7.158837 | 0.51101 | 23.709635 | 0.314827 | 23.869681 | false | 2024-06-07 | 2024-06-26 |
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.9-gamma-0.1 | 818f7e586444b551200862fb234c39bd48d69ae8 | 21.818373 | apache-2.0 | 0 | 8 | true | false | true | false | false | 0.422278 | 42.227844 | 0.515376 | 31.124766 | 0.067221 | 6.722054 | 0.307886 | 7.718121 | 0.438427 | 13.670052 | 0.365027 | 29.4474 | false | 2024-06-08 | 2024-06-26 |
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.5-gamma-0.01_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.5-gamma-0.01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.5-gamma-0.01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.5-gamma-0.01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.5-gamma-0.01 | 6ab1392c825907b08eff8fbed4c97a3e6e0d6dd9 | 19.384591 | apache-2.0 | 0 | 8 | true | false | true | false | false | 0.320362 | 32.036219 | 0.488358 | 27.665795 | 0 | 0 | 0.302013 | 6.935123 | 0.509771 | 23.621354 | 0.334441 | 26.049054 | false | 2024-06-07 | 2024-06-26 |
UCLA-AGI_Llama-3-Instruct-8B-SPPO-Iter3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Llama-3-Instruct-8B-SPPO-Iter3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3 | f73dafc2923acd56f115f21f76e9d14f8d19a63e | 23.305761 | apache-2.0 | 71 | 8 | true | true | true | false | true | 0.682813 | 68.281271 | 0.507958 | 29.739684 | 0.073263 | 7.326284 | 0.265101 | 2.013423 | 0.366062 | 3.091146 | 0.364445 | 29.382757 | false | 2024-06-25 | 2024-07-02 |
UCLA-AGI_Llama-3-Instruct-8B-SPPO-Iter3_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Llama-3-Instruct-8B-SPPO-Iter3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3 | f73dafc2923acd56f115f21f76e9d14f8d19a63e | 23.05947 | apache-2.0 | 71 | 8 | true | true | true | false | true | 0.670298 | 67.029814 | 0.507641 | 29.716701 | 0.071752 | 7.175227 | 0.265101 | 2.013423 | 0.364729 | 2.891146 | 0.365775 | 29.530511 | false | 2024-06-25 | 2024-06-28 |
johnsutor_Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.01_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | johnsutor/Llama-3-8B-Instruct_breadcrumbs-density-0.7-gamma-0.01 | 61f4b44fb917cdb46f0ade9f8fc2a382e0cf67af | 18.442433 | apache-2.0 | 0 | 8 | true | false | true | false | false | 0.281444 | 28.144435 | 0.485433 | 27.164431 | 0 | 0 | 0.290268 | 5.369128 | 0.516313 | 24.472396 | 0.329538 | 25.504211 | false | 2024-06-08 | 2024-06-26 |
Enno-Ai_EnnoAi-Pro-French-Llama-3-8B-v0.4_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-French-Llama-3-8B-v0.4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4 | 328722ae96e3a112ec900dbe77d410788a526c5c | 15.180945 | creativeml-openrail-m | 0 | 8 | true | true | true | false | true | 0.418881 | 41.888079 | 0.407495 | 16.875928 | 0.006042 | 0.60423 | 0.270973 | 2.796421 | 0.417 | 10.758333 | 0.263464 | 18.162677 | false | 2024-06-27 | 2024-06-30 |
johnsutor_Llama-3-8B-Instruct_breadcrumbs_ties-density-0.1-gamma-0.01_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.1-gamma-0.01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.1-gamma-0.01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs_ties-density-0.1-gamma-0.01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.1-gamma-0.01 | 861347cd643d396877d8e560367cf0717c671228 | 22.086113 | apache-2.0 | 0 | 8 | true | false | true | false | false | 0.435892 | 43.589232 | 0.504094 | 29.530013 | 0.043051 | 4.305136 | 0.310403 | 8.053691 | 0.453156 | 16.344531 | 0.376247 | 30.694075 | false | 2024-06-07 | 2024-06-26 |
BEE-spoke-data_smol_llama-220M-GQA-fineweb_edu_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/BEE-spoke-data__smol_llama-220M-GQA-fineweb_edu-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | BEE-spoke-data/smol_llama-220M-GQA-fineweb_edu | dec16b41d5e94070dbc1f8449a554373fd4cc1d1 | 6.516558 | apache-2.0 | 1 | 0 | true | true | true | false | false | 0.198812 | 19.881248 | 0.292905 | 2.314902 | 0 | 0 | 0.259228 | 1.230425 | 0.43676 | 14.261719 | 0.112699 | 1.411052 | false | 2024-06-08 | 2024-06-26 |
johnsutor_Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1 | 378a7cad3e34a1a8b11e77edd95b02ff0d228da2 | 21.223028 | apache-2.0 | 0 | 8 | true | false | true | false | false | 0.416233 | 41.623337 | 0.513861 | 30.841602 | 0.069486 | 6.94864 | 0.29698 | 6.263982 | 0.431729 | 12.499479 | 0.36245 | 29.161126 | false | 2024-06-07 | 2024-06-26 |
invisietch_EtherealRainbow-v0.3-8B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/invisietch/EtherealRainbow-v0.3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">invisietch/EtherealRainbow-v0.3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/invisietch__EtherealRainbow-v0.3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | invisietch/EtherealRainbow-v0.3-8B | c986c4ca5a5b8474820a59d3e911a431cf26938d | 19.614998 | llama3 | 5 | 8 | true | false | true | false | false | 0.368223 | 36.822298 | 0.509676 | 30.080258 | 0.06571 | 6.570997 | 0.30453 | 7.270694 | 0.390396 | 7.766146 | 0.362616 | 29.179595 | false | 2024-06-19 | 2024-07-01 |
pankajmathur_model_007_13b_v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/model_007_13b_v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/model_007_13b_v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__model_007_13b_v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/model_007_13b_v2 | 2c6ddf25cdb134f22e2543121b5a36b41342a9e2 | 15.856346 | llama2 | 4 | 13 | true | true | true | false | false | 0.305649 | 30.564901 | 0.470229 | 25.45442 | 0.012085 | 1.208459 | 0.283557 | 4.474273 | 0.461094 | 17.203385 | 0.246094 | 16.232639 | false | 2023-08-12 | 2024-06-26 |
openchat_openchat_v3.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/openchat/openchat_v3.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openchat/openchat_v3.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openchat__openchat_v3.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | openchat/openchat_v3.2 | acc7ce92558681e749678648189812f15c1465fe | 13.807969 | llama2 | 42 | 13 | true | true | true | false | false | 0.298056 | 29.805583 | 0.433056 | 20.323003 | 0.011329 | 1.132931 | 0.270134 | 2.684564 | 0.433625 | 13.103125 | 0.242188 | 15.798611 | true | 2023-07-30 | 2024-06-12 |
openchat_openchat-3.6-8b-20240522_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/openchat/openchat-3.6-8b-20240522" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openchat/openchat-3.6-8b-20240522</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openchat__openchat-3.6-8b-20240522-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | openchat/openchat-3.6-8b-20240522 | 2264eb98558978f708e88ae52afb78e43b832801 | 22.692127 | llama3 | 131 | 8 | true | true | true | false | true | 0.535859 | 53.58593 | 0.533841 | 33.232937 | 0.073263 | 7.326284 | 0.317953 | 9.060403 | 0.399854 | 8.181771 | 0.322889 | 24.76544 | true | 2024-05-07 | 2024-06-26 |
lmsys_vicuna-7b-v1.5_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/lmsys/vicuna-7b-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/vicuna-7b-v1.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lmsys__vicuna-7b-v1.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | lmsys/vicuna-7b-v1.5 | 3321f76e3f527bd14065daf69dad9344000a201d | 10.784447 | llama2 | 268 | 7 | true | true | true | false | false | 0.235157 | 23.515716 | 0.394704 | 15.152509 | 0.007553 | 0.755287 | 0.258389 | 1.118568 | 0.423115 | 11.422656 | 0.214678 | 12.741947 | true | 2023-07-29 | 2024-06-12 |
WizardLMTeam_WizardLM-13B-V1.2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/WizardLMTeam/WizardLM-13B-V1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">WizardLMTeam/WizardLM-13B-V1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/WizardLMTeam__WizardLM-13B-V1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | WizardLMTeam/WizardLM-13B-V1.2 | cf5f40382559f19e13874e45b39575171ca46ef8 | 15.152357 | llama2 | 218 | 13 | true | true | true | false | false | 0.339247 | 33.924653 | 0.4462 | 22.888655 | 0.017372 | 1.73716 | 0.260906 | 1.454139 | 0.437844 | 14.030469 | 0.251912 | 16.879063 | true | 2023-07-25 | 2024-06-12 |
WizardLMTeam_WizardLM-70B-V1.0_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/WizardLMTeam/WizardLM-70B-V1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">WizardLMTeam/WizardLM-70B-V1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/WizardLMTeam__WizardLM-70B-V1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | WizardLMTeam/WizardLM-70B-V1.0 | 54aaecaff7d0790eb9f0ecea1cc267a94cc66949 | 22.321913 | llama2 | 231 | 70 | true | true | true | false | false | 0.495143 | 49.514289 | 0.559037 | 37.543355 | 0.034743 | 3.47432 | 0.26594 | 2.12528 | 0.439115 | 14.089323 | 0.344664 | 27.184914 | true | 2023-08-09 | 2024-06-12 |
openchat_openchat_v3.2_super_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/openchat/openchat_v3.2_super" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openchat/openchat_v3.2_super</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/openchat__openchat_v3.2_super-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | openchat/openchat_v3.2_super | 9479cc37d43234a57a33628637d1aca0293d745a | 12.835458 | llama2 | 36 | 13 | true | true | true | false | false | 0.286191 | 28.619064 | 0.422121 | 19.15354 | 0.015861 | 1.586103 | 0.264262 | 1.901566 | 0.416135 | 9.916927 | 0.24252 | 15.83555 | true | 2023-09-04 | 2024-06-12 |
pankajmathur_orca_mini_v5_8b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v5_8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v5_8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v5_8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/orca_mini_v5_8b | f57c84d4cc0b3b74549458c0d38e868bd7fffad1 | 20.158422 | llama3 | 2 | 8 | true | true | true | false | false | 0.480605 | 48.06048 | 0.506424 | 29.345795 | 0.07855 | 7.854985 | 0.286913 | 4.9217 | 0.40001 | 7.701302 | 0.307596 | 23.066268 | false | 2024-05-26 | 2024-06-26 |
pankajmathur_Al_Dente_v1_8b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/Al_Dente_v1_8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/Al_Dente_v1_8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__Al_Dente_v1_8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/Al_Dente_v1_8b | 149d70e04085ecd90510a60f916efc55da1294e7 | 17.123825 | llama3 | 1 | 8 | true | true | true | false | false | 0.369372 | 36.937215 | 0.483474 | 27.247898 | 0.030211 | 3.021148 | 0.299497 | 6.599553 | 0.398708 | 8.271875 | 0.285987 | 20.665263 | false | 2024-06-02 | 2024-06-26 |
pankajmathur_orca_mini_v6_8b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v6_8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v6_8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v6_8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/orca_mini_v6_8b | e95dc8e4c6b6ca5957b657cc2d905683142eaf3e | 1.413398 | llama3 | 1 | 8 | true | true | true | false | true | 0.011116 | 1.111606 | 0.30287 | 3.21981 | 0 | 0 | 0.238255 | 0 | 0.355458 | 2.765625 | 0.11245 | 1.383348 | false | 2024-06-02 | 2024-06-26 |
pankajmathur_orca_mini_v6_8b_dpo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v6_8b_dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v6_8b_dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v6_8b_dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/orca_mini_v6_8b_dpo | ebb11b63839d38e8c03c7ecac012e047fcb2346e | 20.291787 | llama3 | 1 | 8 | true | true | true | false | false | 0.388256 | 38.825649 | 0.520281 | 32.478826 | 0.055136 | 5.513595 | 0.301174 | 6.823266 | 0.409031 | 9.26224 | 0.359624 | 28.847148 | false | 2024-06-21 | 2024-06-26 |
pankajmathur_orca_mini_v5_8b_orpo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v5_8b_orpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v5_8b_orpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v5_8b_orpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/orca_mini_v5_8b_orpo | 4cdc018043ef439f15bd8a09c4f09c6bc528dfc7 | 12.880437 | llama3 | 1 | 8 | true | true | true | false | false | 0.082432 | 8.243239 | 0.496374 | 27.877628 | 0.059668 | 5.966767 | 0.284396 | 4.58613 | 0.413125 | 8.973958 | 0.294714 | 21.6349 | false | 2024-05-31 | 2024-06-26 |
pankajmathur_orca_mini_v5_8b_dpo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/pankajmathur/orca_mini_v5_8b_dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pankajmathur/orca_mini_v5_8b_dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pankajmathur__orca_mini_v5_8b_dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | pankajmathur/orca_mini_v5_8b_dpo | fdc0d0aaa85a58f1abaf2c24ce0ddca10c08f0f1 | 19.956563 | llama3 | 2 | 8 | true | true | true | false | false | 0.489647 | 48.964747 | 0.50746 | 29.605373 | 0.074773 | 7.477341 | 0.274329 | 3.243848 | 0.389375 | 6.938542 | 0.311586 | 23.50953 | false | 2024-05-30 | 2024-06-26 |
Casual-Autopsy_L3-Umbral-Mind-RP-v2.0-8B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Casual-Autopsy__L3-Umbral-Mind-RP-v2.0-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Casual-Autopsy/L3-Umbral-Mind-RP-v2.0-8B | b46c066ea8387264858dc3461f382e7b42fd9c48 | 25.76087 | llama3 | 9 | 8 | true | false | true | false | true | 0.712263 | 71.226346 | 0.526241 | 32.486278 | 0.101208 | 10.120846 | 0.286913 | 4.9217 | 0.368667 | 5.55 | 0.37234 | 30.260047 | false | 2024-06-26 | 2024-07-02 |