Column schema (dtype and value range per column):

| Column | Dtype | Values |
|---|---|---|
| eval_name | string | lengths 12–111 |
| Precision | string | 3 classes |
| Type | string | 6 classes |
| T | string | 6 classes |
| Weight type | string | 2 classes |
| Architecture | string | 49 classes |
| Model | string | lengths 355–650 |
| fullname | string | lengths 4–102 |
| Model sha | string | lengths 0–40 |
| Average ⬆️ | float64 | 1.41–51.2 |
| Hub License | string | 25 classes |
| Hub ❤️ | int64 | 0–5.84k |
| #Params (B) | int64 | -1–140 |
| Available on the hub | bool | 2 classes |
| Not_Merged | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 1 class |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04–107 |
| IFEval Raw | float64 | 0–0.87 |
| IFEval | float64 | 0–86.7 |
| BBH Raw | float64 | 0.28–0.75 |
| BBH | float64 | 0.81–63.5 |
| MATH Lvl 5 Raw | float64 | 0–0.51 |
| MATH Lvl 5 | float64 | 0–50.7 |
| GPQA Raw | float64 | 0.22–0.44 |
| GPQA | float64 | 0–24.9 |
| MUSR Raw | float64 | 0.29–0.59 |
| MUSR | float64 | 0–36.4 |
| MMLU-PRO Raw | float64 | 0.1–0.7 |
| MMLU-PRO | float64 | 0–66.8 |
| Maintainer's Highlight | bool | 2 classes |
| Upload To Hub Date | string | lengths 0–10 |
| Submission Date | string | 154 classes |
| Generation | int64 | 0–8 |
| Base Model | string | lengths 4–102 |

Sample rows:

| eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | Not_Merged | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Maintainer's Highlight | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| zelk12_recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1](https://huggingface.co/zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1-details) | zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 | 1e3e623e9f0b386bfd967c629dd39c87daef5bed | 31.626376 | | 1 | 10 | false | true | true | false | true | 6.461752 | 0.761523 | 76.152276 | 0.609878 | 43.941258 | 0.073263 | 7.326284 | 0.341443 | 12.192394 | 0.431021 | 13.310937 | 0.431516 | 36.835106 | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 (Merge) |
| zelk12_recoilme-gemma-2-Ifable-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ifable-9B-v0.1](https://huggingface.co/zelk12/recoilme-gemma-2-Ifable-9B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ifable-9B-v0.1-details) | zelk12/recoilme-gemma-2-Ifable-9B-v0.1 | 8af6620b39c9a36239879b6b2bd88f66e9e9d930 | 32.254423 | | 0 | 10 | false | true | true | false | true | 6.542869 | 0.794396 | 79.439554 | 0.60644 | 43.39057 | 0.09139 | 9.138973 | 0.35151 | 13.534676 | 0.420229 | 11.095313 | 0.432347 | 36.927453 | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-Ifable-9B-v0.1 (Merge) |
| zelk12_recoilme-gemma-2-psy10k-mental_healt-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1](https://huggingface.co/zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-psy10k-mental_healt-9B-v0.1-details) | zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 | ced039b03be6f65ac0f713efcee76c6534e65639 | 32.448061 | | 0 | 10 | false | true | true | false | true | 3.13222 | 0.744537 | 74.453672 | 0.597759 | 42.132683 | 0.180514 | 18.05136 | 0.34396 | 12.527964 | 0.429469 | 12.183594 | 0.418052 | 35.339096 | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 (Merge) |
| zetasepic_Qwen2.5-72B-Instruct-abliterated_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [zetasepic/Qwen2.5-72B-Instruct-abliterated](https://huggingface.co/zetasepic/Qwen2.5-72B-Instruct-abliterated) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zetasepic__Qwen2.5-72B-Instruct-abliterated-details) | zetasepic/Qwen2.5-72B-Instruct-abliterated | af94b3c05c9857dbac73afb1cbce00e4833ec9ef | 45.293139 | other | 9 | 72 | true | true | true | false | false | 18.809182 | 0.715261 | 71.526106 | 0.715226 | 59.912976 | 0.46148 | 46.148036 | 0.406879 | 20.917226 | 0.471917 | 19.122917 | 0.587184 | 54.131575 | false | 2024-10-01 | 2024-11-08 | 2 | Qwen/Qwen2.5-72B |
| zhengr_MixTAO-7Bx2-MoE-v8.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | [zhengr/MixTAO-7Bx2-MoE-v8.1](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-v8.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/zhengr__MixTAO-7Bx2-MoE-v8.1-details) | zhengr/MixTAO-7Bx2-MoE-v8.1 | 828e963abf2db0f5af9ed0d4034e538fc1cf5f40 | 17.168311 | apache-2.0 | 54 | 12 | true | true | false | false | true | 0.92739 | 0.418781 | 41.878106 | 0.420194 | 19.176907 | 0.066465 | 6.646526 | 0.298658 | 6.487696 | 0.397625 | 8.303125 | 0.284658 | 20.517509 | false | 2024-02-26 | 2024-06-27 | 0 | zhengr/MixTAO-7Bx2-MoE-v8.1 |
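
In the sample rows above, the "Average ⬆️" value agrees, up to rounding, with the arithmetic mean of the six scaled benchmark columns (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO). A minimal sketch checking this against the zhengr/MixTAO-7Bx2-MoE-v8.1 row:

```python
# Check that "Average ⬆️" matches the mean of the six scaled benchmark scores,
# using the zhengr/MixTAO-7Bx2-MoE-v8.1 values from the table above.
scores = {
    "IFEval": 41.878106,
    "BBH": 19.176907,
    "MATH Lvl 5": 6.646526,
    "GPQA": 6.487696,
    "MUSR": 8.303125,
    "MMLU-PRO": 20.517509,
}

average = sum(scores.values()) / len(scores)
print(f"computed mean = {average}")        # ≈ 17.16831
assert abs(average - 17.168311) < 1e-5     # matches the row's "Average ⬆️"
```

The same check holds for the other rows shown (e.g. 31.626376 for the Gutenberg-Doppel merge), so the scaled columns, not the Raw ones, feed the aggregate.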
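
To work with the full table rather than these sample rows, one could load it with the Hugging Face `datasets` library and filter on the columns described in the schema. This is a sketch under an assumption: the repository id `open-llm-leaderboard/contents` is not named in this section (only the per-model `*-details` datasets are linked), so treat it as a placeholder for wherever this table is hosted.

```python
# Hedged sketch: load the full leaderboard table and query it by the columns
# documented in the schema above. The repo id below is an assumption.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")

# Keep models up to ~10B parameters that ship a chat template, then rank by
# the aggregate score (columns "#Params (B)", "Chat Template", "Average ⬆️").
small_chat = ds.filter(lambda row: row["#Params (B)"] <= 10 and row["Chat Template"])
top = sorted(small_chat, key=lambda row: row["Average ⬆️"], reverse=True)[:5]

for row in top:
    print(row["fullname"], row["Average ⬆️"], row["Hub License"])
```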