---
language:
- en
license: other
tags:
- facebook
- meta
- pytorch
- llama
- llama-3
- moe
- code
base_model: meta-llama/Meta-Llama-3-8B-Instruct
pipeline_tag: text-generation
license_name: llama3
license_link: LICENSE
model-index:
- name: Llama-3-Aplite-Instruct-4x8B-MoE
  results:
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: AI2 Reasoning Challenge (25-Shot)
      type: ai2_arc
      config: ARC-Challenge
      split: test
      args:
        num_few_shot: 25
    metrics:
    - type: acc_norm
      value: 61.86
      name: normalized accuracy
    source:
      url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=raincandy-u/Llama-3-Aplite-Instruct-4x8B-MoE
      name: Open LLM Leaderboard
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: HellaSwag (10-Shot)
      type: hellaswag
      split: validation
      args:
        num_few_shot: 10
    metrics:
    - type: acc_norm
      value: 81.08
      name: normalized accuracy
    source:
      url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=raincandy-u/Llama-3-Aplite-Instruct-4x8B-MoE
      name: Open LLM Leaderboard
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: MMLU (5-Shot)
      type: cais/mmlu
      config: all
      split: test
      args:
        num_few_shot: 5
    metrics:
    - type: acc
      value: 67.47
      name: accuracy
    source:
      url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=raincandy-u/Llama-3-Aplite-Instruct-4x8B-MoE
      name: Open LLM Leaderboard
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: TruthfulQA (0-shot)
      type: truthful_qa
      config: multiple_choice
      split: validation
      args:
        num_few_shot: 0
    metrics:
    - type: mc2
      value: 51.87
    source:
      url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=raincandy-u/Llama-3-Aplite-Instruct-4x8B-MoE
      name: Open LLM Leaderboard
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: Winogrande (5-shot)
      type: winogrande
      config: winogrande_xl
      split: validation
      args:
        num_few_shot: 5
    metrics:
    - type: acc
      value: 77.82
      name: accuracy
    source:
      url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=raincandy-u/Llama-3-Aplite-Instruct-4x8B-MoE
      name: Open LLM Leaderboard
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: GSM8k (5-shot)
      type: gsm8k
      config: main
      split: test
      args:
        num_few_shot: 5
    metrics:
    - type: acc
      value: 61.87
      name: accuracy
    source:
      url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=raincandy-u/Llama-3-Aplite-Instruct-4x8B-MoE
      name: Open LLM Leaderboard
---
![image/png](https://cdn-uploads.huggingface.co/production/uploads/657eb5b256c9c67605a6e8b5/8JXktjAyUPCWQGnRExiVI.png)
# Aplite-Instruct-4x8B-Llama-3
Aplite-Instruct-4x8B-Llama-3 is an experimental Mixture-of-Experts (MoE) model based on the Llama-3 architecture, built with Mergekit.
**GGUF quants: [here](https://huggingface.co/raincandy-u/Llama-3-Aplite-Instruct-4x8B-GGUF)**, or the version [made by mradermacher](https://huggingface.co/mradermacher/Llama-3-Aplite-Instruct-4x8B-GGUF).
**EXL2 quant: [here](https://huggingface.co/Knightcodin/Llama-3-Aplite-Instruct-4x8B-exl2)** (thanks to @Knightcodin).
**The `<|eot_id|>` bug has been fixed.**
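If you'd rather run the merged weights directly instead of a GGUF/EXL2 quant, the snippet below is a minimal sketch using 🤗 Transformers. The repo id is the one used in the leaderboard links on this card; the sampling settings and token budget are illustrative assumptions, not values taken from this repo.
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "raincandy-u/Llama-3-Aplite-Instruct-4x8B-MoE"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the dtype in the merge config below
    device_map="auto",
)

messages = [{"role": "user", "content": "Write a story for four llamas."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Stop on both the regular EOS token and Llama-3's <|eot_id|> end-of-turn token.
terminators = [
    tokenizer.eos_token_id,
    tokenizer.convert_tokens_to_ids("<|eot_id|>"),
]

output = model.generate(
    input_ids,
    max_new_tokens=512,          # illustrative choice
    eos_token_id=terminators,
    do_sample=True,
    temperature=0.7,             # illustrative choice
)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```
Stopping on `<|eot_id|>` in addition to the regular EOS token follows the standard Llama-3-Instruct convention.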
## Disclaimer
This model is a research experiment and may generate incorrect or harmful content. The model's outputs should not be taken as factual or representative of the views of the model's creator or any other individual.
The model's creator is not responsible for any harm or damage caused by the model's outputs.
## Chat Example
> User: Write a story for four llamas.
Aplite-Instruct-4x8B:
```
Once upon a time, in a land far, far away, there lived four little llamas. They were the best of friends and lived together in a cozy little village nestled in the heart of the Andes mountains.
Their names were Lolo, Lola, Lina, and Lucas. Lolo was the smallest of the group, but he had the biggest heart. Lola was the oldest and wisest of the four, and she loved to tell stories and sing songs. Lina was the most adventurous, always eager to explore the surrounding hills and valleys. And Lucas was the most mischievous, always getting into trouble but always with a twinkle in his eye.
One day, the four little llamas decided to go on an adventure together. They packed up their favorite snacks and set off into the mountains. As they climbed higher and higher, the air grew cooler and the snow began to fall. They trudged on, not minding the cold, for they were having too much fun.
After a while, they came to a beautiful little lake, surrounded by towering peaks. The llamas decided to take a break and enjoy the view. They sat down on the snow, munching on their snacks and watching the snowflakes dance around them.
Just then, a gentle voice called out to them. "Hello, little llamas! What are you doing all the way up here?" The llamas looked up to see a kind old llama standing before them. She had a soft, gentle face and a warm smile.
"We're on an adventure," said Lolo. "We wanted to see the mountains and the snow."
"Ah, that's a good reason to come up here," said the old llama. "But you should be careful. The mountains can be treacherous."
"We're careful," said Lucas. "We're not afraid of the mountains."
The old llama smiled. "I see that. Well, I have a special gift for you. I've been saving it for a special occasion."
She led the four little llamas to a small cave, where she opened a wooden box. Inside were four small lanterns, each one glowing with a soft, golden light.
"These are magic lanterns," said the old llama. "When you light them, they will keep you safe from the dangers of the mountains. And they will always guide you back home."
The four little llamas were thrilled. They thanked the old llama and promised to be careful.
As they continued on their journey, the lanterns cast a warm, golden glow around them, lighting up the path ahead. They could see everything clearly, even in the midst of the falling snow.
After a while, they came to a narrow, winding path that led them to the top of a steep cliff. The path was slippery with ice, and the wind was blowing fiercely. But the lanterns shone brightly, and the four little llamas made their way safely to the top.
As they looked out over the breathtaking view, they saw a strange sight. A group of tiny creatures were scurrying about, gathering something from the ground. They looked like tiny, furry balls with long tails.
"What are they doing?" asked Lina.
"I think they're gathering berries," said Lola. "But they're so small!"
Just then, one of the creatures looked up and saw the four little llamas. It waved its tail and shouted, "Hello! We're the tiny penguins! We live here in the mountains, and we love to gather berries."
The four little llamas were amazed. "Tiny penguins?" they exclaimed. "We've never seen anything like you!"
The tiny penguins invited the llamas to join them in their berry-gathering. The llamas were happy to help, and soon they were all working together, laughing and chatting as they went.
After a while, they decided to take a break and enjoy their snack. The tiny penguins shared their berries with the llamas, and the llamas shared their snacks with the penguins.
As they sat together, munching on their treats, the tiny penguins asked the llamas to stay with them for a while. "We'd love to show you around our home," they said. "You can meet our friends and see our favorite spots."
The four little llamas were thrilled. They had never had such an exciting adventure before. And so they spent the rest of the day with the tiny penguins, exploring the mountains and making new friends.
As the sun began to set, the tiny penguins led the llamas back to the cliff path. They said goodbye to their new friends and wished them a safe journey home.
The four little llamas thanked the penguins and lit their lanterns once more. As they made their way back down the mountain, they could see the tiny penguins waving goodbye from the top of the cliff.
When they arrived back in their village, they told everyone about their amazing adventure. They showed them the lanterns and told them about the tiny penguins and their berry-gathering.
The villagers were amazed and delighted. They had never heard of such a wonderful adventure before. And from that day on, the four little llamas were known as the bravest and most adventurous llamas in the land.
And they lived happily ever after, always ready for their next adventure.
```
## Merge Details
```yaml
base_model: Meta-Llama-3-8B-Instruct
experts:
  - source_model: Meta-Llama-3-8B-Instruct
    positive_prompts:
      - "explain"
      - "chat"
      - "assistant"
  - source_model: Llama3-8B-OpenHermes-DPO
    positive_prompts:
      - "python"
      - "math"
      - "solve"
      - "code"
  - source_model: Llama-3-SLERP-8B
    positive_prompts:
      - "chat"
      - "assistant"
      - "AI"
  - source_model: hf-llama3-8b-orpo-v0.0
    positive_prompts:
      - "think"
      - "chat"
      - "code"
      - "roleplay"
gate_mode: hidden
dtype: float16
```
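A config like this is normally fed to Mergekit's MoE script. The sketch below shows one hypothetical way to drive it from Python, assuming Mergekit is installed and the expert entries resolve to Hugging Face repo ids (the short names are kept as written above); it is not the exact command used for this model.
```python
# Hypothetical reproduction sketch. Assumes `pip install mergekit` and that the
# YAML config shown above has been saved to disk as aplite-moe.yaml.
import subprocess

CONFIG = "aplite-moe.yaml"                     # merge config shown above
OUTPUT = "./Llama-3-Aplite-Instruct-4x8B-MoE"  # where the merged weights are written

# Mergekit ships a `mergekit-moe` console script: <config> <output-dir>.
subprocess.run(["mergekit-moe", CONFIG, OUTPUT], check=True)
```
With `gate_mode: hidden`, Mergekit initializes each expert's router weights from hidden-state representations of its positive prompts, so the prompt lists above are what steers token routing between the four experts.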
## Join our Discord
If you'd like to discuss potential collaborations or applications, feel free to reach out to me on Discord: [discord.gg/KugcbJX5](https://discord.gg/KugcbJX5)
**Meta Llama 3 is licensed under the Meta Llama 3 Community License, Copyright © Meta Platforms, Inc. All Rights Reserved.**
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_raincandy-u__Llama-3-Aplite-Instruct-4x8B-MoE).
| Metric |Value|
|---------------------------------|----:|
|Avg. |66.99|
|AI2 Reasoning Challenge (25-Shot)|61.86|
|HellaSwag (10-Shot) |81.08|
|MMLU (5-Shot) |67.47|
|TruthfulQA (0-shot) |51.87|
|Winogrande (5-shot) |77.82|
|GSM8k (5-shot) |61.87|