---
license: apache-2.0
tags:
- moe
- merge
- abideen/NexoNimbus-7B
- mlabonne/NeuralMarcoro14-7B
language:
- en
library_name: transformers
---
# NexoNimbus-MoE-2x7B
![image/png](https://cdn-uploads.huggingface.co/production/uploads/64e380b2e12618b261fa6ba0/_bzC6xkVIHW0tSigBxUI3.png)
NexoNimbus-MoE-2x7B is a Mixture of Experts (MoE) model built from the following models:
* [abideen/NexoNimbus-7B](https://huggingface.co/abideen/NexoNimbus-7B)
* [mlabonne/NeuralMarcoro14-7B](https://huggingface.co/mlabonne/NeuralMarcoro14-7B)
## 🏆 Evaluation
NexoNimbus-MoE-2x7B is the 10th best-performing 13B LLM on the Open LLM Leaderboard:
![image/png](https://cdn-uploads.huggingface.co/production/uploads/64e380b2e12618b261fa6ba0/z8E728H5fJqVtKNeGuwjX.png)
| Task          | Version | Metric   | Value (%) | Stderr |
|---------------|--------:|----------|----------:|-------:|
| arc_challenge |       0 | acc      |     62.28 | ± 1.41 |
|               |         | acc_norm |     66.80 | ± 1.37 |
| hellaswag     |       0 | acc      |     66.83 | ± 0.46 |
|               |         | acc_norm |     85.66 | ± 0.34 |
| gsm8k         |       0 | acc      |     53.52 | ± 1.37 |
| winogrande    |       0 | acc      |     81.53 | ± 1.09 |
| mmlu          |       0 | acc      |     64.51 | ± 1.00 |

Average: 67.51%
### TruthfulQA
| Task          | Version | Metric | Value (%) | Stderr |
|---------------|--------:|--------|----------:|-------:|
| truthfulqa_mc |       1 | mc1    |     35.98 | ± 1.68 |
|               |         | mc2    |     53.05 | ± 1.53 |
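These scores come from EleutherAI's lm-evaluation-harness, the backend of the Open LLM Leaderboard. A minimal sketch for reproducing a single benchmark with the harness's Python API (the few-shot count is an assumption; match the leaderboard's settings, e.g. 25-shot for ARC, to compare against the tables above):

```python
# Sketch: score one task with lm-evaluation-harness (pip install lm-eval).
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=abideen/NexoNimbus-MoE-2x7B,dtype=bfloat16",
    tasks=["arc_challenge"],
    num_fewshot=25,  # assumed leaderboard setting for ARC
)
print(results["results"]["arc_challenge"])
```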
## 🧩 Configuration
```yaml
base_model: teknium/OpenHermes-2.5-Mistral-7B
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: abideen/NexoNimbus-7B
    positive_prompts:
      - "Mathematics"
      - "Physics"
      - "Chemistry"
      - "Biology"
      - "Medicine"
      - "Engineering"
      - "Computer Science"
    negative_prompts:
      - "History"
      - "Philosophy"
      - "Linguistics"
      - "Literature"
      - "Art and Art History"
      - "Music Theory and Composition"
      - "Performing Arts (Theater, Dance)"
  - source_model: mlabonne/NeuralMarcoro14-7B
    positive_prompts:
      - "Earth Sciences (Geology, Meteorology, Oceanography)"
      - "Environmental Science"
      - "Astronomy and Space Science"
      - "Psychology"
      - "Sociology"
      - "Anthropology"
      - "Political Science"
      - "Economics"
    negative_prompts:
      - "Education"
      - "Law"
      - "Theology and Religious Studies"
      - "Communication Studies"
      - "Business and Management"
      - "Agricultural Sciences"
      - "Nutrition and Food Science"
      - "Sports Science"
```
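This is a [mergekit](https://github.com/arcee-ai/mergekit) MoE configuration: `gate_mode: hidden` initializes each expert's router from the base model's hidden-state representations of the positive and negative prompts. A minimal sketch of running the merge (assuming a recent mergekit release; the config filename and output directory are placeholders):

```python
# Sketch: produce the merged MoE from the YAML above (notebook-style commands).
# "config.yaml" and the output directory are placeholders.
!pip install -qU mergekit
!mergekit-moe config.yaml ./NexoNimbus-MoE-2x7B
```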
## 💻 Usage
Here's a [Colab notebook](https://colab.research.google.com/drive/1B1Q7vO95cDkEJbKIPhOWr6exB9-Q_lr-?usp=sharing) to run NexoNimbus-MoE-2x7B in 4-bit precision on a free T4 GPU.
```python
# Install dependencies (notebook command; use plain `pip` outside Colab)
!pip install -qU transformers bitsandbytes accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "abideen/NexoNimbus-MoE-2x7B"
tokenizer = AutoTokenizer.from_pretrained(model)

# Load the model in 4-bit precision so it fits on a free T4 GPU
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True},
)

# Format the conversation with the model's chat template
messages = [{"role": "user", "content": "Explain what machine learning is."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Generate a sampled completion
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
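Note that on recent `transformers` releases, passing `load_in_4bit` through `model_kwargs` is deprecated in favor of an explicit `BitsAndBytesConfig`. A sketch of the equivalent loading code:

```python
# Sketch: equivalent 4-bit loading with an explicit quantization config.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
import torch

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,  # matches the torch_dtype above
)
model = AutoModelForCausalLM.from_pretrained(
    "abideen/NexoNimbus-MoE-2x7B",
    quantization_config=bnb_config,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("abideen/NexoNimbus-MoE-2x7B")
```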