---
license: apache-2.0
tags:
- moe
- merge
- abideen/NexoNimbus-7B
- mlabonne/NeuralMarcoro14-7B
language:
- en
library_name: transformers
---

# NexoNimbus-MoE-2x7B

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64e380b2e12618b261fa6ba0/_bzC6xkVIHW0tSigBxUI3.png)

NexoNimbus-MoE-2x7B is a Mixture of Experts (MoE) model made with the following models:
* [abideen/NexoNimbus-7B](https://huggingface.co/abideen/NexoNimbus-7B)
* [mlabonne/NeuralMarcoro14-7B](https://huggingface.co/mlabonne/NeuralMarcoro14-7B)

## 🏆 Evaluation

NexoNimbus-MoE-2x7B is the 10th best-performing 13B LLM on the Open LLM Leaderboard:


![image/png](https://cdn-uploads.huggingface.co/production/uploads/64e380b2e12618b261fa6ba0/z8E728H5fJqVtKNeGuwjX.png)


|    Task     |Version| Metric |Value (%)|   |Stderr|
|-------------|------:|--------|--------:|---|-----:|
|arc_challenge|      0|acc     |62.28|±  |  1.41|
|             |       |acc_norm|66.80|±  |  1.37|
|hellaswag    |      0|acc     |66.83|±  |  0.46|
|             |       |acc_norm|85.66|±  |  0.34|
|gsm8k        |      0|acc     |53.52|±  |  1.37|
|winogrande   |      0|acc     |81.53|±  |  1.09|
|mmlu         |      0|acc     |64.51|±  |  1.00|

Average: 67.51%

### TruthfulQA
|    Task     |Version|Metric|Value (%)|   |Stderr|
|-------------|------:|------|--------:|---|-----:|
|truthfulqa_mc|      1|mc1   |35.98|±  |  1.68|
|             |       |mc2   |53.05|±  |  1.53|
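
These scores follow the lm-evaluation-harness conventions used by the Open LLM Leaderboard (accuracy in percent). A rough way to re-run a subset of them is sketched below with the `lm_eval` Python API; the harness version, few-shot counts, and exact task variants behind the tables above are not stated here, so results may differ slightly.

```python
# Sketch only: re-running some benchmarks with EleutherAI's lm-evaluation-harness.
# Assumes `pip install lm-eval` and a GPU; task names and few-shot settings are
# illustrative and may not match the configuration used for the tables above.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=abideen/NexoNimbus-MoE-2x7B,dtype=float16",
    tasks=["arc_challenge", "hellaswag", "winogrande", "gsm8k", "truthfulqa_mc2"],
)
print(results["results"])
```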


## 🧩 Configuration

```yaml
base_model: teknium/OpenHermes-2.5-Mistral-7B
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: abideen/NexoNimbus-7B
    positive_prompts:
    - "Mathematics"
    - "Physics"
    - "Chemistry"
    - "Biology"
    - "Medicine"
    - "Engineering"
    - "Computer Science"

    negative_prompts:
    - "History"
    - "Philosophy"
    - "Linguistics"
    - "Literature"
    - "Art and Art History"
    - "Music Theory and Composition"
    - "Performing Arts (Theater, Dance)"

  - source_model: mlabonne/NeuralMarcoro14-7B 
    positive_prompts:
    - "Earth Sciences (Geology, Meteorology, Oceanography)"
    - "Environmental Science"
    - "Astronomy and Space Science"
    - "Psychology"
    - "Sociology"
    - "Anthropology"
    - "Political Science"
    - "Economics"
    negative_prompts:
    - "Education"
    - "Law"
    - "Theology and Religious Studies"
    - "Communication Studies"
    - "Business and Management"
    - "Agricultural Sciences"
    - "Nutrition and Food Science"
    - "Sports Science"
```
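
This is a [mergekit](https://github.com/arcee-ai/mergekit) MoE configuration: `gate_mode: hidden` initializes each expert's router weights from hidden-state representations of the positive/negative prompts, with the two source models acting as experts on top of the `teknium/OpenHermes-2.5-Mistral-7B` base. Assuming the YAML above is saved as `config.yaml`, a merge like this is typically produced with the `mergekit-moe` command; the snippet below is a minimal sketch, and extra options may be needed depending on the mergekit version and available hardware.

```python
# Sketch only: building the MoE locally from the configuration above (notebook syntax).
# Assumes the YAML is saved as config.yaml; depending on the mergekit version, you may
# need to install from the GitHub repository instead of PyPI.
!pip install -qU mergekit
!mergekit-moe config.yaml ./NexoNimbus-MoE-2x7B
```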

## 💻 Usage

Here's a [Colab notebook](https://colab.research.google.com/drive/1B1Q7vO95cDkEJbKIPhOWr6exB9-Q_lr-?usp=sharing) to run NexoNimbus-MoE-2x7B in 4-bit precision on a free T4 GPU.

```python
# Install dependencies (notebook/Colab syntax)
!pip install -qU transformers bitsandbytes accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "abideen/NexoNimbus-MoE-2x7B"

# Build a text-generation pipeline with the model quantized to 4-bit via bitsandbytes
tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    model_kwargs={"torch_dtype": torch.float16, "load_in_4bit": True},
)

# Format the request with the model's chat template, then sample a response
messages = [{"role": "user", "content": "Explain what machine learning is."}]
prompt = pipeline.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
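
If you prefer loading the model directly instead of going through `pipeline`, the same 4-bit setup can be expressed with `BitsAndBytesConfig`. This is a sketch using the standard transformers/bitsandbytes API rather than part of the original notebook:

```python
# Sketch only: direct 4-bit load with AutoModelForCausalLM instead of the pipeline API.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "abideen/NexoNimbus-MoE-2x7B"

# Quantize to 4-bit and compute in float16
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# Apply the chat template and generate a response
messages = [{"role": "user", "content": "Explain what machine learning is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```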