---
license: unknown
tags:
  - merge
  - mergekit
  - WizardLM/WizardMath-7B-V1.1
  - teknium/OpenHermes-2.5-Mistral-7B
  - mlabonne/Marcoro14-7B-slerp
---

# DraftReasoner-2x7B-MoE-v0.1

An experimental 2-expert mixture-of-experts (MoE) merge built with mergekit, using mlabonne/Marcoro14-7B-slerp as the base model; a hedged loading sketch follows the model list below.

* [Marcoro14-7B-slerp](https://huggingface.co/mlabonne/Marcoro14-7B-slerp) as base.
* [OpenHermes-2.5-Mistral-7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B) as model 0.
* [WizardMath-7B-V1.1](https://huggingface.co/WizardLM/WizardMath-7B-V1.1) as model 1.

![](https://i.imgur.com/cDQS6rq.jpg)
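
Since the merge produces a standard Mistral-style MoE checkpoint, it should load like any other causal LM through `transformers`. A minimal sketch is below; the repository id is assumed from the model name and may differ, and `device_map="auto"` requires the `accelerate` package.

```python
# Minimal loading sketch, assuming a repository named after this model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DraftReasoner-2x7B-MoE-v0.1"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="auto",  # requires accelerate; drop for plain CPU loading
)

prompt = "Solve step by step: what is 17 * 24?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```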

## Notes

Please evaluate this model before using it in any application pipeline. The math expert should be activated by prompts containing keywords such as `'math'`, `'reason'`, `'solve'`, and `'count'`.
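
In a typical mergekit-moe setup, keywords like these are supplied as an expert's positive prompts when the router gates are initialized. The sketch below only mirrors that structure as a Python dict for illustration (mergekit itself consumes a YAML file); the WizardMath keywords come from this card, while the OpenHermes prompts are placeholders, not part of the original configuration.

```python
# Hypothetical sketch of the kind of mergekit-moe configuration that would
# produce the routing behaviour described above. mergekit takes YAML; this
# dict only mirrors that layout. Only the WizardMath keywords are from the
# card; the base/expert keys and OpenHermes prompts are assumptions.
moe_config = {
    "base_model": "mlabonne/Marcoro14-7B-slerp",
    "experts": [
        {
            "source_model": "teknium/OpenHermes-2.5-Mistral-7B",
            # placeholder prompts -- not specified in this card
            "positive_prompts": ["chat", "assistant", "explain"],
        },
        {
            "source_model": "WizardLM/WizardMath-7B-V1.1",
            # activation keywords listed in the Notes above
            "positive_prompts": ["math", "reason", "solve", "count"],
        },
    ],
}
```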