StatsGary committed on
Commit 0422ba2
1 Parent(s): 3564430

Create README.md


# Mixture of Experts custom

A mixture of experts fused with `mergekit`, using `mistralai/Mistral-7B-Instruct-v0.2` as the base model.

Experts merged with the Mistral base model (an illustrative `mergekit` configuration sketch follows the list):

- HuggingFaceH4/zephyr-7b-beta
- mistralai/Mistral-7B-Instruct-v0.2
- teknium/OpenHermes-2.5-Mistral-7B
- meta-math/MetaMath-Mistral-7B

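The commit does not include the merge configuration itself, but a `mergekit` MoE merge of this kind is typically driven by a YAML config. The sketch below is an assumption of what such a config could look like for the base model and experts listed above; the `gate_mode`, `dtype`, and `positive_prompts` values are illustrative placeholders, not the actual settings used for this model.

```yaml
# Hypothetical mergekit-moe config sketch -- not the actual config from this commit.
base_model: mistralai/Mistral-7B-Instruct-v0.2
gate_mode: hidden        # route tokens by hidden-state similarity to the positive prompts
dtype: bfloat16
experts:
  - source_model: HuggingFaceH4/zephyr-7b-beta
    positive_prompts:
      - "helpful chat assistant"        # illustrative only
  - source_model: mistralai/Mistral-7B-Instruct-v0.2
    positive_prompts:
      - "general instruction following" # illustrative only
  - source_model: teknium/OpenHermes-2.5-Mistral-7B
    positive_prompts:
      - "reasoning and conversation"    # illustrative only
  - source_model: meta-math/MetaMath-Mistral-7B
    positive_prompts:
      - "math word problems"            # illustrative only
```

A config like this would usually be run with the `mergekit-moe` command, for example `mergekit-moe config.yaml ./merged-moe-model`, which writes the fused mixture-of-experts checkpoint to the output directory.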
Files changed (1)
  1. README.md +8 -0
README.md ADDED
@@ -0,0 +1,8 @@
+ ---
+ license: mit
+ library_name: transformers
+ tags:
+ - mergekit
+ - moe
+ - mixture of experts
+ ---