Update README.md
README.md CHANGED
@@ -18,8 +18,9 @@ I was approached with the idea to make a merge based on story telling, and consi
We believe that this, while it might not be better logically than mixtral base instruct, is definitely more creative. Special thanks to [NeuralNovel](https://huggingface.co/NeuralNovel) for collaborating with me on this project.

-![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/
-
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/xXXJhZNJ4q3suxJ9LyLqK.png)
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/ZpX1KMNYj11k4pF0NX3Q9.png)
+It performs better than base mixtral 8x across many evaluations. It's half the size and is comparable to most MoEs. Thanks so much to HuggingFace for evaluating it!

# "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"
### (from the MistralAI papers...click the quoted question above to navigate to it directly.)
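For a concrete sense of the routing idea the linked post explains, here is a minimal sketch of top-2 expert routing for a single token. It assumes PyTorch and purely illustrative expert modules; the function and parameter names are hypothetical, and this is not the actual routing code of this merge or of Mixtral.

```python
import torch
import torch.nn.functional as F

def top2_moe_forward(x, router_weight, experts):
    """Illustrative top-2 MoE routing for one token vector x.

    x:             (hidden_dim,) activation for a single token
    router_weight: (num_experts, hidden_dim) gating projection
    experts:       list of callables, each mapping (hidden_dim,) -> (hidden_dim,)
    """
    logits = router_weight @ x                   # one score per expert
    top_vals, top_idx = torch.topk(logits, k=2)  # keep only the two best experts
    weights = F.softmax(top_vals, dim=-1)        # renormalize over the chosen two
    # Only the selected experts run, which is why an MoE activates far fewer
    # parameters per token than its total parameter count suggests.
    return sum(w * experts[int(i)](x) for w, i in zip(weights, top_idx))
```

With eight small feed-forward experts and an (8, hidden_dim) router, this mirrors the per-token top-2 selection described in the blog post linked above.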