Kquant03 committed on
Commit f1336ca
1 Parent(s): d4fa544

Update README.md

Files changed (1): README.md +1 -1
README.md CHANGED
@@ -18,7 +18,7 @@ I was approached with the idea to make a merge based on story telling, and consi
 
 We believe that this, while it might not be better logically than mixtral base instruct, is definitely more creative. Special thanks to [NeuralNovel](https://huggingface.co/NeuralNovel) for collaborating with me on this project
 
-![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/xXXJhZNJ4q3suxJ9LyLqK.png)
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/1A1oNsGLUco1Rsv9SYQtX.png)
 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/ZpX1KMNYj11k4pF0NX3Q9.png)
 It performs better than base mixtral 8x across many evaluations. It's half the size and is comparable to most MoEs. Thanks so much to HuggingFace for evaluating it!
 # "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"