Kquant03 committed
Commit: b376a35
Parent(s): f5955da

Update README.md

Files changed (1):
  1. README.md +2 -0
README.md CHANGED
@@ -16,6 +16,8 @@ I was approached with the idea to make a merge based on story telling, and consi
 We believe that this, while it might not be better logically than mixtral base instruct, is definitely more creative. Special thanks to [NeuralNovel](https://huggingface.co/NeuralNovel) for collaborating with me on this project

+ ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/V4cv6tthy1quRRMCvAf2H.png)
+ # The model performed slightly better than base mixtral instruct in erotic roleplay variety on [Ayumi's benchmark](http://ayumi.m8geil.de/ayumi_bench_v3_results.html) and #29/818 BEST FRANKENMOE overall!!!
 # "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"
 ### (from the MistralAI papers...click the quoted question above to navigate to it directly.)
