Text Generation
Transformers
Safetensors
mixtral
Mixture of Experts
Merge
mergekit
lazymergekit
Felladrin/Minueza-32M-Base
Felladrin/Minueza-32M-UltraChat
conversational
Inference Endpoints
text-generation-inference
Isotonic committed on
Commit 1bcd0a9
1 Parent(s): 912c493

Update README.md

Files changed (1)
  1. README.md +1 -0
README.md CHANGED
@@ -84,6 +84,7 @@ widget:
 Mixnueza-6x32M-MoE is a Mixture of Experts (MoE) made with the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing):
 * 3 X [Felladrin/Minueza-32M-Base](https://huggingface.co/Felladrin/Minueza-32M-Base)
 * 3 X [Felladrin/Minueza-32M-UltraChat](https://huggingface.co/Felladrin/Minueza-32M-UltraChat)
+* [Evaluation Results](https://huggingface.co/datasets/open-llm-leaderboard/details_Isotonic__Mixnueza-6x32M-MoE)
 
 ## Recommended Prompt Format
 
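For context, the card describes a merged MoE checkpoint that can be used like any other causal LM. Below is a minimal sketch of loading it for chat-style generation with `transformers`. The repo id `Isotonic/Mixnueza-6x32M-MoE` is inferred from the leaderboard link above, and relying on `apply_chat_template` assumes the tokenizer ships a chat template matching the card's "Recommended Prompt Format" section; verify both against the actual model card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id inferred from the leaderboard dataset name; verify before use.
model_id = "Isotonic/Mixnueza-6x32M-MoE"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Assumes the tokenizer defines a chat template consistent with the
# card's "Recommended Prompt Format"; substitute that format if not.
messages = [{"role": "user", "content": "What is a Mixture of Experts model?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

output_ids = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```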