Commit fb6fdb7 by bjoernp
Parent: 629a1d3

Update README.md

Files changed (1): README.md (+1, -1)
README.md CHANGED
@@ -23,7 +23,7 @@ Tags:
 
 **DiscoLM Mixtral 8x7b alpha** is an experimental 8x7b MoE model based on [Mistral AI's Mixtral 8x7b](https://twitter.com/MistralAI/status/1733150512395038967).
 This model is based on experimental code converting the model weights to huggingface format and enabling Transformers-based inference.
-It was then finetuned on the OpenHermes, MetaMathQA and Tested-22k-Python-Alpaca datasets.
+It was then finetuned on the Synthia, MetaMathQA and Capybara datasets.
 DiscoLM Mixtral 8x7b alpha is a [DiscoResearch](https://huggingface.co/DiscoResearch) project and was created by [Björn Plüster](https://huggingface.co/bjoernp) with lots of support from the community.
 
 **Many thanks to [HessianAI](https://hessian.ai/) for providing the compute resources for this project and to the great people at [LAION](https://laion.ai) without whom this project would not have been possible!**
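
The context lines above mention Transformers-based inference via experimental conversion code. A minimal inference sketch follows; the repository id and the `trust_remote_code` flag are assumptions inferred from the README's wording, not confirmed by this commit, so check the model card for the exact values.

```python
# Minimal sketch of Transformers-based inference for DiscoLM Mixtral 8x7b alpha.
# The repo id below is a hypothetical placeholder; consult the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DiscoResearch/DiscoLM-mixtral-8x7b-v2"  # assumption, not from this diff

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit the large MoE checkpoint
    device_map="auto",          # shard the 8x7b expert weights across available GPUs
    trust_remote_code=True,     # assumption: the README notes experimental conversion code
)

inputs = tokenizer("What is a mixture-of-experts model?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

`device_map="auto"` (which requires the `accelerate` package) is used here because an 8x7b MoE checkpoint typically exceeds a single consumer GPU's memory; on a machine with enough VRAM a plain `.to("cuda")` would also work.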