Update README.md
README.md CHANGED
@@ -169,7 +169,7 @@ DiscoResearch is an aspiring open research community. Disco should be a place wh
 ## Acknowledgements
 
 Many thanks first and foremost to [Mistral AI](https://huggingface.co/mistralai) for releasing another awesome model and their release strategy that is much fun for the whole community.
 
-Additionally, many thanks in particular to [Dmytro Dzhulgakov](https://huggingface.co/dzhulgakov) who was the first one with a running [inference implementation](https://github.com/dzhulgakov/llama-mistral), [Vik](https://huggingface.co/vikhyatk) who spotted a critical bug in our first implementation (he actually read the paper!), [winglian](https://huggingface.co/winglian) for helpful advice and Axolotl which was used to finetune the model, [MigTissera](https://huggingface.co/migtissera), [MetaMath](https://huggingface.co/meta-math) and [
+Additionally, many thanks in particular to [Dmytro Dzhulgakov](https://huggingface.co/dzhulgakov) who was the first one with a running [inference implementation](https://github.com/dzhulgakov/llama-mistral), [Vik](https://huggingface.co/vikhyatk) who spotted a critical bug in our first implementation (he actually read the paper!), [winglian](https://huggingface.co/winglian) for helpful advice and Axolotl which was used to finetune the model, [MigTissera](https://huggingface.co/migtissera), [MetaMath](https://huggingface.co/meta-math) and [LDJnr](https://huggingface.co/LDJnr) for their great datasets, and everyone who participated in this awesome speedrun on either our, the [Nous Research](https://huggingface.co/NousResearch) or one of the other Discords (please contact us if we forgot to mention you here!).
 
 **DiscoLM Mixtral is a [DiscoResearch](https://huggingface.co/DiscoResearch) project and was created by [Björn Plüster](https://huggingface.co/bjoernp).
 The model was trained with compute provided by [HessianAI](https://hessian.ai/); many thanks as well to [LAION](https://laion.ai) for their coordination and providing invaluable contacts + advice.**