Update README.md
## Gratitude

- So much thanks to MagiCoder and theblackat102 for updating the license to Apache 2.0 for commercial use!
- This model was made possible by the generous sponsorship of [Convai](https://www.convai.com/).
- Huge thank you to [MistralAI](https://mistral.ai/) for training and publishing the weights of Mixtral-7b.
- Thank you to Microsoft for authoring the Orca paper and inspiring this work.
- HUGE thank you to the dataset authors: @jondurbin, @ise-uiuc, @teknium, @LDJnr and @migtissera.
- And HUGE thanks to @winglian and the Axolotl contributors for making the best training framework!