Update README.md
README.md CHANGED

@@ -1,7 +1,7 @@
 ---
 license: apache-2.0
 ---
-
+<img src="https://huggingface.co/Vezora/Mistral-22B-v0.1/resolve/main/unsloth.png" width="400" height="500" />
 
 ## Mistral-22b-V.01 Release Announcement 🚀
 
@@ -40,4 +40,6 @@ Keep an eye out for **V.2**, it's going to be a game-changer! And is currently t
 - Thank you to Tim Dettmers, for creating QLora
 - Thank you to Tri Dao, for creating Flash Attention
 - Thank you to Microsoft, for the Lora paper, and the Slice-GPT paper.
-- Thank you to the Hugging Face team, for everything.❤️ We really do appreciate you guys and all your hard work and commitment to the open source community!❤️
+- Thank you to the Hugging Face team, for everything.❤️ We really do appreciate you guys and all your hard work and commitment to the open source community!❤️
+
+## I will answer more questions about this model tomorrow, I have not slept since mixtral 22bx8 dropped. Base model + V.2 Lora checkpoint will be available tomorrow ##