Update README.md
README.md CHANGED
@@ -2,9 +2,9 @@
 license: mit
 
 ---
-Model card for the Mistral-14b-v0.1
+Model card for the Mistral-14b-v0.1 AI model.
 
-An upscaling and continued pretraining of Mistral-7B-v0.1
+An upscaling and continued pretraining of Mistral-7B-v0.1 AI model by MistralAi.
 
 ❤ Thank you for the original Mistral-7B-v0.1 model to:
 ```yaml
@@ -51,9 +51,9 @@ An upscaling and continued pretraining of Mistral-7B-v0.1 Ai model by MistralAi.
 
 ```
 
-The Mistral-14b-v0.1
+The Mistral-14b-v0.1 AI model is a next generation AI meant to continue the legacy started by MistralAi. What began as a 7b parameter model, we have now grown, not only in capacity but also in intelligence, to help bring more to the open source AI community that was sorely needed in this area. Mistral-7B-v0.1 at the time of release was the best and smartest AI model for its size, and we at Replete-AI hope to reproduce that success with our new model at its 14b parameter size.
 
-The model had continued pre-training on a private collection of data that we do not plan on releasing at the current time.
+The model had continued pre-training on a private collection of data that we do not plan on releasing at the current time.
 
 The Prompt template for the Mistral-14b-v0.1 is (ChatML) as that is the format that it was trained on.
 ```
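The README describes the 14b model as an upscaling of Mistral-7B-v0.1 but does not say how the upscaling was performed. One common approach for doubling a model's depth is layer duplication ("depth upscaling"), for example via mergekit's `passthrough` merge method. The sketch below is a hypothetical illustration only — the layer ranges are assumptions, not the authors' actual recipe, which has not been published:

```yaml
# Hypothetical depth-upscaling sketch using mergekit's passthrough method.
# Layer ranges are illustrative assumptions; the real Mistral-14b-v0.1
# upscaling recipe is not disclosed in the README.
slices:
  - sources:
      - model: mistralai/Mistral-7B-v0.1
        layer_range: [0, 24]
  - sources:
      - model: mistralai/Mistral-7B-v0.1
        layer_range: [8, 32]
merge_method: passthrough
dtype: bfloat16
```

A config like this stacks overlapping slices of the base model's transformer layers to produce a deeper network, which is then typically repaired with continued pretraining — consistent with the "upscaling and continued pretraining" the README describes.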
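The README states the prompt template is ChatML, which wraps each turn in `<|im_start|>` / `<|im_end|>` delimiters. A minimal sketch of rendering messages into that format — the helper name and example messages are illustrative, not part of the model's tooling:

```python
# Minimal sketch of the ChatML prompt format the README references.
# <|im_start|>/<|im_end|> are the standard ChatML delimiters; the helper
# name and the example messages below are illustrative assumptions.

def build_chatml_prompt(messages):
    """Render a list of {"role", "content"} dicts as a ChatML prompt string."""
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>")
    # Leave the assistant turn open so the model generates the reply.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
print(prompt)
```

Generation is then stopped when the model emits `<|im_end|>`, closing the open assistant turn.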