Update README.md
README.md CHANGED
@@ -13,7 +13,7 @@ pipeline_tag: text-generation
 > A [polyglot](https://en.wikipedia.org/wiki/Multilingualism#In_individuals) language model for the [Occident](https://en.wikipedia.org/wiki/Occident).
 >
 
-**Occiglot-7B-DE-EN** is a generative language model with 7B parameters for
+**Occiglot-7B-DE-EN** is a generative language model with 7B parameters for German and English and trained by the [Occiglot Research Collective](https://ociglot.eu).
 It is based on [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) and trained on 114B tokens of additional multilingual and code data with a block size of 8,192 tokens per sample.
 Note that the model is a general-purpose base model and was not instruction-fine-tuned nor optimized for chat or other applications. We make an instruction tuned variant available as [occiglot-7b-de-en-instruct](https://huggingface.co/occiglot/occiglot-7b-de-en-instruct)
 
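
The updated description documents a standard Hugging Face causal language model. For reference, a minimal usage sketch with the `transformers` library might look like the following; the repository id `occiglot/occiglot-7b-de-en` is assumed here from the naming of the linked instruct variant, and the prompt is illustrative only.

```python
# Minimal sketch: load the base model described in the README and generate a continuation.
# The repo id "occiglot/occiglot-7b-de-en" is an assumption based on the instruct variant's name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "occiglot/occiglot-7b-de-en"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # Mistral-7B-sized weights fit comfortably in bf16
    device_map="auto",
)

# The model is a base (non-chat) model, so plain text continuation is the intended use.
prompt = "Die Hauptstadt von Deutschland ist"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For chat or instruction-following use, the instruction-tuned variant linked in the README (occiglot/occiglot-7b-de-en-instruct) is the intended choice rather than this base model.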