eryk-mazus committed
Commit: 20211fe
Parent(s): aecbcac
Update README.md
README.md CHANGED
@@ -30,7 +30,7 @@ Context size: 2,048 tokens.
 
 ## Notes
 
-This base model was initially developed as
+This base model was initially developed as the foundation for instruction tuning, which resulted in [polka-1.1b-chat](https://huggingface.co/eryk-mazus/polka-1.1b-chat). Nonetheless, I'm sharing it with the community because I see potential value in its combination of relatively good performance and an efficient bilingual tokenizer.
 
 The model is capable of producing coherent Polish text, but due to its size, it is likely to suffer from hallucination.
 