Update README.md
README.md
@@ -6,7 +6,7 @@ This repo contains YugoGPT - the best open-source base 7B LLM for BCS (Bosnian,
 
 You can access more powerful iterations of YugoGPT already through the recently announced [RunaAI's API platform](https://dev.runaai.com/)!
 
-Serbian LLM eval results:
+Serbian LLM eval results compared to Mistral 7B, LLaMA 2 7B, and GPT2-orao:
 
 ![]()
 
 Eval was computed using https://github.com/gordicaleksa/serbian-llm-eval
@@ -23,7 +23,7 @@ It was trained on tens of billions of BCS tokens and is based off of [Mistral 7B
 
 # Credits
 
-The data for the project was obtained with the help of [Nikola Ljubešić](https://nljubesi.github.io/), [CLARIN.SI](https://www.clarin.si), and [CLASSLA](https://www.clarin.si/info/k-centre/).
+The data for the project was obtained with the help of [Nikola Ljubešić](https://nljubesi.github.io/), [CLARIN.SI](https://www.clarin.si), and [CLASSLA](https://www.clarin.si/info/k-centre/). Thank you!
 
 # Project Sponsors