This repo contains YugoGPT, the best open-source 7B base LLM for the BCS languages (Bosnian, Croatian, Serbian), developed by Aleksa Gordić.
More powerful iterations of YugoGPT are already available through the recently announced RunaAI API platform!
Serbian LLM eval results compared to Mistral 7B, LLaMA 2 7B, and GPT2-orao (also see this LinkedIn post):
The eval was computed using https://github.com/gordicaleksa/serbian-llm-eval
YugoGPT was trained on tens of billions of BCS tokens and is based on Mistral 7B.
Notes
YugoGPT is a base model and therefore does not have any moderation mechanisms.
Since it's a base model, it won't follow your instructions: it's just a powerful autocomplete engine.
If you want access to much more powerful BCS LLMs (some of which are powering yugochat), you can access the models through RunaAI's API.
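Because it's a base model, you prompt it with the beginning of a text and let it continue, rather than giving it instructions. A minimal sketch using the Hugging Face transformers library (the model id gordicaleksa/YugoGPT is taken from the citation below; the prompt text and generation settings are illustrative, not prescribed by this repo):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model id assumed from the Hugging Face URL in the citation below.
model_name = "gordicaleksa/YugoGPT"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype="auto")

# A base model completes text, so give it a sentence to continue,
# not an instruction. (Example prompt in Serbian: "The largest city in Serbia is")
prompt = "Najveći grad u Srbiji je"
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding for a deterministic completion; sampling also works.
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=False)
completion = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(completion)
```

Since there are no moderation or instruction-following mechanisms, any chat-style behavior has to come from the more powerful models behind RunaAI's API mentioned above.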
Credits
The data for the project was obtained with the help of Nikola Ljubešić, CLARIN.SI, and CLASSLA. Thank you!
Project Sponsors
A big thank you to the project sponsors!
Platinum sponsors 🌟
- Ivan (anon)
- Things Solver
Gold sponsors 🟡
- qq (anon)
- Adam Sofronijevic
- Yanado
- Mitar Perovic
- Nikola Ivancevic
- Rational Development DOO
- Ivan and Natalija Kokić
Silver sponsors ⚪
psk.rs, OmniStreak, Luka Važić, Miloš Durković, Marjan Radeski, Marjan Stankovic, Nikola Stojiljkovic, Mihailo Tomić, Bojan Jevtic, Jelena Jovanović, Nenad Davidović, Mika Tasich, TRENCH-NS, Nemanja Grujičić, tim011
Also a big thank you to the following individuals:
- Slobodan Marković - for spreading the word! :)
- Aleksander Segedi - for help around bookkeeping!
Citation
@misc{YugoGPT,
  author = {Gordić, Aleksa},
  title = {YugoGPT - an open-source LLM for Serbian, Bosnian, and Croatian languages},
  year = {2024},
  howpublished = {\url{https://huggingface.co/gordicaleksa/YugoGPT}},
}