---
license: agpl-3.0
language:
  - sr
---

The model was developed in support of the University of Belgrade doctoral dissertation "Composite pseudogrammars based on parallel language models of Serbian" by Mihailo Škorić.

This small GPT-2 model was trained on several corpora of Serbian, including the Corpus of Contemporary Serbian, SrpELTeC, and WikiKorpus by JeRTeh – Society for Language Resources and Technologies.
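
Below is a minimal usage sketch with the Hugging Face `transformers` library. The repository id `procesaur/gpt2-srlat` is assumed from the page path, and the Serbian prompt and generation settings are illustrative only.

```python
# Minimal sketch: load the model as a standard GPT-2 causal LM and generate text.
# The repo id "procesaur/gpt2-srlat" is an assumption inferred from the page path.
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("procesaur/gpt2-srlat")
model = AutoModelForCausalLM.from_pretrained("procesaur/gpt2-srlat")

# Encode a short Serbian prompt and sample a continuation.
inputs = tokenizer("Beograd je", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```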