RobeCzech model

RobeCzech is a monolingual RoBERTa language representation model trained on Czech data.

The RobeCzech model is released publicly at LINDAT and Hugging Face.
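Since the model is available on the Hugging Face Hub as ufal/robeczech-base, it can be loaded with the standard transformers API. The following is a minimal loading sketch, assuming the Hugging Face transformers library is installed; it is an illustration, not an official usage recipe from the authors.

```python
# Minimal loading sketch, assuming the Hugging Face transformers library is installed.
from transformers import AutoTokenizer, AutoModelForMaskedLM

# "ufal/robeczech-base" is the model identifier on the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("ufal/robeczech-base")
model = AutoModelForMaskedLM.from_pretrained("ufal/robeczech-base")
```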

Please cite the corresponding publication:

  • Milan Straka, Jakub Náplava, Jana Straková and David Samuel: Czech RoBERTa, a monolingual contextualized language representation model. Accepted to TSD 2021.

A preprint of the paper is available at https://arxiv.org/abs/2105.11314.

The model uses <mask> as its mask token.
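Because RobeCzech is a masked language model, it can be queried through the fill-mask pipeline with the <mask> token. Below is a minimal sketch, again assuming the transformers library is installed; the Czech example sentence is illustrative only.

```python
# Minimal fill-mask sketch; the Czech example sentence is illustrative only.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="ufal/robeczech-base")

# RobeCzech uses <mask> as its mask token.
for prediction in fill_mask("Praha je hlavní <mask> České republiky."):
    print(prediction["token_str"], prediction["score"])
```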
