Tags: Fill-Mask · Transformers · PyTorch · Russian · roberta · Inference Endpoints

SyntheticRoBERTa

This is a pre-trained RoBERTa model. SyntheticRoBERTa was trained on a dataset of synthetic Russian texts generated according to rules written in a context-free grammar.

Evaluation

This model was evaluated on the RussianSuperGLUE tasks:

| Task    | Result        | Metric                           |
|---------|---------------|----------------------------------|
| LiDiRus | 0.0           | Matthews Correlation Coefficient |
| RCB     | 0.091 / 0.158 | F1 / Accuracy                    |
| PARus   | 0.502         | Accuracy                         |
| TERRa   | 0.487         | Accuracy                         |
| RUSSE   | 0.587         | Accuracy                         |
| RWSD    | 0.331         | Accuracy                         |
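For reference, LiDiRus is scored with the Matthews Correlation Coefficient rather than plain accuracy. A minimal pure-Python sketch of the metric for binary labels (the helper name is illustrative, not part of this repository):

```python
import math

def matthews_corrcoef(y_true, y_pred):
    """Matthews Correlation Coefficient for binary (0/1) labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    # MCC is undefined when a row or column of the confusion matrix
    # is empty; 0.0 (chance level) is a common convention in that case.
    return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom

print(round(matthews_corrcoef([1, 1, 0, 0], [1, 0, 0, 0]), 3))  # → 0.577
```

An MCC of 0.0, as reported for LiDiRus above, corresponds to predictions no better than chance; +1.0 is perfect agreement and -1.0 perfect disagreement.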
