
# SyntheticRoBERTa

This is a pre-trained RoBERTa-type model. SyntheticRoBERTa was trained on a dataset of Russian texts generated according to rules written in a context-free grammar.
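A minimal usage sketch follows, assuming the checkpoint is a standard RoBERTa masked-language model loadable with the Hugging Face `transformers` library and that it uses RoBERTa's usual `<mask>` token; this is an illustration, not code from the original card.

```python
# Minimal sketch: load the model and run fill-mask on a Russian sentence.
# Assumes a standard RoBERTa MLM checkpoint hosted as tay-yozhik/SyntheticRoBERTa.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_id = "tay-yozhik/SyntheticRoBERTa"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# <mask> is RoBERTa's standard mask token (an assumption for this checkpoint).
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for pred in fill("Это <mask> предложение."):
    print(pred["token_str"], round(pred["score"], 3))
```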

## Evaluation

This model was evaluated on the Russian SuperGLUE benchmark tasks:

| Task    | Result        | Metric                           |
|---------|---------------|----------------------------------|
| LiDiRus | 0.0           | Matthews Correlation Coefficient |
| RCB     | 0.091 / 0.158 | F1 / Accuracy                    |
| PARus   | 0.502         | Accuracy                         |
| TERRa   | 0.487         | Accuracy                         |
| RUSSE   | 0.587         | Accuracy                         |
| RWSD    | 0.331         | Accuracy                         |
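For reference on the LiDiRus row: the Matthews correlation coefficient ranges from -1 to 1, and 0.0 corresponds to chance-level performance. A brief illustration of the metric using scikit-learn (not part of the original evaluation pipeline):

```python
# Illustrative only: computing the Matthews correlation coefficient
# from predicted and true labels with scikit-learn.
from sklearn.metrics import matthews_corrcoef

y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 0, 0, 1, 0, 1]
# Prints ~0.707 here; 1.0 = perfect agreement, 0.0 = chance level.
print(matthews_corrcoef(y_true, y_pred))
```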
