# aacl22.scale_pre

A multilingual language model (XLM-RoBERTa architecture) trained from scratch for our AACL 2022 paper *Cross-lingual Similarity of Multilingual Representations Revisited*.

- Paper (model and training description): https://aclanthology.org/2022.aacl-main.15/
- GitHub repo: https://github.com/delmaksym/xsim#cross-lingual-similarity-of-multilingual-representations-revisited
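
Below is a minimal usage sketch with Hugging Face Transformers, assuming the checkpoint is hosted under the hypothetical hub id `delmaksym/aacl22.scale_pre` and loads with the standard XLM-RoBERTa classes; adjust the repo id and tokenizer as needed for the actual release.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Assumption: replace with the actual Hugging Face hub id of this model.
model_id = "delmaksym/aacl22.scale_pre"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a sentence and extract the final-layer hidden states.
inputs = tokenizer("Hello, world!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

hidden_states = outputs.last_hidden_state  # shape: (1, seq_len, hidden_size)
print(hidden_states.shape)
```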