
A multilingual language model (XLM-RoBERTa architecture) trained from scratch for our AACL 2022 paper Cross-lingual Similarity of Multilingual Representations Revisited.

Paper (model and training description): https://aclanthology.org/2022.aacl-main.15/
GitHub repo: https://github.com/delmaksym/xsim#cross-lingual-similarity-of-multilingual-representations-revisited
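A minimal usage sketch for loading the checkpoint. The `AutoModel`/`AutoTokenizer` loaders are the standard transformers API, but compatibility with this particular Hub repo (`delmaksym/aacl22.scale_normformer`) is an assumption; see the GitHub repo above for the authors' own loading code.

```python
# Sketch: assumes the Hub checkpoint loads with the standard
# transformers Auto* classes (not verified against this repo).
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "delmaksym/aacl22.scale_normformer"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("Hello, world!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Token representations: (batch, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```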

