---
license: cc-by-4.0
language:
- multilingual
- af
- am
- ar
- as
- az
- be
- bg
- bn
- br
- bs
- ca
- cs
- cy
- da
- de
- el
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- he
- hi
- hr
- hu
- hy
- id
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- ku
- ky
- la
- lo
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- my
- ne
- nl
- no
- om
- or
- pa
- pl
- ps
- pt
- ro
- ru
- sa
- sd
- si
- sk
- sl
- so
- sq
- sr
- su
- sv
- sw
- ta
- te
- th
- tl
- tr
- ug
- uk
- ur
- uz
- vi
- xh
- yi
- zh
- an
- ast
- ba
- bar
- inc
- ceb
- ce
- cv
- ht
- io
- roa
- nds
- lm
- min
- new
- nb
- nn
- oc
- pms
- sco
- scn
- aze
- tg
- tt
- ud
- vo
- war
- fry
- pnb
- yo
tags:
- multilingual
- bert
- roberta
- xlmr
- bm
---

## Model type

Transformer-based masked language model.

## Training data

No additional pretraining; the model is produced by merging two existing pretrained models.

## Languages

100+ languages.

## Architecture

Base architectures:

- XLM-RoBERTa base (multilingual)
- BERT base cased (multilingual)

## Merging technique

A custom merging technique combines the weights of both base models into a single unified model, as sketched below.
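This card does not specify the actual merging procedure. The following is a minimal sketch of one common approach (element-wise parameter averaging), assuming a 50/50 interpolation weight and that parameters are matched by name after stripping each model's prefix (`roberta.` / `bert.`). The checkpoint names `xlm-roberta-base` and `bert-base-multilingual-cased` are the standard Hugging Face releases of the two base architectures; the output path is hypothetical.

```python
# Sketch of weight averaging between XLM-R base and multilingual BERT base.
# The interpolation weight `alpha` and the name-matching rule are assumptions,
# not the card's (unspecified) actual technique.
from transformers import AutoModelForMaskedLM

xlmr = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")
mbert = AutoModelForMaskedLM.from_pretrained("bert-base-multilingual-cased")

def strip_prefix(name: str) -> str:
    # "roberta.encoder.layer.0..." and "bert.encoder.layer.0..." share the
    # same suffix once the model-type prefix is removed.
    return name.split(".", 1)[1] if "." in name else name

alpha = 0.5  # assumed interpolation weight
mbert_by_suffix = {strip_prefix(k): v for k, v in mbert.state_dict().items()}

merged_state = {}
for name, tensor in xlmr.state_dict().items():
    other = mbert_by_suffix.get(strip_prefix(name))
    if other is not None and other.shape == tensor.shape:
        # Encoder layers line up between the two architectures: average them.
        merged_state[name] = alpha * tensor + (1.0 - alpha) * other
    else:
        # Embeddings and LM heads differ in shape (different vocabularies),
        # so XLM-R's parameters are kept unchanged.
        merged_state[name] = tensor

xlmr.load_state_dict(merged_state)
xlmr.save_pretrained("merged-xlmr-mbert")  # hypothetical output path
```

Note that the two base models use different tokenizers and vocabulary sizes, so a real merge would also need to reconcile the embedding layers; the sketch above sidesteps this by keeping XLM-R's embeddings and tokenizer.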