mLUKE

mLUKE (multilingual LUKE) is a multilingual extension of LUKE.

Please check the official repository for more details and updates.

This is the mLUKE base model with 12 hidden layers and a hidden size of 768. The model has 279M parameters in total. It was initialized with the weights of XLM-RoBERTa (base) and trained on the December 2020 version of Wikipedia in 24 languages.

This model is a lightweight version of studio-ousia/mluke-base: it omits the Wikipedia entity embeddings and keeps only special entities such as [MASK]. A loading sketch follows below.
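As a minimal sketch, the model can be loaded with the Hugging Face transformers library via the LUKE model classes. The checkpoint ID `studio-ousia/mluke-base-lite` is an assumption based on the naming above; substitute the actual ID of this model card.

```python
# Assumes: pip install transformers sentencepiece torch
from transformers import MLukeTokenizer, LukeModel

# Hypothetical checkpoint ID for this lite model; adjust as needed.
model_id = "studio-ousia/mluke-base-lite"

tokenizer = MLukeTokenizer.from_pretrained(model_id)
model = LukeModel.from_pretrained(model_id)

# Encode a sentence and obtain contextualized token representations.
inputs = tokenizer("Tokyo is the capital of Japan.", return_tensors="pt")
outputs = model(**inputs)

# Hidden size of 768, matching the architecture described above.
print(outputs.last_hidden_state.shape)  # (1, sequence_length, 768)
```

Because the Wikipedia entity embeddings are omitted, entity-aware inputs are limited to the special entities (e.g. [MASK]) rather than the full Wikipedia entity vocabulary.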

Citation

If you find mLUKE useful for your work, please cite the following paper:

@inproceedings{ri-etal-2022-mluke,
    title = "m{LUKE}: {T}he Power of Entity Representations in Multilingual Pretrained Language Models",
    author = "Ri, Ryokan  and
      Yamada, Ikuya  and
      Tsuruoka, Yoshimasa",
    booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    year = "2022",
    url = "https://aclanthology.org/2022.acl-long.505",
}