# studio-ousia/mluke-large-lite

## mLUKE

mLUKE (multilingual LUKE) is a multilingual extension of LUKE.

This is the mLUKE large model with 24 hidden layers and a hidden size of 1,024. The total number of parameters in this model is 561M. The model was initialized with the weights of XLM-RoBERTa (large) and trained using the December 2020 version of Wikipedia in 24 languages.

This model is a lightweight version of studio-ousia/mluke-large: it contains no Wikipedia entity embeddings, only the special entities such as [MASK].
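As a minimal usage sketch, the snippet below loads the model with the standard `MLukeTokenizer` and `LukeModel` classes from Hugging Face `transformers` and encodes a sentence with one entity span. The example sentence and span are illustrative, not from the original card; because the lite model has no Wikipedia entity vocabulary, the tokenizer fills the entity sequence with the special [MASK] entity.

```python
from transformers import MLukeTokenizer, LukeModel

tokenizer = MLukeTokenizer.from_pretrained("studio-ousia/mluke-large-lite")
model = LukeModel.from_pretrained("studio-ousia/mluke-large-lite")

text = "Tokyo is the capital of Japan."  # illustrative example
entity_spans = [(0, 5)]  # character span covering "Tokyo"

# With entity_spans given and no explicit entities, the span is encoded
# with the special [MASK] entity embedding, which is all the lite model has.
inputs = tokenizer(text, entity_spans=entity_spans, return_tensors="pt")
outputs = model(**inputs)

print(outputs.last_hidden_state.shape)         # contextual word representations
print(outputs.entity_last_hidden_state.shape)  # representation of the entity span
```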

### Citation

If you find mLUKE useful for your work, please cite the following paper:

```bibtex
@inproceedings{ri-etal-2022-mluke,
    title = "m{LUKE}: {T}he Power of Entity Representations in Multilingual Pretrained Language Models",
    author = "Ri, Ryokan  and
      Yamada, Ikuya  and
      Tsuruoka, Yoshimasa",
    booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    month = may,
    year = "2022",
    address = "Dublin, Ireland",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.acl-long.505",
}
```