ryo0634 committed
Commit ddb832c
1 Parent(s): acc1f9f

Create README.md

Files changed (1): README.md added (+63, -0)

---
language:
- multilingual
- ar
- bn
- de
- el
- en
- es
- fi
- fr
- hi
- id
- it
- ja
- ko
- nl
- pl
- pt
- ru
- sv
- sw
- te
- th
- tr
- vi
- zh
thumbnail: https://github.com/studio-ousia/luke/raw/master/resources/luke_logo.png
tags:
- luke
- named entity recognition
- relation classification
- question answering
license: apache-2.0
---

## mLUKE

**mLUKE** (multilingual LUKE) is a multilingual extension of LUKE.

Please check the [official repository](https://github.com/studio-ousia/luke) for more details and updates.

This is the mLUKE base model, with 12 hidden layers and a hidden size of 768. The model has 279M parameters in total. It was initialized with the weights of XLM-RoBERTa (base) and trained on the December 2020 version of Wikipedia in 24 languages.

This model is a lightweight version of [studio-ousia/mluke-base](https://huggingface.co/studio-ousia/mluke-base): it ships without the Wikipedia entity embeddings and keeps only special entities such as `[MASK]`.
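
Below is a minimal usage sketch. It assumes this checkpoint is published as `studio-ousia/mluke-base-lite` (inferred from the repository, not stated above) and that a recent `transformers` release with LUKE support is installed. When `entity_spans` is given without an explicit `entities` argument, the tokenizer fills the entity sequence with the `[MASK]` entity, which is the only entity embedding this lightweight model carries:

```python
from transformers import MLukeTokenizer, LukeModel

# Assumed repository id for this lightweight checkpoint.
model_name = "studio-ousia/mluke-base-lite"

tokenizer = MLukeTokenizer.from_pretrained(model_name)
model = LukeModel.from_pretrained(model_name)

text = "Tokyo is the capital of Japan."
entity_spans = [(0, 5)]  # character span covering "Tokyo"

# No `entities` argument: the tokenizer fills the entity sequence
# with the special [MASK] entity.
inputs = tokenizer(text, entity_spans=entity_spans, return_tensors="pt")
outputs = model(**inputs)

print(outputs.last_hidden_state.shape)         # contextualized word tokens
print(outputs.entity_last_hidden_state.shape)  # contextualized entity tokens
```

For downstream tasks such as named entity recognition or relation classification, these `[MASK]` entity representations are typically fed into a task-specific classification head; see the official repository for fine-tuning code.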

### Citation

If you find mLUKE useful for your work, please cite the following paper:

```latex
@inproceedings{ri-etal-2022-mluke,
    title = "m{LUKE}: {T}he Power of Entity Representations in Multilingual Pretrained Language Models",
    author = "Ri, Ryokan  and
      Yamada, Ikuya  and
      Tsuruoka, Yoshimasa",
    booktitle = "Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
    year = "2022",
    url = "https://aclanthology.org/2022.acl-long.505",
}
```