This model can distinguish between positive and negative content.

This model was fine-tuned on Natsume Souseki's works, such as Kokoro, Bocchan, and Sanshiro.
# What is LUKE? [1]

LUKE (Language Understanding with Knowledge-based Embeddings) is a pre-trained contextualized representation of words and entities based on the transformer. LUKE treats words and entities in a given text as independent tokens and outputs contextualized representations of them. LUKE adopts an entity-aware self-attention mechanism, an extension of the transformer's self-attention mechanism that considers the type of each token (word or entity) when computing attention scores.
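As a concrete illustration, here is a toy, single-head sketch of entity-aware self-attention (written for this README, not the official LUKE implementation; multi-head attention, biases, and masking are omitted). The idea it shows is that the query projection is selected per (query type, key type) pair, while keys and values are shared:

```python
import torch
import torch.nn as nn

class EntityAwareSelfAttention(nn.Module):
    """Toy single-head version: one query matrix per (query type, key type) pair."""

    def __init__(self, hidden_size):
        super().__init__()
        # Four query matrices: word->word, word->entity, entity->word, entity->entity.
        self.q = nn.ModuleList(nn.Linear(hidden_size, hidden_size) for _ in range(4))
        self.k = nn.Linear(hidden_size, hidden_size)
        self.v = nn.Linear(hidden_size, hidden_size)
        self.scale = hidden_size ** 0.5

    def forward(self, x, token_types):
        # x: (seq, hidden); token_types: (seq,) with 0 = word, 1 = entity.
        keys, values = self.k(x), self.v(x)
        queries = torch.stack([q(x) for q in self.q])  # (4, seq, hidden)
        # pair[i, j] in {0, 1, 2, 3} encodes (type of query i, type of key j).
        pair = token_types.unsqueeze(1) * 2 + token_types.unsqueeze(0)
        rows = torch.arange(x.size(0)).unsqueeze(1)
        q_sel = queries[pair, rows]                    # (seq, seq, hidden)
        scores = (q_sel * keys.unsqueeze(0)).sum(-1) / self.scale
        return torch.softmax(scores, dim=-1) @ values

attn = EntityAwareSelfAttention(hidden_size=8)
x = torch.randn(5, 8)                    # five token embeddings
types = torch.tensor([0, 0, 1, 0, 1])    # a mix of word and entity tokens
out = attn(x, types)                     # (5, 8) contextualized representations
```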
LUKE achieves state-of-the-art results on five popular NLP benchmarks including SQuAD v1.1 (extractive question answering), CoNLL-2003 (named entity recognition), ReCoRD (cloze-style question answering), TACRED (relation classification), and Open Entity (entity typing).
# How to use
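This section does not yet include example code, so below is a minimal sketch of how a fine-tuned sequence-classification model is typically loaded with the Hugging Face transformers library. The model ID is a placeholder (substitute this repository's actual name), and the 0 = negative / 1 = positive label order is an assumption; check the model's `config.id2label`.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "Mizuiro-sakura/luke-finetuned-model"  # placeholder: use this repo's actual ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "私はこの小説がとても好きだ。"  # "I like this novel very much."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Assumes index 0 = negative, 1 = positive; verify against config.id2label.
label = logits.argmax(dim=-1).item()
print("positive" if label == 1 else "negative")
```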
# Citation
[1] @inproceedings{yamada2020luke,
  title={LUKE: Deep Contextualized Entity Representations with Entity-aware Self-attention},
  author={Ikuya Yamada and Akari Asai and Hiroyuki Shindo and Hideaki Takeda and Yuji Matsumoto},
  booktitle={EMNLP},
  year={2020}
}