Commit 2108463 (parent: 5b4c442): Update README.md

README.md (changed)
@@ -24,9 +24,8 @@ For example Kokoro, Bocchan, Sanshiro and so on...
 LUKE (Language Understanding with Knowledge-based Embeddings) is a new pre-trained contextualized representation of words and entities based on the transformer. LUKE treats words and entities in a given text as independent tokens, and outputs contextualized representations of them. LUKE adopts an entity-aware self-attention mechanism that is an extension of the self-attention mechanism of the transformer, and considers the types of tokens (words or entities) when computing attention scores.

 LUKE achieves state-of-the-art results on five popular NLP benchmarks including SQuAD v1.1 (extractive question answering), CoNLL-2003 (named entity recognition), ReCoRD (cloze-style question answering), TACRED (relation classification), and Open Entity (entity typing).

-luke-japanese is the Japanese version of LUKE, a knowledge-enhanced pre-trained Transformer model of words and entities. LUKE
+luke-japanese is the Japanese version of LUKE, a knowledge-enhanced pre-trained Transformer model of words and entities. LUKE treats words and entities as independent tokens and outputs contextualized representations of them.

-This model includes Wikipedia entity embeddings, which are not used in ordinary NLP tasks. For tasks that use only word inputs, please use the lite version.
 # how to use
 Step 0: install Python and PyTorch, and update transformers (versions that are too old do not include MLukeTokenizer)
 update transformers and install python and pytorch
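Step 0 in the diff above can be sketched as a quick environment check. Older transformers releases do not export `MLukeTokenizer`, so it is worth confirming it is available before trying to load the model; the helper name `has_mluke_tokenizer` below is our own, for illustration only.

```python
# Sketch of "step 0": verify that the installed transformers library
# is recent enough to provide MLukeTokenizer before loading the model.
import importlib.util


def has_mluke_tokenizer() -> bool:
    """Return True if the installed transformers package exposes MLukeTokenizer."""
    # If transformers is not installed at all, the check fails early.
    if importlib.util.find_spec("transformers") is None:
        return False
    import transformers

    # Old transformers versions do not define MLukeTokenizer.
    return hasattr(transformers, "MLukeTokenizer")
```

If the check returns False, `pip install --upgrade transformers` (and `pip install torch` for PyTorch) should bring the environment up to date, as the README's step 0 instructs.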