Mizuiro-sakura committed
Commit • 74135d6
Parent(s): da788c6
Update README.md

README.md CHANGED
@@ -8,8 +8,8 @@ license: mit
 # This model is based on Luke-japanese-base-lite
 This model is a fine-tuned model based on studio-ousia/Luke-japanese-base-lite.
 It can distinguish between positive and negative content.
-This model was fine-tuned using Natsume Soseki's works
-For example, Kokoro and Bocchan
+This model was fine-tuned using Natsume Soseki's works.
+For example, Kokoro, Bocchan, Sanshiro and so on.
 
 # What is LUKE? [1]
 LUKE (Language Understanding with Knowledge-based Embeddings) is a new pre-trained contextualized representation of words and entities based on the transformer. LUKE treats words and entities in a given text as independent tokens and outputs contextualized representations of them. LUKE adopts an entity-aware self-attention mechanism, an extension of the transformer's self-attention mechanism, which considers the types of tokens (words or entities) when computing attention scores.
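The entity-aware self-attention mentioned above can be sketched in a few lines. This is an illustrative single-head NumPy sketch of the idea (a separate query projection per (query-type, key-type) pair, so word-to-word, word-to-entity, entity-to-word, and entity-to-entity interactions each get their own matrix); all names and shapes here are hypothetical and do not reflect LUKE's actual implementation.

```python
# Minimal single-head sketch of entity-aware self-attention.
# The query projection depends on the token types of both the query
# and the key position; keys and values use shared projections.
# Matrix names and shapes are illustrative, not LUKE's actual code.
import numpy as np

def entity_aware_attention(x, token_types, Wq, Wk, Wv):
    """x: (n, d) token embeddings; token_types: length-n list of 'word'/'entity';
    Wq: dict mapping (query_type, key_type) -> (d, d) query matrix;
    Wk, Wv: shared (d, d) key and value matrices."""
    n, d = x.shape
    K = x @ Wk  # shared key projection
    V = x @ Wv  # shared value projection
    scores = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # The type-dependent query projection is the entity-aware part.
            q = x[i] @ Wq[(token_types[i], token_types[j])]
            scores[i, j] = q @ K[j] / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
    return weights @ V
```

If all four query matrices in `Wq` are identical, this reduces to standard scaled dot-product self-attention; the type-dependent choice is the only difference.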
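A model card like this usually benefits from a usage sketch. Below is one way such a sentiment classifier could be called through the Hugging Face `transformers` library; the repository id placeholder, the label names, and the label order (index 0 = negative, index 1 = positive) are all assumptions — check the model's `config.id2label` for the real mapping.

```python
# Illustrative sketch of using a sentiment classifier fine-tuned from
# studio-ousia/luke-japanese-base-lite. The repo id, label names, and
# label order are assumptions, not taken from this model's config.
import math

def label_from_logits(logits, labels=("negative", "positive")):
    """Map raw classifier logits to (label, probabilities) via softmax + argmax."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    return labels[probs.index(max(probs))], probs

def classify(text, model_id="<this-model's-repo-id>"):  # hypothetical id
    """Run the fine-tuned model on one sentence (requires transformers + torch)."""
    from transformers import AutoTokenizer, AutoModelForSequenceClassification
    import torch

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits[0].tolist()
    return label_from_logits(logits)
```

For example, `classify("この本はとても面白かった。")` would return the predicted label together with the class probabilities.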