This is an [ELECTRA](https://github.com/google-research/electra) Small model for Japanese, pretrained on 354 million sentences / 5.6 billion words of the [YACIS](https://github.com/ptaszynski/yacis-corpus) blog corpus.

The corpus was tokenized for pretraining with [MeCab](https://taku910.github.io/mecab/). Subword tokenization was performed with WordPiece.
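To illustrate the second step, here is a minimal sketch of greedy longest-match WordPiece subword tokenization, applied to each MeCab-segmented word. The toy vocabulary below is hypothetical; the real model uses a 32,000-token vocabulary.

```python
def wordpiece_tokenize(word, vocab, unk="[UNK]"):
    """Split one word into subwords by greedy longest-match WordPiece."""
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        piece = None
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub  # continuation pieces carry the ## prefix
            if sub in vocab:
                piece = sub  # longest matching piece from this position
                break
            end -= 1
        if piece is None:
            return [unk]  # no piece matched: the whole word is unknown
        pieces.append(piece)
        start = end
    return pieces

# Hypothetical toy vocabulary for demonstration only.
vocab = {"un", "aff", "##aff", "##able"}
print(wordpiece_tokenize("unaffable", vocab))  # → ['un', '##aff', '##able']
```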
## Model architecture

This model uses the original ELECTRA Small architecture: 12 layers, hidden states of 128 dimensions, and 12 attention heads.

The vocabulary size is 32,000 tokens.
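A back-of-envelope parameter count can be derived from the numbers above (12 layers, hidden size 128, 32,000-token vocabulary). The FFN inner size (4 × hidden) and the maximum position count (512) are assumptions not stated in this README; the attention head count does not affect the total.

```python
def electra_params(layers=12, hidden=128, vocab=32000,
                   max_pos=512, type_vocab=2):
    """Approximate transformer encoder parameter count (assumed config)."""
    # Embedding tables: token + position + token-type.
    emb = vocab * hidden + max_pos * hidden + type_vocab * hidden
    # Self-attention: Q, K, V, and output projections (weights + biases).
    attn = 4 * (hidden * hidden + hidden)
    # Feed-forward: two linear layers with inner size 4 * hidden (assumed).
    ffn = 2 * (hidden * 4 * hidden) + 4 * hidden + hidden
    # Two LayerNorms per layer, each with scale and bias vectors.
    norms = 2 * 2 * hidden
    return emb + layers * (attn + ffn + norms)

print(f"~{electra_params() / 1e6:.1f}M parameters")  # roughly 6-7M
```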
## Licenses

The pretrained model, with all attached files, is distributed under the terms of the [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/deed.en) license.
<a rel="license" href="http://creativecommons.org/licenses/by-sa/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by-sa/4.0/80x15.png" /></a><br />This work is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by-sa/4.0/">Creative Commons Attribution-ShareAlike 4.0 International License</a>.