Migrate model card from transformers-repo
Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/Cinnamon/electra-small-japanese-discriminator/README.md
README.md (ADDED, 20 lines):
---
language: ja
license: apache-2.0
---

## Japanese ELECTRA-small

We provide a Japanese **ELECTRA-Small** model, as described in [ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators](https://openreview.net/pdf?id=r1xMH1BtvB).

Our pretraining process employs subword units derived from the [Japanese Wikipedia](https://dumps.wikimedia.org/jawiki/latest), using the [Byte-Pair Encoding](https://www.aclweb.org/anthology/P16-1162.pdf) method and building on an initial tokenization with [mecab-ipadic-NEologd](https://github.com/neologd/mecab-ipadic-neologd). For optimal performance, please take care to set your MeCab dictionary appropriately.

## How to use the discriminator in `transformers`
```python
from transformers import BertJapaneseTokenizer, ElectraForPreTraining

# Point MeCab at the NEologd dictionary (adjust the path to your installation)
tokenizer = BertJapaneseTokenizer.from_pretrained('Cinnamon/electra-small-japanese-discriminator', mecab_kwargs={"mecab_option": "-d /usr/lib/x86_64-linux-gnu/mecab/dic/mecab-ipadic-neologd"})

model = ElectraForPreTraining.from_pretrained('Cinnamon/electra-small-japanese-discriminator')
```