Commit 16bb5ff by hiroshi-matsuda-rit (parent: 1690dbf): Create README.md

Files changed (1): README.md (+34 −0)
---
language: ja
license: MIT
datasets:
- mC4 Japanese
---

# transformers-ud-japanese-electra-ginza (sudachitra-wordpiece, mC4 Japanese)

This is an [ELECTRA](https://github.com/google-research/electra) model pretrained on approximately 200M Japanese sentences extracted from the [mC4](https://huggingface.co/datasets/mc4) dataset and fine-tuned by [spaCy v3](https://spacy.io/usage/v3) on [UD\_Japanese\_BCCWJ r2.8](https://universaldependencies.org/treebanks/ja_bccwj/index.html).
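
For context, ELECTRA pretrains a discriminator with replaced-token detection: a small generator corrupts some input tokens, and the discriminator predicts, for every position, whether the token is original or replaced. The following is a minimal toy sketch of that objective in pure Python (illustrative only; not the actual pretraining code, and the example sentence and token split are invented for demonstration):

```python
import math

def replaced_token_labels(original, corrupted):
    """Label each position 1 if the generator replaced the token, else 0."""
    return [int(o != c) for o, c in zip(original, corrupted)]

def discriminator_loss(probs, labels):
    """Mean binary cross-entropy over all token positions
    (the replaced-token-detection objective)."""
    eps = 1e-9  # guard against log(0)
    return -sum(
        y * math.log(p + eps) + (1 - y) * math.log(1 - p + eps)
        for p, y in zip(probs, labels)
    ) / len(labels)

# Toy example: the generator replaced one token (猫 -> 犬).
original  = ["吾輩", "は", "猫", "で", "ある"]
corrupted = ["吾輩", "は", "犬", "で", "ある"]
labels = replaced_token_labels(original, corrupted)  # [0, 0, 1, 0, 0]
```

Unlike masked-language-model pretraining, this loss is defined over every input position rather than only the masked ones, which is a key reason ELECTRA is comparatively sample-efficient.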

The base pretrained model is [megagonlabs/transformers-ud-japanese-electra-base-discriminator](https://huggingface.co/megagonlabs/transformers-ud-japanese-electra-base-discriminator).

## Licenses

The models are distributed under the terms of [The MIT License](https://opensource.org/licenses/mit-license.php).

## Citations

- [mC4](https://huggingface.co/datasets/mc4)

  Contains information from `mC4` which is made available under the [ODC Attribution License](https://opendatacommons.org/licenses/by/1-0/).

  ```
  @article{2019t5,
      author = {Colin Raffel and Noam Shazeer and Adam Roberts and Katherine Lee and Sharan Narang and Michael Matena and Yanqi Zhou and Wei Li and Peter J. Liu},
      title = {Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer},
      journal = {arXiv e-prints},
      year = {2019},
      archivePrefix = {arXiv},
      eprint = {1910.10683},
  }
  ```

- [UD\_Japanese\_BCCWJ r2.8](https://universaldependencies.org/treebanks/ja_bccwj/index.html)

  ```
  Asahara, M., Kanayama, H., Tanaka, T., Miyao, Y., Uematsu, S., Mori, S., Matsumoto, Y., Omura, M., & Murawaki, Y. (2018). Universal Dependencies Version 2 for Japanese. In LREC-2018.
  ```