---
language: ja
license: mit
datasets:
- mC4 Japanese
---

# transformers-ud-japanese-electra-ginza (sudachitra-wordpiece, mC4 Japanese)

This is an [ELECTRA](https://github.com/google-research/electra) model pretrained on approximately 200M Japanese sentences extracted from the [mC4](https://huggingface.co/datasets/mc4) corpus and fine-tuned with [spaCy v3](https://spacy.io/usage/v3) on [UD\_Japanese\_BCCWJ r2.8](https://universaldependencies.org/treebanks/ja_bccwj/index.html).
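
The fine-tuned dependency-parsing pipeline is used through spaCy. A minimal sketch follows; the package name `ja_ginza_electra` (the GiNZA v5 distribution of this model) is an assumption based on the "ginza" in the model name, not something stated on this card.

```python
# Minimal sketch: UD parsing with this model via spaCy.
# Assumes `pip install ginza ja_ginza_electra`; the package name
# `ja_ginza_electra` is an assumption, not confirmed by this card.
import spacy

nlp = spacy.load("ja_ginza_electra")
doc = nlp("銀座でランチをご一緒しましょう。")  # "Let's have lunch together in Ginza."
for token in doc:
    # Print surface form, lemma, UPOS tag, dependency label, and head index.
    print(token.i, token.orth_, token.lemma_, token.pos_, token.dep_, token.head.i)
```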

The base pretrained model is [megagonlabs/transformers-ud-japanese-electra-base-discriminator](https://huggingface.co/megagonlabs/transformers-ud-japanese-electra-base-discriminator).
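
For feature extraction with the pretrained discriminator alone, the standard Transformers loading pattern should apply. Below is a minimal sketch assuming the checkpoint loads as a regular ELECTRA model; the card names a SudachiTra wordpiece tokenizer, so plain `AutoTokenizer` may not work, and the input IDs here are hypothetical placeholders.

```python
# Minimal sketch: encoding token IDs with the base ELECTRA discriminator.
import torch
from transformers import ElectraModel

model = ElectraModel.from_pretrained(
    "megagonlabs/transformers-ud-japanese-electra-base-discriminator"
)
# A SudachiTra wordpiece tokenizer would normally produce these IDs;
# the values below are hypothetical placeholders for illustration.
input_ids = torch.tensor([[2, 1325, 3]])
with torch.no_grad():
    outputs = model(input_ids)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, hidden_size)
```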

## Licenses

The models are distributed under the terms of [The MIT License](https://opensource.org/licenses/mit-license.php).

## Citations

- [mC4](https://huggingface.co/datasets/mc4)

Contains information from `mC4` which is made available under the [ODC Attribution License](https://opendatacommons.org/licenses/by/1-0/).

```
@article{2019t5,
  author = {Colin Raffel and Noam Shazeer and Adam Roberts and Katherine Lee and Sharan Narang and Michael Matena and Yanqi Zhou and Wei Li and Peter J. Liu},
  title = {Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer},
  journal = {arXiv e-prints},
  year = {2019},
  archivePrefix = {arXiv},
  eprint = {1910.10683},
}
```

- [UD\_Japanese\_BCCWJ r2.8](https://universaldependencies.org/treebanks/ja_bccwj/index.html)

```
Asahara, M., Kanayama, H., Tanaka, T., Miyao, Y., Uematsu, S., Mori, S.,
Matsumoto, Y., Omura, M., & Murawaki, Y. (2018).
Universal Dependencies Version 2 for Japanese.
In LREC-2018.
```