---
language: ja
tags:
- t5
- text2text-generation
- seq2seq
license: apache-2.0
datasets:
- mc4
- wiki40b
---

# t5-base-japanese-web (with Byte-fallback)

## Description

[megagonlabs/t5-base-japanese-web](https://huggingface.co/megagonlabs/t5-base-japanese-web) is a T5 (Text-to-Text Transfer Transformer) model pre-trained on Japanese web texts.

### Corpora

- Japanese in [mC4/3.0.1](https://github.com/allenai/allennlp/discussions/5056)
- Japanese in [wiki40b/1.3.0](https://www.tensorflow.org/datasets/catalog/wiki40b#wiki40bja)

### Tokenizer

SentencePiece trained on Japanese Wikipedia

- Vocabulary size: 32,000
- [Byte-fallback](https://github.com/google/sentencepiece/releases/tag/v0.1.9): Enabled

### Parameters

- T5 model: [models/t5.1.1.base.gin](https://github.com/google-research/text-to-text-transfer-transformer/blob/main/t5/models/gin/models/t5.1.1.base.gin)
- Training steps: 1,000,000

## Documents

- [Pre-training of T5 with TPU](docs/mC4_wiki40b.md)

## Links

- Repositories
  - [T5](https://github.com/google-research/text-to-text-transfer-transformer)
  - [mT5](https://github.com/google-research/multilingual-t5)
- Related models
  - [Pre-trained Japanese T5 model (sonoisa/t5-base-japanese)](https://huggingface.co/sonoisa/t5-base-japanese)
  - [Pre-trained Japanese T5 model (sonoisa/t5-base-japanese-mC4-Wikipedia)](https://huggingface.co/sonoisa/t5-base-japanese-mC4-Wikipedia)
- Articles
  - [Part 7: Evaluating Text Generation with T5 (February 26, 2020)](https://www.ogis-ri.co.jp/otc/hiroba/technical/similar-document-search/part7.html)
  - [Part 8: Evaluating Text Generation with T5, Continued (April 23, 2020)](https://www.ogis-ri.co.jp/otc/hiroba/technical/similar-document-search/part8.html)
  - [Part 14: Trying T5 with Hugging Face Transformers (April 22, 2021)](https://www.ogis-ri.co.jp/otc/hiroba/technical/similar-document-search/part14.html)

## License

Apache License 2.0
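
## Usage sketch

A minimal loading example, not part of the original card: it assumes the checkpoint works with the standard T5 classes in Hugging Face Transformers (plus the `sentencepiece` package for the tokenizer). The sentinel-token input below is purely illustrative of T5's span-corruption pre-training format; since this checkpoint is pre-trained only, it is typically fine-tuned on a downstream task before being used for generation.

```python
# Sketch only: loading the pre-trained checkpoint with Hugging Face Transformers.
# Requires: pip install transformers sentencepiece
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "megagonlabs/t5-base-japanese-web"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Illustrative span-corruption style input using a T5 sentinel token;
# raw outputs from the non-fine-tuned model are not expected to be polished.
text = "東京は日本の<extra_id_0>です。"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```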