---
language: ja
tags:
  - t5
  - text2text-generation
  - seq2seq
license: apache-2.0
datasets:
  - mc4
  - wiki40b
---

# t5-base-japanese-web (with Byte-fallback)

## Description

megagonlabs/t5-base-japanese-web is a T5 (Text-to-Text Transfer Transformer) model pre-trained on Japanese web texts.
The training code is available on GitHub.
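
A minimal loading sketch, assuming the `transformers` and `sentencepiece` packages and the Hugging Face model ID `megagonlabs/t5-base-japanese-web`. Since this checkpoint is pre-trained only, fine-tuning is generally needed before its outputs are useful on a downstream task:

```python
# Minimal loading sketch (assumes: pip install transformers sentencepiece torch).
from transformers import T5Tokenizer, T5ForConditionalGeneration

MODEL_ID = "megagonlabs/t5-base-japanese-web"

tokenizer = T5Tokenizer.from_pretrained(MODEL_ID)
model = T5ForConditionalGeneration.from_pretrained(MODEL_ID)

# Encode a Japanese sentence and run generation as a smoke test.
# The pre-trained-only checkpoint has not been fine-tuned, so the
# generated text is not expected to be meaningful by itself.
inputs = tokenizer("こんにちは、世界!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```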

## Corpus

Japanese text drawn from [mC4](https://huggingface.co/datasets/mc4) and [wiki40b](https://huggingface.co/datasets/wiki40b), as listed in the `datasets` metadata above.

## Tokeniser

SentencePiece trained on Japanese Wikipedia. Byte-fallback (noted in the model name) lets pieces outside the SentencePiece vocabulary decompose into UTF-8 byte tokens instead of being mapped to `<unk>`.
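
A small sketch of what byte-fallback looks like in practice, assuming the same `transformers` setup as above:

```python
from transformers import T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("megagonlabs/t5-base-japanese-web")

# A character unlikely to be in the vocabulary (here "☃") is expected to
# fall back to UTF-8 byte pieces such as <0xE2> <0x98> <0x83> rather
# than collapsing to <unk>.
print(tokenizer.tokenize("日本語のテキストと☃"))
```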

## Parameters

## Related models

## License

Apache License 2.0