Indonesian T5 Language Models
Indonesian T5 models pre-trained with nanoT5 and fine-tuned on IndoNLG tasks. GitHub: https://github.com/LazarusNLP/IndoT5/
LazarusNLP/IndoNanoT5-base
Baseline T5 model pre-trained on `uonlp/CulturaX` for 65k steps. Achieves an evaluation loss of 2.082 (PPL 8.02).
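A minimal loading sketch with Hugging Face `transformers` (standard T5 seq2seq usage is assumed here; it is not taken from the model card):

```python
# Minimal sketch: load the baseline checkpoint from the Hub.
# Standard seq2seq T5 usage is assumed; see the model card for details.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("LazarusNLP/IndoNanoT5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("LazarusNLP/IndoNanoT5-base")
```

The fine-tuned checkpoints below load the same way; only the task-specific input and generation settings change.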
LazarusNLP/IndoNanoT5-base-IndoSum
`LazarusNLP/IndoNanoT5-base` fine-tuned on IndoSum. State-of-the-art model on IndoSum; R1: 75.29, R2: 71.23, RL: 73.30.
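A hedged usage sketch for abstractive summarization (the input text and generation parameters are illustrative assumptions, not values from the model card):

```python
# Sketch: summarize an Indonesian article with the IndoSum checkpoint.
# Generation settings below are illustrative assumptions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "LazarusNLP/IndoNanoT5-base-IndoSum"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

article = "..."  # placeholder: an Indonesian news article goes here
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, num_beams=4, max_new_tokens=128)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```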
LazarusNLP/IndoNanoT5-base-Liputan6-Canonical
`LazarusNLP/IndoNanoT5-base` fine-tuned on Liputan6 Canonical. Competitive with IndoBART and mT5 Small; Canonical - R1: 39.76, R2: 22.29, RL: 33.46; Extreme - R1: 33.26, R2: 14.17, RL: 26.21.
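The same summarization pattern applies to this checkpoint; as a variant, a `pipeline`-based sketch (assumed usage, settings illustrative):

```python
# Sketch: the summarization pipeline wraps tokenization and generation.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="LazarusNLP/IndoNanoT5-base-Liputan6-Canonical",
)
article = "..."  # placeholder: an Indonesian news article goes here
print(summarizer(article, max_length=128)[0]["summary_text"])
```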
LazarusNLP/IndoNanoT5-base-TyDiQA
`LazarusNLP/IndoNanoT5-base` fine-tuned on TyDiQA. Outperforms IndoBART; EM: 58.94, F1: 72.19.
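A hedged question-answering sketch; the `question: ... context: ...` template is an assumption borrowed from common T5 QA setups, so the actual fine-tuning format may differ (check the model card):

```python
# Sketch: generative QA. The input template is an assumption, not
# confirmed by the model card.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "LazarusNLP/IndoNanoT5-base-TyDiQA"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

prompt = "question: Di mana ibu kota Indonesia? context: Jakarta adalah ibu kota Indonesia."
inputs = tokenizer(prompt, return_tensors="pt")
answer_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(answer_ids[0], skip_special_tokens=True))
```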
LazarusNLP/IndoNanoT5-base-XPersona
`LazarusNLP/IndoNanoT5-base` fine-tuned on XPersona `id`. State-of-the-art model on XPersona `id`; BLEU: 4.0669, SacreBLEU: 4.0669.
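A hedged chit-chat sketch; feeding the dialogue history as plain text is an assumption, since the fine-tuning may rely on a specific persona or turn-separator format:

```python
# Sketch: single-turn response generation. The plain-text history format
# and sampling settings are assumptions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "LazarusNLP/IndoNanoT5-base-XPersona"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

history = "Halo! Apa kabar?"
inputs = tokenizer(history, return_tensors="pt")
reply_ids = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.9)
print(tokenizer.decode(reply_ids[0], skip_special_tokens=True))
```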