### t5-base fine-tuned on the XSum dataset

#### train args
- max_input_length: 512
- max_tgt_length: 128
- epochs: 3
- optimizer: AdamW
- lr: 2e-5
- weight_decay: 1e-3
- fp16: False
- prefix: "summarize: "
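As a rough illustration, the arguments above map onto a Hugging Face `Seq2SeqTrainer` setup along these lines. This is a sketch, not the exact training script: the batch size and `output_dir` are assumptions (the card does not list them), and the XSum column names come from the public dataset.

```python
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
)

raw = load_dataset("xsum")  # columns: "document", "summary"
tokenizer = AutoTokenizer.from_pretrained("t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

def preprocess(batch):
    # prepend the task prefix and truncate to the lengths listed above
    inputs = tokenizer(
        ["summarize: " + doc for doc in batch["document"]],
        max_length=512,
        truncation=True,
    )
    labels = tokenizer(text_target=batch["summary"], max_length=128, truncation=True)
    inputs["labels"] = labels["input_ids"]
    return inputs

tokenized = raw.map(preprocess, batched=True, remove_columns=raw["train"].column_names)

# Trainer uses AdamW by default, matching the optimizer listed above
args = Seq2SeqTrainingArguments(
    output_dir="t5-base-xsum",  # assumption: output path not given on the card
    num_train_epochs=3,
    learning_rate=2e-5,
    weight_decay=1e-3,
    fp16=False,
    per_device_train_batch_size=8,  # assumption: batch size not given on the card
    evaluation_strategy="epoch",
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```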
#### performance
- train_loss: 0.5976
- eval_loss: 0.5340
- eval_rouge1: 34.6791
- eval_rouge2: 12.8236
- eval_rougeL: 28.1201
- eval_rougeLsum: 28.1241
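Scores in this form (ROUGE F-measure scaled by 100) can be recomputed with the `evaluate` library. The sketch below is an assumption about how such numbers are obtained, and the predictions/references are placeholders, not model output:

```python
import evaluate

rouge = evaluate.load("rouge")

# placeholders: generated summaries and reference summaries (e.g. xsum validation split)
predictions = ["the cat sat on the mat"]
references = ["a cat was sitting on the mat"]

scores = rouge.compute(predictions=predictions, references=references)
# evaluate returns fractions in [0, 1]; scale by 100 to match the numbers above
print({k: round(v * 100, 4) for k, v in scores.items()})
```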
#### usage
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
```
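A fuller inference sketch follows. The checkpoint name is a placeholder for wherever this model is saved, and the generation settings (beam search, length limits) are assumptions rather than values from this card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# placeholder: replace with the actual checkpoint directory or hub ID
model_name = "path/to/t5-base-finetuned-xsum"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

article = "The full text of a news article to summarize ..."

# the model was trained with the "summarize: " prefix, so prepend it at inference
inputs = tokenizer(
    "summarize: " + article,
    max_length=512,
    truncation=True,
    return_tensors="pt",
)

# beam search settings here are assumptions, not from the training config
summary_ids = model.generate(**inputs, max_length=128, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```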
#### dependency
- trained with `transformers==4.24`
- compatible with `transformers==3.0.2`