
t5_8_3e-5_datav2_min30_lp2_sample

This model is a fine-tuned version of KETI-AIR/ke-t5-large-ko on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 6.2375
  • Rouge1: 24.1102
  • Rouge2: 5.3137
  • RougeL: 16.1086
  • Bleu1: 18.6424
  • Bleu2: 8.0483
  • Bleu3: 2.7046
  • Bleu4: 0.7308
  • Gen Len: 36.4012

Model description

More information needed

Intended uses & limitations

More information needed
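
Although the intended uses are not yet documented, this is a T5-family sequence-to-sequence checkpoint fine-tuned from KETI-AIR/ke-t5-large-ko, so it can in principle be loaded with the standard `transformers` seq2seq classes. The sketch below is a minimal, hedged example: the repository id, the Korean input sentence, and the generation settings (beam search, max length) are placeholders and assumptions, not values documented in this card.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder repository id -- substitute the actual Hub path of this checkpoint.
model_id = "YOUR_USERNAME/t5_8_3e-5_datav2_min30_lp2_sample"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Illustrative Korean input; the actual task and prompt format are undocumented.
text = "요약할 한국어 문서를 여기에 입력하세요."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

# Generation settings are assumptions, not values taken from this card.
outputs = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```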

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 10.0
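
For reference, the hyperparameters above map roughly onto the following `Seq2SeqTrainingArguments` sketch. This is a reconstruction under assumptions rather than the original training script: the output directory is a placeholder, and any setting not listed above is left at the library default.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of training arguments matching the hyperparameters listed above.
# output_dir is a placeholder; unlisted settings keep transformers defaults.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5_8_3e-5_datav2_min30_lp2_sample",
    learning_rate=3e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10.0,
)
```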

Training results

| Training Loss | Epoch | Step  | Validation Loss | Rouge1  | Rouge2 | RougeL  | Bleu1   | Bleu2  | Bleu3  | Bleu4  | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:------:|:-------:|:-------:|:------:|:------:|:------:|:-------:|
| 4.1641        | 1.04  | 5000  | 6.8094          | 21.6187 | 4.959  | 14.8344 | 16.9553 | 7.4791 | 2.8017 | 1.1852 | 38.0426 |
| 3.1804        | 2.08  | 10000 | 5.6664          | 22.2631 | 5.127  | 15.5533 | 16.881  | 7.515  | 2.8628 | 1.0614 | 33.7325 |
| 2.779         | 3.12  | 15000 | 5.3350          | 22.5781 | 5.1137 | 15.7717 | 16.8632 | 7.3067 | 2.7117 | 0.9906 | 31.459  |
| 2.4111        | 4.15  | 20000 | 5.2687          | 24.4915 | 6.003  | 16.8096 | 18.5998 | 8.54   | 3.4084 | 1.1511 | 32.7477 |
| 2.2192        | 5.19  | 25000 | 5.3300          | 24.9661 | 6.0773 | 16.8486 | 19.0105 | 8.6794 | 3.4052 | 1.3281 | 32.9696 |
| 1.9306        | 6.23  | 30000 | 5.4806          | 24.8662 | 5.9711 | 16.235  | 19.2093 | 8.7044 | 3.2412 | 1.0675 | 35.0973 |
| 1.6696        | 7.27  | 35000 | 5.6865          | 24.3913 | 5.6936 | 16.4663 | 18.5884 | 8.3035 | 2.9593 | 1.0997 | 34.617  |
| 1.4566        | 8.31  | 40000 | 5.8677          | 24.9166 | 5.8251 | 16.647  | 19.0703 | 8.5159 | 3.3477 | 1.1257 | 35.1763 |
| 1.2808        | 9.35  | 45000 | 6.2375          | 24.1102 | 5.3137 | 16.1086 | 18.6424 | 8.0483 | 2.7046 | 0.7308 | 36.4012 |
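
The ROUGE and BLEU columns above appear to be reported on a 0–100 scale (raw scores multiplied by 100). Below is a hedged sketch of how such scores can be computed with the `evaluate` library; the prediction and reference strings are made-up placeholders, and the exact metric configuration used for this card is not documented.

```python
import evaluate

# Placeholder predictions/references -- illustrative only.
predictions = ["모델이 생성한 요약 문장입니다."]
references = ["사람이 작성한 정답 요약 문장입니다."]

rouge = evaluate.load("rouge")
bleu = evaluate.load("bleu")

# ROUGE-1/2/L, scaled to the 0-100 range used in the table above.
rouge_scores = rouge.compute(predictions=predictions, references=references)
print({k: round(v * 100, 4) for k, v in rouge_scores.items()})

# Bleu1..Bleu4 correspond to limiting the maximum n-gram order.
for n in range(1, 5):
    score = bleu.compute(
        predictions=predictions,
        references=[[r] for r in references],
        max_order=n,
    )
    print(f"Bleu{n}: {score['bleu'] * 100:.4f}")
```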

Framework versions

  • Transformers 4.25.1
  • Pytorch 1.13.0+cu117
  • Datasets 2.7.1
  • Tokenizers 0.13.2
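
A quick way to check that a local environment matches the versions above is to print the installed package versions, as in the sketch below; it only reports what is installed and does not pin or install anything.

```python
import datasets
import tokenizers
import torch
import transformers

# Expected versions from this card: Transformers 4.25.1, PyTorch 1.13.0+cu117,
# Datasets 2.7.1, Tokenizers 0.13.2.
for name, module in [
    ("Transformers", transformers),
    ("PyTorch", torch),
    ("Datasets", datasets),
    ("Tokenizers", tokenizers),
]:
    print(f"{name}: {module.__version__}")
```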