t5-vietnamese-summarization

This model is a fine-tuned version of pengold/t5-vietnamese-summarization on an unknown dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 4.6288
  • Rouge1: 0.4728
  • Rouge2: 0.1669
  • Rougel: 0.3049
  • Rougelsum: 0.3049
  • Gen Len: 18.7458
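
The exact evaluation script is not documented; the snippet below is a minimal sketch, assuming the ROUGE scores were computed with the Hugging Face evaluate library on decoded predictions and reference summaries. The example strings are placeholders, not data from the evaluation set.

```python
# Sketch only: assumes ROUGE was computed with the `evaluate` library.
import evaluate

rouge = evaluate.load("rouge")

predictions = ["tóm tắt do mô hình sinh ra"]           # decoded model outputs (placeholder)
references = ["tóm tắt tham chiếu do con người viết"]  # gold summaries (placeholder)

scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # keys: rouge1, rouge2, rougeL, rougeLsum
```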

Model description

More information needed

Intended uses & limitations

More information needed
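
Usage is not documented on this card; the snippet below is a minimal sketch of running the checkpoint for Vietnamese summarization, assuming the standard T5 seq2seq API. The generation settings (max_new_tokens, num_beams) and the absence of a task prefix are assumptions, not documented choices.

```python
# Minimal usage sketch; generation settings below are assumptions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "pengold/t5-vietnamese-summarization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "Đoạn văn bản tiếng Việt cần tóm tắt ..."  # Vietnamese text to summarize (placeholder)
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```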

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding Seq2SeqTrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 70
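
For reference, here is a hedged sketch of a Seq2SeqTrainingArguments configuration mirroring the hyperparameters above. The output_dir, evaluation_strategy, and predict_with_generate values are assumptions inferred from the per-epoch validation table, not settings taken from the original run; the Adam betas and epsilon listed above are the Trainer defaults.

```python
# Sketch mirroring the listed hyperparameters; commented values are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-vietnamese-summarization",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=70,
    evaluation_strategy="epoch",   # assumed from the per-epoch validation results
    predict_with_generate=True,    # assumed; needed to compute ROUGE during eval
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults.
)
```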

Training results

Training Loss Epoch Step Validation Loss Rouge1 Rouge2 Rougel Rougelsum Gen Len
5.2487 1.0 2007 5.0028 0.4671 0.1595 0.2994 0.2994 18.7618
5.217 2.0 4014 4.9802 0.4639 0.1569 0.2984 0.2983 18.7747
5.2191 3.0 6021 4.9685 0.4644 0.1594 0.2989 0.2989 18.7613
5.2254 4.0 8028 4.9477 0.4648 0.1586 0.2988 0.2987 18.7458
5.1735 5.0 10035 4.9366 0.4654 0.1593 0.2988 0.2987 18.761
5.1735 6.0 12042 4.9214 0.4676 0.1611 0.3004 0.3004 18.78
5.1653 7.0 14049 4.9095 0.4681 0.1616 0.3007 0.3007 18.7523
5.1154 8.0 16056 4.8971 0.4664 0.1598 0.3002 0.3001 18.7655
5.1232 9.0 18063 4.8882 0.4683 0.1612 0.3008 0.3008 18.761
5.0995 10.0 20070 4.8758 0.4709 0.1618 0.3021 0.302 18.7518
5.1012 11.0 22077 4.8689 0.4685 0.1616 0.3011 0.3009 18.7665
5.0916 12.0 24084 4.8486 0.4695 0.1623 0.3024 0.3023 18.7655
5.0559 13.0 26091 4.8409 0.4699 0.1631 0.3024 0.3023 18.7849
5.0633 14.0 28098 4.8326 0.4705 0.1613 0.302 0.302 18.7583
5.0335 15.0 30105 4.8243 0.4696 0.1612 0.3023 0.3022 18.7638
5.0271 16.0 32112 4.8046 0.4691 0.1618 0.3022 0.3022 18.7518
5.0045 17.0 34119 4.8060 0.4708 0.1629 0.3029 0.3028 18.7568
5.0072 18.0 36126 4.7945 0.4702 0.1633 0.3024 0.3023 18.776
4.9954 19.0 38133 4.7894 0.47 0.1639 0.3022 0.3021 18.7785
4.9994 20.0 40140 4.7773 0.4692 0.1625 0.3028 0.3027 18.7623
4.953 21.0 42147 4.7641 0.4682 0.162 0.3015 0.3014 18.757
4.9526 22.0 44154 4.7600 0.4703 0.1626 0.3023 0.3023 18.7625
4.9571 23.0 46161 4.7592 0.4698 0.1627 0.3025 0.3025 18.781
4.9324 24.0 48168 4.7511 0.4697 0.1631 0.3022 0.3021 18.769
4.9323 25.0 50175 4.7433 0.4723 0.1649 0.304 0.3039 18.7757
4.9381 26.0 52182 4.7378 0.4703 0.1629 0.3026 0.3026 18.7782
4.9288 27.0 54189 4.7454 0.4709 0.1627 0.3026 0.3026 18.7777
4.9131 28.0 56196 4.7222 0.471 0.1652 0.3037 0.3037 18.782
4.9005 29.0 58203 4.7241 0.4719 0.1638 0.3039 0.3038 18.778
4.9051 30.0 60210 4.7225 0.4715 0.1647 0.3037 0.3036 18.7668
4.8816 31.0 62217 4.7181 0.4701 0.1631 0.3029 0.3029 18.7416
4.8687 32.0 64224 4.7061 0.4705 0.1643 0.3032 0.3031 18.7625
4.8935 33.0 66231 4.7063 0.4697 0.1632 0.3028 0.3028 18.7458
4.88 34.0 68238 4.6984 0.471 0.164 0.3039 0.3039 18.7663
4.8473 35.0 70245 4.6934 0.4699 0.1636 0.3034 0.3033 18.7531
4.8613 36.0 72252 4.6863 0.4705 0.1631 0.303 0.303 18.7797
4.8491 37.0 74259 4.6847 0.4703 0.1638 0.3037 0.3037 18.78
4.8239 38.0 76266 4.6804 0.4707 0.1632 0.3032 0.3032 18.7802
4.8767 39.0 78273 4.6788 0.4703 0.1637 0.3027 0.3026 18.7446
4.8402 40.0 80280 4.6700 0.4699 0.1633 0.3028 0.3028 18.7516
4.8261 41.0 82287 4.6660 0.4699 0.1633 0.3029 0.3028 18.7369
4.8193 42.0 84294 4.6693 0.4711 0.1654 0.3039 0.3038 18.7421
4.8161 43.0 86301 4.6636 0.4707 0.1642 0.303 0.303 18.7595
4.832 44.0 88308 4.6619 0.4708 0.1646 0.3036 0.3035 18.7423
4.8304 45.0 90315 4.6575 0.4711 0.1651 0.3038 0.3037 18.7354
4.7958 46.0 92322 4.6543 0.4711 0.165 0.3032 0.3032 18.7189
4.804 47.0 94329 4.6541 0.4711 0.1656 0.3037 0.3036 18.7396
4.7968 48.0 96336 4.6495 0.4709 0.165 0.3034 0.3034 18.7411
4.7912 49.0 98343 4.6471 0.4718 0.1655 0.3041 0.3042 18.7361
4.7721 50.0 100350 4.6469 0.4723 0.1667 0.3047 0.3047 18.7309
4.7828 51.0 102357 4.6476 0.4712 0.1656 0.3044 0.3045 18.7446
4.7934 52.0 104364 4.6453 0.4707 0.1645 0.3035 0.3035 18.7329
4.7724 53.0 106371 4.6425 0.4715 0.1657 0.304 0.304 18.7403
4.7804 54.0 108378 4.6362 0.4711 0.1658 0.3041 0.3041 18.7488
4.792 55.0 110385 4.6363 0.4706 0.1653 0.3038 0.3038 18.7281
4.7528 56.0 112392 4.6357 0.4724 0.1667 0.3044 0.3044 18.7463
4.7849 57.0 114399 4.6346 0.472 0.1661 0.3041 0.304 18.7431
4.7618 58.0 116406 4.6332 0.472 0.167 0.3046 0.3046 18.7336
4.7841 59.0 118413 4.6287 0.4716 0.1664 0.3043 0.3043 18.7369
4.7764 60.0 120420 4.6316 0.473 0.1666 0.3048 0.3047 18.7548
4.7504 61.0 122427 4.6276 0.4721 0.1671 0.3043 0.3044 18.7371
4.7629 62.0 124434 4.6250 0.4726 0.167 0.3046 0.3046 18.76
4.7764 63.0 126441 4.6264 0.4725 0.1666 0.3044 0.3044 18.7446
4.7524 64.0 128448 4.6275 0.4719 0.166 0.3041 0.3041 18.7428
4.7641 65.0 130455 4.6288 0.4728 0.1669 0.3049 0.3049 18.7458

Framework versions

  • Transformers 4.33.2
  • Pytorch 2.0.1+cu117
  • Datasets 2.14.5
  • Tokenizers 0.13.3
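
As an optional check (not part of the original card), the locally installed versions can be compared against the ones listed above:

```python
# Prints installed versions to compare against the card's framework versions.
import datasets
import tokenizers
import torch
import transformers

print("transformers", transformers.__version__)  # card: 4.33.2
print("torch", torch.__version__)                # card: 2.0.1+cu117
print("datasets", datasets.__version__)          # card: 2.14.5
print("tokenizers", tokenizers.__version__)      # card: 0.13.3
```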