---
license: apache-2.0
base_model: Hasanur525/deed_summarization_mt5_version_1
tags:
  - generated_from_trainer
metrics:
  - rouge
model-index:
  - name: mt5-deed-sum
    results: []
---

# mt5-deed-sum

This model is a fine-tuned version of [Hasanur525/deed_summarization_mt5_version_1](https://huggingface.co/Hasanur525/deed_summarization_mt5_version_1) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.4953
- Rouge1: 1.5754
- Rouge2: 1.087
- Rougel: 1.5005
- Rougelsum: 1.4211
- Gen Len: 310.6981
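
The snippet below is a minimal inference sketch rather than code from the training repository: it assumes the checkpoint is published on the Hub as `Hasanur525/mt5-deed-sum` with its tokenizer saved alongside it, and the input text, truncation length, and generation settings are placeholders.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "Hasanur525/mt5-deed-sum"  # assumed Hub id; adjust to the actual repository

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

deed_text = "..."  # placeholder: the deed document to summarize

# Tokenize, generate, and decode the summary. Lengths are assumptions,
# chosen to roughly match the ~310-token average generation length above.
inputs = tokenizer(deed_text, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(**inputs, max_new_tokens=512, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```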

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 5000
- num_epochs: 22
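
As a rough guide, here is a sketch of how these settings map onto `Seq2SeqTrainingArguments` in Transformers 4.37. The output directory, evaluation strategy, and `predict_with_generate` flag are assumptions not recorded in this card; the Adam betas and epsilon listed above are the optimizer defaults.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch only: values mirror the list above; commented fields are assumptions.
training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-deed-sum",        # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=5000,
    num_train_epochs=22,
    evaluation_strategy="epoch",      # assumed from the per-epoch results below
    predict_with_generate=True,       # assumed; needed for ROUGE and Gen Len metrics
)
```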

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len  |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:--------:|
| 0.0915        | 1.0   | 375  | 0.5844          | 0.7311 | 0.4193 | 0.7311 | 0.7311    | 289.3396 |
| 0.9545        | 2.0   | 750  | 0.5858          | 0.6289 | 0.444  | 0.6289 | 0.6289    | 291.5912 |
| 0.8026        | 3.0   | 1125 | 0.5817          | 1.1119 | 0.6733 | 1.067  | 1.0428    | 295.0692 |
| 0.2525        | 4.0   | 1500 | 0.5698          | 0.7311 | 0.4193 | 0.7311 | 0.7311    | 299.7987 |
| 1.5794        | 5.0   | 1875 | 0.5685          | 0.8096 | 0.4733 | 0.7714 | 0.7549    | 286.0126 |
| 0.0558        | 6.0   | 2250 | 0.5701          | 0.5003 | 0.3431 | 0.5003 | 0.4785    | 301.6855 |
| 0.4973        | 7.0   | 2625 | 0.5521          | 1.1281 | 0.7349 | 0.9983 | 0.9983    | 295.0692 |
| 1.1935        | 8.0   | 3000 | 0.5661          | 1.3444 | 0.9964 | 1.2673 | 1.2213    | 324.3648 |
| 0.0752        | 9.0   | 3375 | 0.5531          | 1.4883 | 1.0199 | 1.4252 | 1.3979    | 301.0377 |
| 0.216         | 10.0  | 3750 | 0.5573          | 1.5516 | 1.0371 | 1.5047 | 1.4656    | 319.195  |
| 0.3619        | 11.0  | 4125 | 0.5571          | 1.2368 | 0.8055 | 1.2326 | 1.2146    | 294.4717 |
| 0.1881        | 12.0  | 4500 | 0.5293          | 1.2922 | 0.941  | 1.2149 | 1.2084    | 305.9057 |
| 0.2247        | 13.0  | 4875 | 0.5340          | 1.0581 | 0.594  | 0.9989 | 0.987     | 306.3774 |
| 0.0715        | 14.0  | 5250 | 0.5211          | 1.2905 | 0.8861 | 1.259  | 1.2143    | 321.6226 |
| 0.1851        | 15.0  | 5625 | 0.5231          | 1.4625 | 0.9737 | 1.3919 | 1.3637    | 318.4969 |
| 0.5285        | 16.0  | 6000 | 0.5154          | 1.1892 | 0.8552 | 1.1401 | 1.1061    | 313.2138 |
| 0.0482        | 17.0  | 6375 | 0.5032          | 1.1826 | 0.8687 | 1.1554 | 1.1554    | 327.1824 |
| 0.0733        | 18.0  | 6750 | 0.5193          | 1.6133 | 1.1373 | 1.5626 | 1.5085    | 317.8113 |
| 0.2814        | 19.0  | 7125 | 0.5007          | 1.5689 | 1.1133 | 1.5189 | 1.4606    | 307.7421 |
| 0.0672        | 20.0  | 7500 | 0.4959          | 1.5754 | 1.078  | 1.489  | 1.4166    | 316.6164 |
| 0.2456        | 21.0  | 7875 | 0.4966          | 1.5754 | 1.087  | 1.5005 | 1.4211    | 314.3396 |
| 0.0405        | 22.0  | 8250 | 0.4953          | 1.5754 | 1.087  | 1.5005 | 1.4211    | 310.6981 |
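
The ROUGE and Gen Len columns come from the trainer's metric function. Below is a minimal sketch of such a `compute_metrics` using the `evaluate` library; the in-scope `tokenizer` variable and the 0-100 scaling are assumptions, not code from the training run.

```python
import numpy as np
import evaluate

rouge = evaluate.load("rouge")

def compute_metrics(eval_preds):
    # `tokenizer` is the mT5 tokenizer used for training (assumed to be in scope).
    preds, labels = eval_preds
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)  # undo label padding
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    result = rouge.compute(predictions=decoded_preds, references=decoded_labels)
    result = {k: round(v * 100, 4) for k, v in result.items()}  # assumed 0-100 scaling
    # Average generated length, as in the "Gen Len" column.
    gen_lens = [np.count_nonzero(p != tokenizer.pad_token_id) for p in preds]
    result["gen_len"] = float(np.mean(gen_lens))
    return result
```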

### Framework versions

- Transformers 4.37.2
- Pytorch 2.1.0.dev20230811+cu121
- Datasets 2.17.0
- Tokenizers 0.15.2