
bart-large-cnn-samsum-rescom-finetuned-resume-summarizer-10-epoch-tweak-lr-8-100-1

This model is a fine-tuned version of Ameer05/model-token-repo on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 2.6315
  • Rouge1: 61.441
  • Rouge2: 52.9403
  • Rougel: 58.3426
  • Rougelsum: 60.8249
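
For inference, the checkpoint can be loaded with the standard transformers summarization pipeline. Below is a minimal sketch, assuming the model is published on the Hugging Face Hub under the repository name above (the exact repository id and the example input are assumptions):

```python
from transformers import pipeline

# Assumed Hub repository id; replace with the actual path of this checkpoint.
model_id = "Ameer05/bart-large-cnn-samsum-rescom-finetuned-resume-summarizer-10-epoch-tweak-lr-8-100-1"

summarizer = pipeline("summarization", model=model_id)

# Placeholder input text; in practice this would be a full resume or report.
resume_text = (
    "Experienced software engineer with five years of Python development, "
    "leading a team of four and shipping data pipelines on AWS."
)

summary = summarizer(resume_text, max_length=60, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```

Note that BART-based models accept at most 1024 input tokens, so very long documents generally need to be truncated or chunked before summarization.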

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
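
As a minimal sketch, these settings map onto transformers Seq2SeqTrainingArguments roughly as follows; the original training script is not included in this card, so the output directory and any options not listed above are assumptions:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./results",          # assumed; not documented in the card
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,   # effective train batch size: 8 * 4 = 32
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                       # native AMP mixed-precision training
)
```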

Training results

Training Loss Epoch Step Validation Loss Rouge1 Rouge2 Rougel Rougelsum
No log 0.91 5 2.0139 53.4301 46.6698 50.644 53.3985
No log 1.91 10 1.6309 61.4629 53.8884 59.0867 60.8823
No log 2.91 15 1.5379 61.2938 53.7208 59.0644 60.7381
No log 3.91 20 1.4470 63.2667 55.9273 60.5112 62.7538
1.5454 4.91 25 1.4353 62.7166 54.8328 60.0101 62.1378
1.5454 5.91 30 1.4411 59.7469 51.9068 57.036 58.9474
1.5454 6.91 35 1.5195 64.152 57.1447 61.362 63.5951
1.5454 7.91 40 1.6174 60.1464 51.5654 57.1676 59.4405
0.5429 8.91 45 1.7451 61.9696 53.6421 58.5884 61.3286
0.5429 9.91 50 1.9081 60.3296 52.3052 57.6518 59.7854
0.5429 10.91 55 1.9721 61.5597 51.9027 57.1184 60.6717
0.5429 11.91 60 2.0471 61.2222 53.9475 58.725 60.6668
0.5429 12.91 65 2.1422 60.1915 52.0627 56.9955 59.438
0.1506 13.91 70 2.1542 61.6915 53.045 58.1727 60.8765
0.1506 14.91 75 2.1885 59.8069 51.6543 56.8112 59.2055
0.1506 15.91 80 2.3146 61.695 53.2666 57.9003 61.1108
0.1506 16.91 85 2.3147 60.4482 52.1694 57.0649 59.7882
0.0452 17.91 90 2.1731 60.0259 51.5046 56.7399 59.2955
0.0452 18.91 95 2.2690 60.0534 52.4819 57.1631 59.5056
0.0452 19.91 100 2.2990 58.0737 48.8098 54.5684 57.3187
0.0452 20.91 105 2.2704 61.8982 53.9077 58.6909 61.4252
0.0267 21.91 110 2.3012 62.0174 53.5427 58.5278 61.1921
0.0267 22.91 115 2.3569 61.6327 53.7387 58.8908 61.1623
0.0267 23.91 120 2.3579 60.228 52.3747 58.1448 59.7322
0.0267 24.91 125 2.3389 60.4902 51.7935 57.0689 59.7132
0.0267 25.91 130 2.3168 58.8469 50.3181 55.7386 58.3598
0.0211 26.91 135 2.4147 59.4225 50.8405 56.503 58.7221
0.0211 27.91 140 2.3631 59.7489 51.2137 57.3204 59.3348
0.0211 28.91 145 2.3850 60.1718 51.4176 57.2152 59.5157
0.0211 29.91 150 2.4610 60.1433 51.433 56.6256 59.3265
0.0175 30.91 155 2.4400 58.8345 49.7031 55.3079 57.9236
0.0175 31.91 160 2.4506 59.209 50.1626 55.6451 58.5791
0.0175 32.91 165 2.4316 59.7713 50.8999 56.4235 58.9845
0.0175 33.91 170 2.2781 60.1822 51.9435 57.4586 59.6766
0.0175 34.91 175 2.3849 58.2328 49.2106 55.1516 57.5072
0.0141 35.91 180 2.4872 58.4916 50.3345 55.5991 58.1131
0.0141 36.91 185 2.4883 59.0957 49.76 55.3567 58.076
0.0141 37.91 190 2.4327 58.091 48.8628 54.8678 57.5406
0.0141 38.91 195 2.4998 57.7428 48.7366 54.2166 56.7643
0.0089 39.91 200 2.4107 60.1662 51.9832 57.1372 59.6989
0.0089 40.91 205 2.4700 58.2159 49.3934 54.9265 57.4126
0.0089 41.91 210 2.4833 58.7434 49.6619 55.5239 57.9562
0.0089 42.91 215 2.4703 60.2984 51.3168 56.9082 59.3958
0.0062 43.91 220 2.5306 60.5455 52.1189 57.3213 60.0232
0.0062 44.91 225 2.5181 60.2149 51.2187 56.1935 59.3471
0.0062 45.91 230 2.4871 59.8013 51.6114 56.0911 59.0902
0.0062 46.91 235 2.4811 58.0271 48.9441 54.3108 57.3647
0.0062 47.91 240 2.5290 62.5087 54.6149 59.638 62.0455
0.0072 48.91 245 2.5194 58.7193 49.9679 55.6517 58.1569
0.0072 49.91 250 2.5708 58.4626 49.5257 54.5032 58.1413
0.0072 50.91 255 2.6449 58.446 49.4625 55.1092 58.03
0.0072 51.91 260 2.5592 58.859 49.4398 55.1503 57.9663
0.0056 52.91 265 2.5086 59.7322 51.3051 56.5401 59.2726
0.0056 53.91 270 2.4846 57.8603 48.2408 54.3847 57.115
0.0056 54.91 275 2.5509 58.9506 50.045 55.6658 58.3618
0.0056 55.91 280 2.5032 60.2524 51.8167 56.98 59.7506
0.0056 56.91 285 2.5012 60.0596 51.4924 56.7181 59.5037
0.0054 57.91 290 2.5176 61.0622 52.6235 57.9317 60.5036
0.0054 58.91 295 2.5024 62.9246 54.8544 59.9824 62.5584
0.0054 59.91 300 2.5687 62.2602 53.9673 58.9862 61.5837
0.0054 60.91 305 2.5890 62.5706 54.227 59.2032 62.125
0.0036 61.91 310 2.5454 62.1565 53.2585 58.7169 61.3943
0.0036 62.91 315 2.5629 62.8292 54.6781 59.9889 62.254
0.0036 63.91 320 2.5581 58.8394 50.4421 56.0742 58.1945
0.0036 64.91 325 2.5532 59.5814 51.1335 56.5841 59.196
0.0031 65.91 330 2.5826 59.0485 50.3992 55.5283 58.3757
0.0031 66.91 335 2.5815 61.4832 52.7977 57.7351 60.9888
0.0031 67.91 340 2.5865 61.7836 53.6797 58.6743 61.3765
0.0031 68.91 345 2.6007 61.2253 52.8781 57.7006 60.717
0.0031 69.91 350 2.6210 60.717 52.4933 57.5089 60.4196
0.0035 70.91 355 2.6169 61.3491 53.3932 58.2288 60.8793
0.0035 71.91 360 2.6025 62.0101 54.0289 59.0822 61.7202
0.0035 72.91 365 2.5705 61.2227 52.9937 58.2493 60.6631
0.0035 73.91 370 2.5623 59.1718 50.7827 56.1851 58.7118
0.002 74.91 375 2.5536 58.4201 49.6923 55.0398 57.7707
0.002 75.91 380 2.5478 60.2307 51.7503 57.3173 59.692
0.002 76.91 385 2.6039 58.7637 49.741 55.5341 58.0784
0.002 77.91 390 2.6371 59.3929 50.6444 55.9887 58.813
0.002 78.91 395 2.6238 59.0572 50.605 55.6631 58.4366
0.0019 79.91 400 2.5783 57.9852 49.2588 54.822 57.4643
0.0019 80.91 405 2.5982 58.0218 49.1651 54.9876 57.4066
0.0019 81.91 410 2.6141 60.3133 51.5723 56.9476 59.715
0.0019 82.91 415 2.5904 60.8199 51.8956 58.406 60.323
0.0017 83.91 420 2.5718 60.3449 51.1433 57.6984 59.7513
0.0017 84.91 425 2.5737 60.151 51.1986 57.3376 59.378
0.0017 85.91 430 2.5807 60.9273 52.2469 58.2038 60.1642
0.0017 86.91 435 2.5900 60.1846 51.6144 57.5407 59.5109
0.0011 87.91 440 2.6066 62.0776 53.6022 59.157 61.6201
0.0011 88.91 445 2.6231 61.8822 53.5232 58.965 61.401
0.0011 89.91 450 2.6273 60.3358 51.9941 57.3823 59.7729
0.0011 90.91 455 2.6194 60.0196 51.6134 57.1357 59.4594
0.0011 91.91 460 2.6118 60.6898 52.1328 57.3076 60.0351
0.0015 92.91 465 2.6032 61.2119 52.5034 57.8098 60.6634
0.0015 93.91 470 2.6040 61.4812 52.8197 57.9668 60.8767
0.0015 94.91 475 2.6158 61.4046 52.8905 57.8958 60.804
0.0015 95.91 480 2.6280 62.1764 53.8521 58.8608 61.6138
0.0012 96.91 485 2.6304 62.2028 53.8967 58.8976 61.6409
0.0012 97.91 490 2.6328 61.7371 53.3908 58.4107 61.1382
0.0012 98.91 495 2.6331 61.441 52.9403 58.3426 60.8249
0.0012 99.91 500 2.6315 61.441 52.9403 58.3426 60.8249
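
The Rouge columns above are on a 0-100 scale. As a minimal sketch of how scores in this form can be computed with the datasets ROUGE metric (Datasets 1.x API, which requires the rouge_score package; the prediction and reference strings are placeholders):

```python
from datasets import load_metric

rouge = load_metric("rouge")

predictions = ["the candidate has five years of python experience"]    # placeholder
references = ["the candidate brings five years of python experience"]  # placeholder

result = rouge.compute(predictions=predictions, references=references)
# Report mid F-measures on a 0-100 scale, like the columns above.
print({k: round(v.mid.fmeasure * 100, 4) for k, v in result.items()})
```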

Framework versions

  • Transformers 4.15.0
  • Pytorch 1.9.1
  • Datasets 1.18.4
  • Tokenizers 0.10.3