---
license: apache-2.0
base_model: sshleifer/distilbart-cnn-12-6
tags:
  - generated_from_trainer
metrics:
  - rouge
model-index:
  - name: distilbart-cnn-12-6-finetuned-stocknews_200
    results: []
pipeline_tag: summarization
---

# distilbart-cnn-12-6-finetuned-stocknews_200

This model is a fine-tuned version of [sshleifer/distilbart-cnn-12-6](https://huggingface.co/sshleifer/distilbart-cnn-12-6) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 1.0370
- Rouge1: 79.8682
- Rouge2: 71.4205
- Rougel: 75.6301
- Rougelsum: 77.0085
- Gen Len: 74.1543

## Model description

More information needed

## Intended uses & limitations

More information needed
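
Given the base checkpoint and the `summarization` pipeline tag, the model is presumably intended for abstractive summarization of short stock-news articles. Below is a minimal inference sketch, not an official example from the author; the repository id is inferred from the model name and should be treated as an assumption.

```python
# Minimal inference sketch. The repo id below is assumed from the model name
# and may need to be adjusted to the checkpoint's actual Hub location.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="imsumit18/distilbart-cnn-12-6-finetuned-stocknews_200",  # assumed repo id
)

article = (
    "Shares of ExampleCorp rose 4% on Tuesday after the company reported "
    "quarterly revenue above analyst expectations and raised its full-year guidance."
)

print(summarizer(article, max_length=80, min_length=10, do_sample=False)[0]["summary_text"])
```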

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 80
- mixed_precision_training: Native AMP
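
For illustration, these hyperparameters map roughly onto the following `Seq2SeqTrainingArguments` configuration. This is a hedged sketch that assumes a standard `Seq2SeqTrainer` setup; it is not the author's actual training script.

```python
# Hedged sketch: an approximate Seq2SeqTrainingArguments setup matching the
# hyperparameters listed above (Transformers 4.38-era argument names).
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="distilbart-cnn-12-6-finetuned-stocknews_200",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=80,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="epoch",  # assumed: the results table reports metrics once per epoch
    predict_with_generate=True,   # assumed: needed to compute ROUGE on generated summaries
)
```

The Adam betas and epsilon listed above are the Transformers defaults, so they are not set explicitly in this sketch.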

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log        | 1.0   | 169   | 0.5736          | 64.7045 | 47.6749 | 56.2681 | 59.2198   | 74.6113 |
| No log        | 2.0   | 338   | 0.4806          | 72.0942 | 58.8471 | 65.4706 | 67.8252   | 71.5163 |
| 0.4734        | 3.0   | 507   | 0.4991          | 73.967  | 62.7751 | 68.5945 | 70.6273   | 74.724  |
| 0.4734        | 4.0   | 676   | 0.4965          | 76.8393 | 66.9993 | 72.19   | 73.864    | 72.7003 |
| 0.4734        | 5.0   | 845   | 0.5139          | 78.0584 | 68.124  | 73.447  | 75.0284   | 73.9466 |
| 0.1158        | 6.0   | 1014  | 0.5328          | 78.409  | 68.5496 | 73.4175 | 75.0927   | 72.6914 |
| 0.1158        | 7.0   | 1183  | 0.5370          | 77.5134 | 67.8142 | 72.7732 | 74.5942   | 71.5727 |
| 0.1158        | 8.0   | 1352  | 0.5872          | 78.01   | 68.8818 | 73.7514 | 75.3546   | 73.4036 |
| 0.0631        | 9.0   | 1521  | 0.5787          | 78.8662 | 69.9291 | 74.7183 | 76.1309   | 73.365  |
| 0.0631        | 10.0  | 1690  | 0.5887          | 78.5145 | 69.2414 | 73.9729 | 75.4945   | 73.3947 |
| 0.0631        | 11.0  | 1859  | 0.5866          | 77.9579 | 68.5705 | 73.2277 | 75.2179   | 72.4807 |
| 0.0456        | 12.0  | 2028  | 0.6155          | 79.4247 | 70.3457 | 75.0464 | 76.723    | 71.6261 |
| 0.0456        | 13.0  | 2197  | 0.6270          | 78.2792 | 69.1958 | 74.171  | 75.7049   | 72.9347 |
| 0.0456        | 14.0  | 2366  | 0.6342          | 78.6039 | 69.2197 | 74.2082 | 75.7638   | 74.543  |
| 0.0364        | 15.0  | 2535  | 0.6282          | 78.7977 | 69.8903 | 74.5441 | 76.4053   | 72.8961 |
| 0.0364        | 16.0  | 2704  | 0.6456          | 78.4486 | 69.2633 | 74.0665 | 75.4348   | 72.2819 |
| 0.0364        | 17.0  | 2873  | 0.6583          | 79.1083 | 70.2974 | 75.0199 | 76.544    | 72.6469 |
| 0.0282        | 18.0  | 3042  | 0.6477          | 78.7872 | 69.9616 | 74.6811 | 76.0256   | 72.8279 |
| 0.0282        | 19.0  | 3211  | 0.6716          | 78.7369 | 69.889  | 74.4537 | 75.9916   | 73.4214 |
| 0.0282        | 20.0  | 3380  | 0.6729          | 79.3218 | 70.2074 | 75.162  | 76.5582   | 73.7003 |
| 0.0222        | 21.0  | 3549  | 0.7011          | 77.7228 | 68.6481 | 73.4411 | 74.9113   | 74.4748 |
| 0.0222        | 22.0  | 3718  | 0.6763          | 79.47   | 70.7597 | 75.2025 | 76.8042   | 72.73   |
| 0.0222        | 23.0  | 3887  | 0.7025          | 79.8675 | 70.9624 | 75.4989 | 77.0572   | 72.8427 |
| 0.0196        | 24.0  | 4056  | 0.6746          | 79.1486 | 70.4134 | 74.9573 | 76.4961   | 73.0208 |
| 0.0196        | 25.0  | 4225  | 0.6750          | 79.774  | 71.187  | 75.6008 | 77.2557   | 72.1098 |
| 0.0196        | 26.0  | 4394  | 0.6921          | 79.5747 | 70.894  | 75.2295 | 76.7905   | 72.9318 |
| 0.0176        | 27.0  | 4563  | 0.7611          | 79.0068 | 70.1336 | 74.3258 | 75.9459   | 74.3501 |
| 0.0176        | 28.0  | 4732  | 0.7093          | 79.5467 | 70.8754 | 75.4346 | 77.2047   | 72.3116 |
| 0.0176        | 29.0  | 4901  | 0.7168          | 79.5496 | 70.5612 | 75.0587 | 76.6486   | 74.0415 |
| 0.0154        | 30.0  | 5070  | 0.7032          | 79.7382 | 71.0288 | 75.9411 | 77.103    | 72.5282 |
| 0.0154        | 31.0  | 5239  | 0.7206          | 79.3973 | 70.7136 | 75.1744 | 76.5041   | 72.5757 |
| 0.0154        | 32.0  | 5408  | 0.7478          | 79.6311 | 70.74   | 75.1728 | 76.8626   | 73.1395 |
| 0.013         | 33.0  | 5577  | 0.7279          | 79.9423 | 71.2295 | 75.7646 | 77.2329   | 70.8872 |
| 0.013         | 34.0  | 5746  | 0.7685          | 78.8995 | 70.121  | 74.4843 | 76.028    | 72.9763 |
| 0.013         | 35.0  | 5915  | 0.7498          | 79.6454 | 70.8632 | 75.4972 | 76.8668   | 72.0297 |
| 0.0126        | 36.0  | 6084  | 0.8016          | 78.8582 | 70.0804 | 74.5498 | 76.0402   | 74.8338 |
| 0.0126        | 37.0  | 6253  | 0.7923          | 78.8845 | 70.1465 | 74.837  | 76.2453   | 74.0742 |
| 0.0126        | 38.0  | 6422  | 0.7813          | 78.7254 | 70.0885 | 74.6831 | 76.1384   | 73.5994 |
| 0.0103        | 39.0  | 6591  | 0.7974          | 79.5855 | 70.7472 | 75.5436 | 76.9493   | 72.6795 |
| 0.0103        | 40.0  | 6760  | 0.7967          | 79.656  | 70.7795 | 75.2844 | 76.6875   | 72.3294 |
| 0.0103        | 41.0  | 6929  | 0.8029          | 79.8831 | 71.1647 | 75.697  | 77.0773   | 71.8872 |
| 0.0086        | 42.0  | 7098  | 0.8245          | 78.999  | 70.1721 | 74.8494 | 76.2723   | 72.7478 |
| 0.0086        | 43.0  | 7267  | 0.8459          | 79.052  | 70.2714 | 75.0921 | 76.4209   | 74.3828 |
| 0.0086        | 44.0  | 7436  | 0.8077          | 79.6009 | 70.4859 | 75.0207 | 76.7271   | 72.5163 |
| 0.0078        | 45.0  | 7605  | 0.8431          | 79.093  | 70.433  | 75.0361 | 76.589    | 73.3145 |
| 0.0078        | 46.0  | 7774  | 0.8794          | 79.1461 | 70.3654 | 74.845  | 76.3544   | 75.0415 |
| 0.0078        | 47.0  | 7943  | 0.8668          | 79.1443 | 70.2647 | 74.7967 | 76.3801   | 71.724  |
| 0.0076        | 48.0  | 8112  | 0.8347          | 78.6997 | 70.1008 | 74.6051 | 76.0351   | 73.9763 |
| 0.0076        | 49.0  | 8281  | 0.8544          | 78.9749 | 69.9824 | 74.6559 | 76.0268   | 74.6528 |
| 0.0076        | 50.0  | 8450  | 0.9060          | 79.5051 | 70.5755 | 75.3817 | 77.0026   | 71.1217 |
| 0.0065        | 51.0  | 8619  | 0.9501          | 79.2498 | 70.5003 | 75.1244 | 76.5023   | 75.0    |
| 0.0065        | 52.0  | 8788  | 0.8724          | 79.5012 | 70.4217 | 75.109  | 76.6551   | 73.73   |
| 0.0065        | 53.0  | 8957  | 0.8860          | 79.5313 | 71.0337 | 75.3122 | 76.928    | 72.7685 |
| 0.0053        | 54.0  | 9126  | 0.8859          | 79.674  | 71.0878 | 75.4582 | 76.925    | 73.3294 |
| 0.0053        | 55.0  | 9295  | 0.8965          | 78.5857 | 69.8599 | 74.2323 | 75.6027   | 75.7359 |
| 0.0053        | 56.0  | 9464  | 0.9871          | 79.8361 | 71.2171 | 75.8197 | 77.1182   | 74.0861 |
| 0.0052        | 57.0  | 9633  | 0.8972          | 79.8939 | 71.3469 | 75.9245 | 77.1549   | 72.8398 |
| 0.0052        | 58.0  | 9802  | 0.9693          | 79.5523 | 70.8739 | 75.2116 | 76.7137   | 74.3412 |
| 0.0052        | 59.0  | 9971  | 0.9605          | 79.483  | 70.6684 | 75.0183 | 76.3226   | 75.2522 |
| 0.0047        | 60.0  | 10140 | 0.9705          | 79.4894 | 70.6424 | 75.0833 | 76.504    | 74.8694 |
| 0.0047        | 61.0  | 10309 | 0.9730          | 79.4781 | 70.9014 | 75.4589 | 76.6387   | 75.0504 |
| 0.0047        | 62.0  | 10478 | 0.9284          | 79.485  | 70.6651 | 75.1062 | 76.4092   | 74.0148 |
| 0.0045        | 63.0  | 10647 | 0.9537          | 79.2664 | 70.4345 | 74.9998 | 76.4565   | 73.9199 |
| 0.0045        | 64.0  | 10816 | 0.9554          | 79.6061 | 70.8702 | 75.3191 | 76.6242   | 74.3145 |
| 0.0045        | 65.0  | 10985 | 1.0090          | 79.6107 | 70.9297 | 75.4102 | 76.9842   | 73.9466 |
| 0.0041        | 66.0  | 11154 | 0.9736          | 79.6246 | 70.8827 | 75.2682 | 76.7209   | 74.8131 |
| 0.0041        | 67.0  | 11323 | 0.9498          | 79.9549 | 71.3231 | 75.7987 | 77.2809   | 73.5371 |
| 0.0041        | 68.0  | 11492 | 0.9965          | 80.1403 | 71.4991 | 76.017  | 77.3741   | 74.2404 |
| 0.004         | 69.0  | 11661 | 1.0012          | 79.8784 | 71.444  | 75.827  | 77.1888   | 74.0059 |
| 0.004         | 70.0  | 11830 | 0.9888          | 80.1075 | 71.7102 | 75.9687 | 77.3636   | 72.9911 |
| 0.004         | 71.0  | 11999 | 0.9758          | 79.7998 | 71.3682 | 75.6694 | 77.0498   | 73.8991 |
| 0.0043        | 72.0  | 12168 | 0.9760          | 79.9748 | 71.4703 | 75.8148 | 77.1338   | 72.8843 |
| 0.0043        | 73.0  | 12337 | 0.9930          | 80.1032 | 71.6551 | 75.8235 | 77.1674   | 73.6499 |
| 0.0037        | 74.0  | 12506 | 1.0006          | 80.0302 | 71.5324 | 75.7755 | 77.2182   | 73.3027 |
| 0.0037        | 75.0  | 12675 | 0.9958          | 79.9088 | 71.313  | 75.7842 | 77.1939   | 73.362  |
| 0.0037        | 76.0  | 12844 | 0.9993          | 80.3059 | 71.7887 | 76.0696 | 77.5045   | 73.3086 |
| 0.0039        | 77.0  | 13013 | 1.0224          | 79.5564 | 71.1191 | 75.4324 | 76.7285   | 74.2344 |
| 0.0039        | 78.0  | 13182 | 1.0510          | 80.0006 | 71.4199 | 75.6626 | 77.006    | 74.0119 |
| 0.0039        | 79.0  | 13351 | 1.0410          | 79.7101 | 71.2137 | 75.5206 | 76.8997   | 74.4303 |
| 0.0036        | 80.0  | 13520 | 1.0370          | 79.8682 | 71.4205 | 75.6301 | 77.0085   | 74.1543 |
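
In the table above, the Rouge1/Rouge2/Rougel/Rougelsum columns are presumably ROUGE F-measures scaled to 0-100 (the convention used by the Transformers summarization examples), and Gen Len is the average length in tokens of the generated summaries. A hedged sketch of computing scores on that scale with the `evaluate` library, using placeholder texts, is shown below.

```python
# Hedged sketch: computing ROUGE on the same 0-100 scale as the table above.
# The prediction/reference strings are placeholders, not data from this model card.
import evaluate

rouge = evaluate.load("rouge")

predictions = ["ExampleCorp shares rose after strong quarterly results."]
references = ["ExampleCorp stock gained 4% after the company beat revenue expectations."]

scores = rouge.compute(predictions=predictions, references=references, use_stemmer=True)
# evaluate returns fractions in [0, 1]; multiply by 100 to match the table's scale.
print({k: round(v * 100, 4) for k, v in scores.items()})
```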

### Framework versions

- Transformers 4.38.1
- Pytorch 2.1.2
- Datasets 2.1.0
- Tokenizers 0.15.2