
text_shortening_model_v24

This model is a fine-tuned version of t5-small on an unspecified dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the metrics):

  • Loss: 2.7906
  • Rouge1: 0.4431
  • Rouge2: 0.2299
  • RougeL: 0.4035
  • RougeLsum: 0.4054
  • BERT precision: 0.8678
  • BERT recall: 0.8614
  • Average word count: 9.0699
  • Max word count: 15
  • Min word count: 4
  • Average token count: 13.7991
  • % shortened texts with length > 12: 4.8035
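
As a minimal usage sketch (the checkpoint follows the standard T5 seq2seq interface; whether a task prefix such as "summarize: " was used during fine-tuning is not documented, and the example text and generation settings below are illustrative):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Checkpoint id taken from the model tree at the bottom of this card.
model_id = "ldos/text_shortening_model_v24"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "The meeting has been rescheduled to next Thursday afternoon at the downtown office."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

# max_new_tokens is a guess based on the ~14-token average output length above.
output_ids = model.generate(**inputs, max_new_tokens=32, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```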

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 0.005
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
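
A hedged sketch of how these hyperparameters could map onto `Seq2SeqTrainingArguments` (the dataset, preprocessing, and trainer wiring are not documented here; `output_dir` and the evaluation/generation flags are assumptions inferred from the per-epoch results table below):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="text_shortening_model_v24",  # illustrative
    learning_rate=5e-3,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the defaults
    # (adam_beta1/adam_beta2/adam_epsilon), so nothing needs to be overridden.
    evaluation_strategy="epoch",   # assumption: the table below logs one row per epoch
    predict_with_generate=True,    # assumption: required to compute ROUGE during eval
)
```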

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | BERT precision | BERT recall | Average word count | Max word count | Min word count | Average token count | % shortened texts with length > 12 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.1052 | 1.0 | 100 | 1.7596 | 0.4856 | 0.282 | 0.458 | 0.458 | 0.8852 | 0.8717 | 9.1485 | 19 | 5 | 13.7642 | 12.2271 |
| 0.8757 | 2.0 | 200 | 1.6688 | 0.4887 | 0.2853 | 0.4624 | 0.4634 | 0.8881 | 0.867 | 7.9956 | 16 | 4 | 12.607 | 5.2402 |
| 0.7611 | 3.0 | 300 | 1.6431 | 0.4559 | 0.2503 | 0.422 | 0.4226 | 0.8732 | 0.8655 | 9.4017 | 17 | 3 | 13.7598 | 15.2838 |
| 0.6987 | 4.0 | 400 | 1.7128 | 0.4505 | 0.2426 | 0.4204 | 0.4216 | 0.8773 | 0.8591 | 8.1135 | 15 | 3 | 12.9869 | 2.1834 |
| 0.675 | 5.0 | 500 | 1.7736 | 0.458 | 0.2581 | 0.4317 | 0.4328 | 0.8717 | 0.8638 | 9.2183 | 19 | 4 | 14.0699 | 11.7904 |
| 0.6318 | 6.0 | 600 | 1.8170 | 0.4631 | 0.2566 | 0.4304 | 0.4323 | 0.8728 | 0.8652 | 9.441 | 17 | 3 | 14.2926 | 12.2271 |
| 0.6309 | 7.0 | 700 | 1.9121 | 0.4423 | 0.2243 | 0.4084 | 0.4102 | 0.8712 | 0.8581 | 8.393 | 16 | 4 | 13.1921 | 3.9301 |
| 0.556 | 8.0 | 800 | 2.0421 | 0.4406 | 0.2387 | 0.4099 | 0.4111 | 0.8614 | 0.8605 | 10.2445 | 18 | 5 | 15.2707 | 18.3406 |
| 0.5823 | 9.0 | 900 | 2.0004 | 0.4101 | 0.2159 | 0.3759 | 0.3768 | 0.8561 | 0.854 | 9.6638 | 16 | 5 | 14.6638 | 18.3406 |
| 0.5624 | 10.0 | 1000 | 2.0062 | 0.4706 | 0.2497 | 0.4325 | 0.4333 | 0.873 | 0.8676 | 9.4367 | 16 | 4 | 14.3275 | 12.6638 |
| 0.554 | 11.0 | 1100 | 2.0794 | 0.4706 | 0.2634 | 0.4386 | 0.4389 | 0.8777 | 0.8687 | 9.3799 | 17 | 5 | 14.1616 | 11.3537 |
| 0.5548 | 12.0 | 1200 | 2.1752 | 0.4463 | 0.2377 | 0.416 | 0.4164 | 0.8722 | 0.8602 | 9.0262 | 17 | 3 | 13.738 | 9.607 |
| 0.5444 | 13.0 | 1300 | 2.2306 | 0.4307 | 0.229 | 0.3985 | 0.4001 | 0.8698 | 0.8553 | 8.5677 | 16 | 3 | 13.2271 | 5.2402 |
| 0.5351 | 14.0 | 1400 | 2.1538 | 0.4326 | 0.2189 | 0.3974 | 0.398 | 0.8702 | 0.8571 | 8.7642 | 19 | 4 | 13.4148 | 6.1135 |
| 0.5389 | 15.0 | 1500 | 2.2735 | 0.4334 | 0.237 | 0.4046 | 0.4057 | 0.8659 | 0.8567 | 9.1616 | 19 | 4 | 14.0175 | 8.7336 |
| 0.5601 | 16.0 | 1600 | 2.3076 | 0.4291 | 0.2171 | 0.3936 | 0.395 | 0.8657 | 0.8578 | 9.0699 | 16 | 1 | 14.1659 | 8.2969 |
| 0.5361 | 17.0 | 1700 | 2.3043 | 0.4435 | 0.2307 | 0.4074 | 0.4078 | 0.871 | 0.8626 | 9.1266 | 16 | 4 | 13.8341 | 9.1703 |
| 0.5117 | 18.0 | 1800 | 2.3479 | 0.4221 | 0.212 | 0.3879 | 0.3881 | 0.8683 | 0.8523 | 8.5633 | 18 | 4 | 12.7686 | 6.1135 |
| 0.5009 | 19.0 | 1900 | 2.3773 | 0.4457 | 0.2405 | 0.4143 | 0.4147 | 0.8712 | 0.8621 | 8.8428 | 17 | 5 | 13.6288 | 6.9869 |
| 0.542 | 20.0 | 2000 | 2.3419 | 0.4312 | 0.2129 | 0.3902 | 0.391 | 0.8647 | 0.8562 | 9.1878 | 16 | 4 | 14.2445 | 10.0437 |
| 0.4985 | 21.0 | 2100 | 2.3961 | 0.43 | 0.2276 | 0.3991 | 0.4011 | 0.8679 | 0.8554 | 8.5197 | 15 | 3 | 13.3974 | 4.3668 |
| 0.5095 | 22.0 | 2200 | 2.4267 | 0.4373 | 0.2298 | 0.4072 | 0.4077 | 0.8667 | 0.8588 | 9.1616 | 18 | 4 | 13.8253 | 12.6638 |
| 0.51 | 23.0 | 2300 | 2.4633 | 0.4505 | 0.2298 | 0.413 | 0.4145 | 0.8712 | 0.8602 | 9.0 | 17 | 2 | 13.7074 | 10.4803 |
| 0.4877 | 24.0 | 2400 | 2.5496 | 0.4266 | 0.2273 | 0.3931 | 0.394 | 0.8627 | 0.854 | 8.7118 | 14 | 2 | 13.6157 | 4.3668 |
| 0.5164 | 25.0 | 2500 | 2.5375 | 0.438 | 0.2246 | 0.4033 | 0.4055 | 0.8677 | 0.8584 | 8.9476 | 15 | 3 | 13.7467 | 10.0437 |
| 0.5019 | 26.0 | 2600 | 2.6164 | 0.4145 | 0.2084 | 0.3833 | 0.3837 | 0.8595 | 0.8519 | 9.3057 | 15 | 4 | 14.1135 | 10.917 |
| 0.4905 | 27.0 | 2700 | 2.5586 | 0.4372 | 0.2201 | 0.4043 | 0.4053 | 0.8671 | 0.857 | 8.6463 | 15 | 4 | 13.3231 | 5.2402 |
| 0.5008 | 28.0 | 2800 | 2.5457 | 0.4022 | 0.2011 | 0.3676 | 0.3684 | 0.8576 | 0.8513 | 9.1703 | 16 | 4 | 14.0742 | 6.5502 |
| 0.5014 | 29.0 | 2900 | 2.5506 | 0.413 | 0.2108 | 0.3771 | 0.3776 | 0.8635 | 0.8545 | 9.1004 | 16 | 3 | 13.6463 | 8.2969 |
| 0.5128 | 30.0 | 3000 | 2.5791 | 0.4121 | 0.2082 | 0.3794 | 0.3794 | 0.8692 | 0.8552 | 8.4585 | 14 | 2 | 12.9083 | 4.3668 |
| 0.5237 | 31.0 | 3100 | 2.6008 | 0.4219 | 0.2114 | 0.3924 | 0.3918 | 0.8634 | 0.8555 | 9.1921 | 16 | 5 | 13.8646 | 7.4236 |
| 0.4643 | 32.0 | 3200 | 2.6541 | 0.4304 | 0.2343 | 0.4015 | 0.4016 | 0.8664 | 0.8575 | 8.9127 | 16 | 3 | 13.7249 | 5.2402 |
| 0.4891 | 33.0 | 3300 | 2.6072 | 0.4205 | 0.2072 | 0.3854 | 0.387 | 0.8613 | 0.8548 | 9.4236 | 19 | 3 | 14.1528 | 12.2271 |
| 0.4981 | 34.0 | 3400 | 2.6505 | 0.4255 | 0.2084 | 0.3926 | 0.3931 | 0.865 | 0.8544 | 8.952 | 17 | 4 | 13.4672 | 6.9869 |
| 0.4895 | 35.0 | 3500 | 2.5491 | 0.4192 | 0.2046 | 0.3854 | 0.3862 | 0.8653 | 0.8527 | 8.3843 | 15 | 2 | 13.0218 | 3.4934 |
| 0.5119 | 36.0 | 3600 | 2.5115 | 0.4088 | 0.1994 | 0.3837 | 0.3838 | 0.8629 | 0.8536 | 9.5415 | 15 | 5 | 14.1179 | 15.2838 |
| 0.5064 | 37.0 | 3700 | 2.4837 | 0.422 | 0.2161 | 0.3923 | 0.393 | 0.8655 | 0.8562 | 9.0961 | 15 | 4 | 13.6681 | 6.9869 |
| 0.5064 | 38.0 | 3800 | 2.5476 | 0.4126 | 0.2085 | 0.3773 | 0.3778 | 0.8605 | 0.8555 | 9.0131 | 15 | 4 | 13.8253 | 6.5502 |
| 0.4768 | 39.0 | 3900 | 2.6396 | 0.4528 | 0.2268 | 0.4063 | 0.4072 | 0.8706 | 0.8603 | 8.7511 | 15 | 4 | 13.2926 | 3.4934 |
| 0.487 | 40.0 | 4000 | 2.4817 | 0.4272 | 0.227 | 0.3991 | 0.4003 | 0.868 | 0.8573 | 8.8384 | 18 | 4 | 13.393 | 7.4236 |
| 0.4969 | 41.0 | 4100 | 2.5901 | 0.4211 | 0.2133 | 0.387 | 0.3872 | 0.8635 | 0.8543 | 9.179 | 18 | 4 | 13.6419 | 10.917 |
| 0.4781 | 42.0 | 4200 | 2.6128 | 0.4286 | 0.2229 | 0.3889 | 0.3901 | 0.862 | 0.8578 | 9.6769 | 16 | 5 | 14.7773 | 14.4105 |
| 0.4865 | 43.0 | 4300 | 2.5942 | 0.4097 | 0.2064 | 0.3789 | 0.3793 | 0.8612 | 0.8535 | 8.8777 | 19 | 2 | 13.7336 | 5.6769 |
| 0.4833 | 44.0 | 4400 | 2.6585 | 0.4119 | 0.2116 | 0.3796 | 0.3806 | 0.859 | 0.8515 | 9.3624 | 19 | 4 | 14.1441 | 9.607 |
| 0.4687 | 45.0 | 4500 | 2.7545 | 0.415 | 0.2065 | 0.3862 | 0.3872 | 0.8649 | 0.8534 | 8.8341 | 15 | 4 | 13.2533 | 4.8035 |
| 0.4832 | 46.0 | 4600 | 2.6578 | 0.4273 | 0.2156 | 0.3935 | 0.395 | 0.866 | 0.8573 | 9.2445 | 19 | 5 | 14.345 | 11.3537 |
| 0.471 | 47.0 | 4700 | 2.6619 | 0.4316 | 0.2274 | 0.4015 | 0.4036 | 0.8658 | 0.8578 | 8.9956 | 15 | 4 | 13.8515 | 7.8603 |
| 0.469 | 48.0 | 4800 | 2.7021 | 0.4328 | 0.2244 | 0.3887 | 0.3897 | 0.8641 | 0.854 | 8.9869 | 18 | 5 | 13.8122 | 7.4236 |
| 0.4784 | 49.0 | 4900 | 2.5634 | 0.4217 | 0.2111 | 0.3871 | 0.3882 | 0.8588 | 0.8553 | 9.4454 | 19 | 5 | 14.476 | 12.6638 |
| 0.4947 | 50.0 | 5000 | 2.6781 | 0.4435 | 0.2288 | 0.405 | 0.4066 | 0.8709 | 0.861 | 8.9258 | 17 | 4 | 13.7467 | 6.9869 |
| 0.4819 | 51.0 | 5100 | 2.6497 | 0.4255 | 0.2175 | 0.3921 | 0.3932 | 0.8646 | 0.8572 | 9.2096 | 17 | 5 | 14.1354 | 6.5502 |
| 0.4594 | 52.0 | 5200 | 2.7126 | 0.4246 | 0.2121 | 0.3875 | 0.3891 | 0.8705 | 0.854 | 8.4367 | 16 | 4 | 13.2009 | 5.2402 |
| 0.4804 | 53.0 | 5300 | 2.6285 | 0.4148 | 0.2099 | 0.3833 | 0.3845 | 0.8643 | 0.855 | 8.6681 | 15 | 2 | 13.441 | 5.6769 |
| 0.4923 | 54.0 | 5400 | 2.6453 | 0.4343 | 0.2333 | 0.4017 | 0.4026 | 0.8698 | 0.8591 | 8.952 | 17 | 2 | 13.4061 | 6.5502 |
| 0.4712 | 55.0 | 5500 | 2.7145 | 0.4269 | 0.2163 | 0.3927 | 0.3941 | 0.8657 | 0.8577 | 9.0699 | 19 | 3 | 13.821 | 7.8603 |
| 0.467 | 56.0 | 5600 | 2.7005 | 0.4241 | 0.2118 | 0.3907 | 0.3903 | 0.8627 | 0.8571 | 9.5371 | 16 | 4 | 14.393 | 11.7904 |
| 0.4584 | 57.0 | 5700 | 2.7004 | 0.4291 | 0.2233 | 0.3956 | 0.3959 | 0.865 | 0.8573 | 9.4105 | 18 | 4 | 14.214 | 9.1703 |
| 0.4714 | 58.0 | 5800 | 2.5910 | 0.4306 | 0.2317 | 0.3952 | 0.3957 | 0.8635 | 0.8595 | 9.2969 | 15 | 5 | 14.3188 | 10.4803 |
| 0.4743 | 59.0 | 5900 | 2.6688 | 0.4328 | 0.2209 | 0.395 | 0.396 | 0.8668 | 0.8585 | 8.9258 | 16 | 3 | 13.7467 | 5.6769 |
| 0.4613 | 60.0 | 6000 | 2.7094 | 0.4342 | 0.2294 | 0.4003 | 0.4019 | 0.8673 | 0.8602 | 9.0524 | 18 | 4 | 13.9782 | 7.4236 |
| 0.4597 | 61.0 | 6100 | 2.6848 | 0.4162 | 0.2217 | 0.3858 | 0.3866 | 0.8612 | 0.8544 | 9.2096 | 17 | 3 | 14.0175 | 9.607 |
| 0.4725 | 62.0 | 6200 | 2.7496 | 0.4348 | 0.2176 | 0.395 | 0.3954 | 0.8628 | 0.8636 | 10.0524 | 16 | 5 | 14.9869 | 16.5939 |
| 0.4324 | 63.0 | 6300 | 2.6998 | 0.4256 | 0.2158 | 0.3946 | 0.3956 | 0.8682 | 0.8557 | 8.3144 | 15 | 4 | 13.1135 | 1.7467 |
| 0.4315 | 64.0 | 6400 | 2.7197 | 0.4313 | 0.2263 | 0.3892 | 0.3904 | 0.866 | 0.8571 | 9.1528 | 14 | 4 | 14.0568 | 8.7336 |
| 0.4401 | 65.0 | 6500 | 2.7221 | 0.4193 | 0.2151 | 0.3842 | 0.3851 | 0.8622 | 0.8564 | 9.1528 | 17 | 4 | 14.0262 | 6.5502 |
| 0.4167 | 66.0 | 6600 | 2.7048 | 0.4401 | 0.2327 | 0.408 | 0.4084 | 0.8689 | 0.8603 | 9.1921 | 17 | 4 | 13.8035 | 6.5502 |
| 0.4339 | 67.0 | 6700 | 2.7436 | 0.4373 | 0.2286 | 0.4041 | 0.405 | 0.8668 | 0.8586 | 8.9039 | 19 | 3 | 13.7773 | 6.5502 |
| 0.4435 | 68.0 | 6800 | 2.6951 | 0.4191 | 0.2135 | 0.3827 | 0.3855 | 0.8649 | 0.8538 | 8.5852 | 14 | 3 | 13.7336 | 3.0568 |
| 0.4513 | 69.0 | 6900 | 2.7253 | 0.4188 | 0.2078 | 0.3865 | 0.3865 | 0.8631 | 0.8539 | 8.8734 | 15 | 5 | 13.4803 | 5.2402 |
| 0.4457 | 70.0 | 7000 | 2.6112 | 0.4273 | 0.2166 | 0.3882 | 0.3887 | 0.8652 | 0.8573 | 9.0917 | 16 | 5 | 14.0742 | 6.5502 |
| 0.4456 | 71.0 | 7100 | 2.6492 | 0.4198 | 0.2217 | 0.3916 | 0.3927 | 0.868 | 0.8573 | 8.6288 | 14 | 2 | 13.3712 | 5.6769 |
| 0.4249 | 72.0 | 7200 | 2.6881 | 0.4293 | 0.2178 | 0.386 | 0.3874 | 0.8638 | 0.8581 | 9.2926 | 15 | 4 | 14.2882 | 7.4236 |
| 0.439 | 73.0 | 7300 | 2.7046 | 0.4275 | 0.2171 | 0.3917 | 0.3934 | 0.8701 | 0.8556 | 8.5852 | 15 | 3 | 13.2751 | 5.2402 |
| 0.435 | 74.0 | 7400 | 2.6745 | 0.4323 | 0.2235 | 0.3961 | 0.3966 | 0.8637 | 0.8578 | 9.1878 | 19 | 5 | 14.2402 | 9.607 |
| 0.448 | 75.0 | 7500 | 2.7169 | 0.4262 | 0.2233 | 0.3904 | 0.3923 | 0.8643 | 0.855 | 8.821 | 15 | 4 | 13.5808 | 4.8035 |
| 0.4468 | 76.0 | 7600 | 2.6498 | 0.4368 | 0.225 | 0.3994 | 0.4016 | 0.8647 | 0.8598 | 9.2358 | 15 | 4 | 14.5109 | 8.7336 |
| 0.4544 | 77.0 | 7700 | 2.7268 | 0.4358 | 0.2318 | 0.4038 | 0.406 | 0.8704 | 0.859 | 8.6201 | 13 | 2 | 13.1616 | 1.31 |
| 0.4511 | 78.0 | 7800 | 2.7418 | 0.4381 | 0.2254 | 0.3993 | 0.4006 | 0.8698 | 0.8593 | 8.6769 | 16 | 4 | 13.3319 | 2.6201 |
| 0.4472 | 79.0 | 7900 | 2.7356 | 0.4332 | 0.2193 | 0.3957 | 0.3956 | 0.8653 | 0.8592 | 9.3231 | 15 | 5 | 14.2227 | 8.7336 |
| 0.4471 | 80.0 | 8000 | 2.6328 | 0.4383 | 0.2207 | 0.4031 | 0.4047 | 0.8682 | 0.86 | 9.0524 | 14 | 4 | 14.0044 | 3.9301 |
| 0.4312 | 81.0 | 8100 | 2.6819 | 0.4153 | 0.2013 | 0.3764 | 0.3774 | 0.862 | 0.852 | 8.7511 | 15 | 2 | 13.6856 | 4.8035 |
| 0.4289 | 82.0 | 8200 | 2.7087 | 0.4327 | 0.2171 | 0.3964 | 0.3971 | 0.8664 | 0.856 | 8.7074 | 15 | 4 | 13.31 | 5.2402 |
| 0.4531 | 83.0 | 8300 | 2.6771 | 0.4282 | 0.2195 | 0.39 | 0.3911 | 0.866 | 0.855 | 8.6507 | 15 | 2 | 13.2576 | 3.9301 |
| 0.4355 | 84.0 | 8400 | 2.6833 | 0.4345 | 0.2162 | 0.3951 | 0.3961 | 0.8659 | 0.8574 | 8.8777 | 14 | 5 | 13.6507 | 4.3668 |
| 0.4511 | 85.0 | 8500 | 2.7157 | 0.4331 | 0.2185 | 0.397 | 0.3983 | 0.8672 | 0.859 | 9.0262 | 15 | 5 | 13.7511 | 3.9301 |
| 0.4383 | 86.0 | 8600 | 2.7073 | 0.4272 | 0.2202 | 0.3929 | 0.3939 | 0.8654 | 0.8566 | 8.8734 | 15 | 5 | 13.5939 | 4.8035 |
| 0.4221 | 87.0 | 8700 | 2.7001 | 0.4325 | 0.2264 | 0.3977 | 0.399 | 0.8667 | 0.8571 | 8.7293 | 14 | 4 | 13.4279 | 3.0568 |
| 0.4395 | 88.0 | 8800 | 2.7394 | 0.4349 | 0.2188 | 0.3961 | 0.3972 | 0.8667 | 0.8591 | 8.8734 | 14 | 4 | 13.7074 | 2.1834 |
| 0.4365 | 89.0 | 8900 | 2.7430 | 0.4368 | 0.2272 | 0.4004 | 0.4013 | 0.867 | 0.8591 | 9.0524 | 15 | 4 | 13.7598 | 5.6769 |
| 0.4501 | 90.0 | 9000 | 2.7777 | 0.4327 | 0.22 | 0.3962 | 0.3972 | 0.8667 | 0.8562 | 8.6812 | 15 | 4 | 13.4323 | 3.0568 |
| 0.4359 | 91.0 | 9100 | 2.7498 | 0.4401 | 0.2273 | 0.4028 | 0.4042 | 0.8668 | 0.861 | 9.3188 | 15 | 4 | 14.0306 | 9.607 |
| 0.4445 | 92.0 | 9200 | 2.7315 | 0.4339 | 0.2214 | 0.3947 | 0.3957 | 0.865 | 0.8594 | 9.1004 | 15 | 4 | 13.8952 | 4.8035 |
| 0.445 | 93.0 | 9300 | 2.7602 | 0.4392 | 0.2258 | 0.3994 | 0.4007 | 0.867 | 0.8601 | 8.9869 | 15 | 4 | 13.7424 | 3.9301 |
| 0.4197 | 94.0 | 9400 | 2.7757 | 0.4431 | 0.2259 | 0.3992 | 0.4007 | 0.8676 | 0.8611 | 9.1485 | 16 | 4 | 13.8646 | 5.2402 |
| 0.4425 | 95.0 | 9500 | 2.7751 | 0.4373 | 0.2202 | 0.3946 | 0.3961 | 0.8661 | 0.86 | 9.1092 | 15 | 3 | 13.8297 | 5.2402 |
| 0.4337 | 96.0 | 9600 | 2.7765 | 0.4426 | 0.227 | 0.4005 | 0.4021 | 0.8681 | 0.8615 | 9.0175 | 15 | 4 | 13.7467 | 4.8035 |
| 0.439 | 97.0 | 9700 | 2.7823 | 0.443 | 0.2272 | 0.4013 | 0.4028 | 0.8685 | 0.8613 | 9.048 | 15 | 4 | 13.7555 | 5.2402 |
| 0.4519 | 98.0 | 9800 | 2.7894 | 0.4446 | 0.2294 | 0.4035 | 0.4046 | 0.8686 | 0.8611 | 8.9956 | 15 | 4 | 13.6507 | 5.2402 |
| 0.4563 | 99.0 | 9900 | 2.7929 | 0.4453 | 0.2327 | 0.4054 | 0.4072 | 0.869 | 0.8618 | 9.0393 | 16 | 4 | 13.7249 | 4.8035 |
| 0.4316 | 100.0 | 10000 | 2.7906 | 0.4431 | 0.2299 | 0.4035 | 0.4054 | 0.8678 | 0.8614 | 9.0699 | 15 | 4 | 13.7991 | 4.8035 |
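
The ROUGE and BERTScore columns are standard metrics from the `evaluate` library, while the word-count columns are custom statistics. A minimal sketch of how both could be recomputed from decoded predictions (function and key names are illustrative, and the default BERTScore model for `lang="en"` may differ from the one used to produce the numbers above):

```python
import evaluate

rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")

def compute_card_metrics(predictions, references):
    """Recompute this card's metric columns from decoded strings (sketch)."""
    scores = dict(rouge.compute(predictions=predictions, references=references))
    bert = bertscore.compute(predictions=predictions, references=references, lang="en")
    scores["bert_precision"] = sum(bert["precision"]) / len(bert["precision"])
    scores["bert_recall"] = sum(bert["recall"]) / len(bert["recall"])

    word_counts = [len(p.split()) for p in predictions]
    scores["average_word_count"] = sum(word_counts) / len(word_counts)
    scores["max_word_count"] = max(word_counts)
    scores["min_word_count"] = min(word_counts)
    # Percentage of shortened texts longer than 12 words (last column above).
    scores["pct_length_gt_12"] = 100.0 * sum(c > 12 for c in word_counts) / len(word_counts)
    # The average token count column would additionally require the model
    # tokenizer; it is omitted here.
    return scores
```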

Framework versions

  • Transformers 4.33.1
  • PyTorch 2.0.1+cu118
  • Datasets 2.14.5
  • Tokenizers 0.13.3

Model tree for ldos/text_shortening_model_v24

  • Base model: google-t5/t5-small