
text_shortening_model_v9

This model is a fine-tuned version of t5-small on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.7285
  • ROUGE-1: 0.5919
  • ROUGE-2: 0.3742
  • ROUGE-L: 0.5529
  • ROUGE-Lsum: 0.5532
  • BERT precision: 0.8979
  • BERT recall: 0.9029
  • Average word count: 11.1929
  • Max word count: 17
  • Min word count: 7
  • Average token count: 16.3286
  • % of shortened texts longer than 12 words: 22.1429

Model description

More information needed

Intended uses & limitations

More information needed
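
Pending fuller documentation, the checkpoint can be exercised like any other T5 sequence-to-sequence model. A minimal inference sketch follows; the Hub id comes from this card, but the `summarize:` task prefix and the generation settings are assumptions, since the training preprocessing is not documented here:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "ldos/text_shortening_model_v9"  # Hub id from this card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "Example of a longer sentence that should be shortened by the model."
# T5 fine-tunes often use a task prefix such as "summarize: "; whether this
# model was trained with one is not documented, so treat it as an assumption.
inputs = tokenizer("summarize: " + text, return_tensors="pt", truncation=True)
ids = model.generate(**inputs, max_new_tokens=20, num_beams=4)
print(tokenizer.decode(ids[0], skip_special_tokens=True))
```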

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
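
As a rough guide, these settings map onto `Seq2SeqTrainingArguments` in Transformers 4.32 as sketched below; `output_dir`, `evaluation_strategy`, and `predict_with_generate` are assumptions inferred from the per-epoch evaluation table that follows, not values stated on this card:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="text_shortening_model_v9",  # assumed; not stated on the card
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # the results table logs one row per epoch
    predict_with_generate=True,   # assumed; needed to compute ROUGE on eval
)
```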

Training results

| Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum | BERT precision | BERT recall | Avg word count | Max word count | Min word count | Avg token count | % outputs > 12 words |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.2656 | 1.0 | 16 | 1.6819 | 0.5512 | 0.3185 | 0.4947 | 0.4946 | 0.8804 | 0.8891 | 11.8643 | 18 | 5 | 17.0071 | 45.7143 |
| 1.1187 | 2.0 | 32 | 1.5924 | 0.567 | 0.3403 | 0.5157 | 0.5151 | 0.8857 | 0.8954 | 11.8214 | 18 | 3 | 16.7786 | 45.7143 |
| 1.0753 | 3.0 | 48 | 1.5304 | 0.5832 | 0.3555 | 0.5319 | 0.5304 | 0.8881 | 0.8998 | 11.9571 | 18 | 4 | 17.0357 | 46.4286 |
| 1.0235 | 4.0 | 64 | 1.4952 | 0.5785 | 0.3453 | 0.5277 | 0.527 | 0.8875 | 0.9003 | 11.8857 | 17 | 6 | 17.0286 | 42.8571 |
| 0.9861 | 5.0 | 80 | 1.4627 | 0.5894 | 0.3606 | 0.5388 | 0.5379 | 0.8885 | 0.901 | 11.9429 | 17 | 6 | 17.1929 | 43.5714 |
| 0.9616 | 6.0 | 96 | 1.4499 | 0.59 | 0.3567 | 0.536 | 0.5355 | 0.8877 | 0.9019 | 12.0071 | 18 | 6 | 17.2714 | 42.8571 |
| 0.9193 | 7.0 | 112 | 1.4335 | 0.5912 | 0.3627 | 0.5427 | 0.5419 | 0.8877 | 0.9025 | 11.9786 | 17 | 6 | 17.3571 | 40.7143 |
| 0.8959 | 8.0 | 128 | 1.4193 | 0.5866 | 0.3583 | 0.5346 | 0.5337 | 0.8887 | 0.9016 | 11.7714 | 17 | 6 | 17.1143 | 38.5714 |
| 0.8834 | 9.0 | 144 | 1.4090 | 0.5979 | 0.369 | 0.5469 | 0.5464 | 0.8908 | 0.9042 | 11.7 | 16 | 6 | 17.2071 | 37.8571 |
| 0.8468 | 10.0 | 160 | 1.4035 | 0.5977 | 0.3678 | 0.5473 | 0.5469 | 0.8916 | 0.9048 | 11.7643 | 17 | 6 | 17.2071 | 35.7143 |
| 0.8297 | 11.0 | 176 | 1.3956 | 0.5986 | 0.365 | 0.549 | 0.5475 | 0.8934 | 0.9046 | 11.5857 | 16 | 6 | 16.9429 | 32.8571 |
| 0.8275 | 12.0 | 192 | 1.3934 | 0.6027 | 0.3731 | 0.555 | 0.5551 | 0.8934 | 0.9049 | 11.6143 | 17 | 6 | 16.9286 | 32.8571 |
| 0.8072 | 13.0 | 208 | 1.3915 | 0.5973 | 0.3672 | 0.5484 | 0.5472 | 0.8921 | 0.905 | 11.7214 | 16 | 6 | 17.0857 | 35.7143 |
| 0.7744 | 14.0 | 224 | 1.3972 | 0.6006 | 0.3707 | 0.5544 | 0.5529 | 0.8947 | 0.9051 | 11.5214 | 16 | 6 | 16.8714 | 33.5714 |
| 0.7626 | 15.0 | 240 | 1.3910 | 0.6039 | 0.3745 | 0.5586 | 0.5576 | 0.8962 | 0.9053 | 11.5071 | 16 | 6 | 16.7071 | 36.4286 |
| 0.7564 | 16.0 | 256 | 1.3918 | 0.6046 | 0.3739 | 0.5571 | 0.5563 | 0.8943 | 0.906 | 11.7286 | 17 | 6 | 17.0214 | 40.0 |
| 0.7599 | 17.0 | 272 | 1.3822 | 0.6025 | 0.3753 | 0.5549 | 0.5542 | 0.8939 | 0.9059 | 11.6571 | 16 | 6 | 17.0429 | 35.7143 |
| 0.7331 | 18.0 | 288 | 1.3885 | 0.6019 | 0.3705 | 0.5548 | 0.5539 | 0.8935 | 0.9048 | 11.65 | 16 | 6 | 17.0357 | 34.2857 |
| 0.7227 | 19.0 | 304 | 1.3916 | 0.6084 | 0.3825 | 0.563 | 0.5628 | 0.8991 | 0.9069 | 11.2214 | 16 | 6 | 16.5786 | 27.1429 |
| 0.6906 | 20.0 | 320 | 1.4023 | 0.6065 | 0.3797 | 0.5579 | 0.5579 | 0.8934 | 0.9067 | 11.7714 | 16 | 7 | 17.1357 | 37.1429 |
| 0.6917 | 21.0 | 336 | 1.4052 | 0.6095 | 0.3831 | 0.5621 | 0.5623 | 0.8965 | 0.9072 | 11.4357 | 16 | 6 | 16.7786 | 31.4286 |
| 0.6867 | 22.0 | 352 | 1.4104 | 0.6026 | 0.3807 | 0.5558 | 0.5561 | 0.8928 | 0.9057 | 11.5857 | 16 | 6 | 17.0643 | 31.4286 |
| 0.6995 | 23.0 | 368 | 1.4127 | 0.5999 | 0.3744 | 0.5514 | 0.5511 | 0.8941 | 0.9034 | 11.3571 | 16 | 6 | 16.6714 | 29.2857 |
| 0.6699 | 24.0 | 384 | 1.4217 | 0.6003 | 0.3804 | 0.5558 | 0.5551 | 0.8945 | 0.906 | 11.4714 | 16 | 7 | 16.8857 | 29.2857 |
| 0.6598 | 25.0 | 400 | 1.4344 | 0.5975 | 0.3744 | 0.552 | 0.5517 | 0.8943 | 0.9053 | 11.4429 | 16 | 6 | 16.7857 | 29.2857 |
| 0.6592 | 26.0 | 416 | 1.4340 | 0.6081 | 0.3868 | 0.5617 | 0.5614 | 0.8964 | 0.9071 | 11.3786 | 16 | 7 | 16.8 | 27.8571 |
| 0.6651 | 27.0 | 432 | 1.4375 | 0.6005 | 0.3741 | 0.553 | 0.553 | 0.8947 | 0.9042 | 11.3714 | 16 | 6 | 16.7071 | 28.5714 |
| 0.6409 | 28.0 | 448 | 1.4511 | 0.5977 | 0.3713 | 0.5508 | 0.5508 | 0.8959 | 0.9033 | 11.05 | 16 | 6 | 16.45 | 22.1429 |
| 0.6373 | 29.0 | 464 | 1.4670 | 0.5918 | 0.3655 | 0.5426 | 0.5426 | 0.8933 | 0.9026 | 11.3429 | 16 | 7 | 16.8071 | 25.7143 |
| 0.6284 | 30.0 | 480 | 1.4591 | 0.5973 | 0.3782 | 0.5497 | 0.5498 | 0.8947 | 0.904 | 11.3 | 16 | 7 | 16.8 | 24.2857 |
| 0.6214 | 31.0 | 496 | 1.4709 | 0.5987 | 0.3806 | 0.5543 | 0.5543 | 0.8963 | 0.9041 | 11.2214 | 16 | 6 | 16.6714 | 25.7143 |
| 0.6086 | 32.0 | 512 | 1.4839 | 0.5874 | 0.3667 | 0.5442 | 0.5434 | 0.8942 | 0.9016 | 11.1357 | 16 | 6 | 16.5429 | 26.4286 |
| 0.6102 | 33.0 | 528 | 1.4852 | 0.5928 | 0.3746 | 0.5479 | 0.5474 | 0.8954 | 0.9022 | 11.1286 | 16 | 6 | 16.5071 | 24.2857 |
| 0.6118 | 34.0 | 544 | 1.4869 | 0.5962 | 0.3766 | 0.5488 | 0.5486 | 0.8948 | 0.9035 | 11.4 | 16 | 7 | 16.7643 | 27.1429 |
| 0.605 | 35.0 | 560 | 1.4881 | 0.5943 | 0.3746 | 0.5461 | 0.5457 | 0.8942 | 0.9019 | 11.3143 | 16 | 7 | 16.7929 | 26.4286 |
| 0.6039 | 36.0 | 576 | 1.4854 | 0.5903 | 0.3716 | 0.5431 | 0.5431 | 0.8957 | 0.9014 | 11.1 | 16 | 7 | 16.45 | 24.2857 |
| 0.5777 | 37.0 | 592 | 1.4901 | 0.5922 | 0.3685 | 0.5461 | 0.546 | 0.8943 | 0.9042 | 11.3786 | 16 | 7 | 16.8143 | 26.4286 |
| 0.5634 | 38.0 | 608 | 1.4975 | 0.594 | 0.3721 | 0.5454 | 0.5446 | 0.8958 | 0.9019 | 11.0929 | 16 | 7 | 16.4286 | 22.8571 |
| 0.5794 | 39.0 | 624 | 1.5088 | 0.5963 | 0.3792 | 0.5515 | 0.5508 | 0.896 | 0.9026 | 11.2429 | 16 | 7 | 16.55 | 24.2857 |
| 0.5825 | 40.0 | 640 | 1.5150 | 0.5951 | 0.3736 | 0.5512 | 0.5502 | 0.895 | 0.9031 | 11.3786 | 16 | 6 | 16.6643 | 27.8571 |
| 0.5632 | 41.0 | 656 | 1.5230 | 0.5998 | 0.3731 | 0.5571 | 0.5561 | 0.9 | 0.9037 | 11.0714 | 16 | 6 | 16.1214 | 22.1429 |
| 0.5544 | 42.0 | 672 | 1.5356 | 0.6036 | 0.3798 | 0.5628 | 0.5628 | 0.8987 | 0.9046 | 11.2143 | 16 | 7 | 16.3143 | 22.8571 |
| 0.5672 | 43.0 | 688 | 1.5493 | 0.5944 | 0.3671 | 0.5502 | 0.5504 | 0.8954 | 0.9024 | 11.3786 | 16 | 7 | 16.6 | 25.0 |
| 0.551 | 44.0 | 704 | 1.5563 | 0.5859 | 0.362 | 0.543 | 0.5426 | 0.8957 | 0.9002 | 11.1214 | 15 | 7 | 16.35 | 23.5714 |
| 0.543 | 45.0 | 720 | 1.5601 | 0.592 | 0.3643 | 0.5484 | 0.5481 | 0.8968 | 0.9014 | 11.0929 | 17 | 7 | 16.3 | 22.8571 |
| 0.5352 | 46.0 | 736 | 1.5680 | 0.6039 | 0.3783 | 0.5618 | 0.5614 | 0.8987 | 0.905 | 11.1929 | 17 | 7 | 16.4071 | 23.5714 |
| 0.528 | 47.0 | 752 | 1.5732 | 0.595 | 0.3721 | 0.5562 | 0.5558 | 0.8968 | 0.9024 | 11.1643 | 17 | 7 | 16.3714 | 25.0 |
| 0.528 | 48.0 | 768 | 1.5749 | 0.5933 | 0.372 | 0.5539 | 0.5537 | 0.896 | 0.9026 | 11.2643 | 17 | 7 | 16.4857 | 25.7143 |
| 0.5296 | 49.0 | 784 | 1.5795 | 0.596 | 0.3726 | 0.554 | 0.5541 | 0.897 | 0.904 | 11.2571 | 17 | 7 | 16.4571 | 26.4286 |
| 0.5235 | 50.0 | 800 | 1.5828 | 0.5916 | 0.3679 | 0.5484 | 0.548 | 0.8951 | 0.9019 | 11.2643 | 17 | 7 | 16.4571 | 27.1429 |
| 0.5168 | 51.0 | 816 | 1.5879 | 0.5917 | 0.368 | 0.5473 | 0.5465 | 0.8962 | 0.9006 | 11.1857 | 17 | 7 | 16.2286 | 25.7143 |
| 0.5133 | 52.0 | 832 | 1.5932 | 0.5928 | 0.3665 | 0.5473 | 0.5465 | 0.8979 | 0.9018 | 11.1643 | 17 | 7 | 16.2643 | 21.4286 |
| 0.5036 | 53.0 | 848 | 1.6016 | 0.5927 | 0.3703 | 0.5508 | 0.5511 | 0.8949 | 0.9012 | 11.3286 | 17 | 7 | 16.4143 | 26.4286 |
| 0.492 | 54.0 | 864 | 1.6074 | 0.5922 | 0.37 | 0.5496 | 0.5493 | 0.8953 | 0.9021 | 11.3643 | 17 | 7 | 16.5214 | 26.4286 |
| 0.5184 | 55.0 | 880 | 1.6153 | 0.5953 | 0.3714 | 0.5542 | 0.5536 | 0.8963 | 0.9027 | 11.3 | 17 | 7 | 16.5 | 24.2857 |
| 0.5057 | 56.0 | 896 | 1.6311 | 0.5874 | 0.3636 | 0.5424 | 0.5425 | 0.896 | 0.9009 | 11.0857 | 17 | 7 | 16.2429 | 21.4286 |
| 0.5053 | 57.0 | 912 | 1.6356 | 0.5835 | 0.3623 | 0.5411 | 0.5408 | 0.8953 | 0.8996 | 11.1929 | 17 | 7 | 16.3143 | 25.7143 |
| 0.5016 | 58.0 | 928 | 1.6342 | 0.5908 | 0.3679 | 0.5475 | 0.5472 | 0.8966 | 0.9011 | 11.1214 | 17 | 7 | 16.2929 | 23.5714 |
| 0.4921 | 59.0 | 944 | 1.6312 | 0.5899 | 0.3719 | 0.5495 | 0.549 | 0.8966 | 0.9006 | 11.0429 | 17 | 7 | 16.1929 | 25.0 |
| 0.5051 | 60.0 | 960 | 1.6316 | 0.5989 | 0.3766 | 0.5572 | 0.5566 | 0.8964 | 0.9045 | 11.3214 | 17 | 7 | 16.6643 | 25.7143 |
| 0.4938 | 61.0 | 976 | 1.6377 | 0.6007 | 0.3812 | 0.5581 | 0.5578 | 0.898 | 0.903 | 11.1214 | 17 | 7 | 16.2357 | 25.0 |
| 0.4843 | 62.0 | 992 | 1.6437 | 0.5981 | 0.3844 | 0.5597 | 0.5595 | 0.8965 | 0.9033 | 11.1714 | 17 | 7 | 16.3286 | 26.4286 |
| 0.4894 | 63.0 | 1008 | 1.6473 | 0.594 | 0.3718 | 0.5525 | 0.5523 | 0.8951 | 0.903 | 11.2857 | 17 | 7 | 16.5071 | 28.5714 |
| 0.4956 | 64.0 | 1024 | 1.6549 | 0.5843 | 0.37 | 0.5449 | 0.5447 | 0.895 | 0.8995 | 11.0929 | 17 | 7 | 16.2 | 25.7143 |
| 0.4852 | 65.0 | 1040 | 1.6543 | 0.5947 | 0.3742 | 0.5553 | 0.555 | 0.8958 | 0.9024 | 11.35 | 17 | 7 | 16.55 | 27.8571 |
| 0.489 | 66.0 | 1056 | 1.6558 | 0.5922 | 0.3751 | 0.5546 | 0.5544 | 0.896 | 0.9014 | 11.1357 | 17 | 7 | 16.2857 | 25.7143 |
| 0.4852 | 67.0 | 1072 | 1.6619 | 0.591 | 0.376 | 0.5522 | 0.5523 | 0.8959 | 0.9016 | 11.1571 | 17 | 7 | 16.2571 | 23.5714 |
| 0.4847 | 68.0 | 1088 | 1.6699 | 0.5913 | 0.3781 | 0.556 | 0.5556 | 0.8969 | 0.901 | 11.0214 | 17 | 7 | 16.1357 | 22.8571 |
| 0.4685 | 69.0 | 1104 | 1.6720 | 0.5909 | 0.3755 | 0.5516 | 0.5517 | 0.8961 | 0.9015 | 11.2571 | 17 | 7 | 16.35 | 25.0 |
| 0.4799 | 70.0 | 1120 | 1.6734 | 0.586 | 0.3654 | 0.5448 | 0.5454 | 0.8937 | 0.8998 | 11.25 | 17 | 7 | 16.3214 | 24.2857 |
| 0.4781 | 71.0 | 1136 | 1.6765 | 0.5844 | 0.3634 | 0.5429 | 0.5428 | 0.8927 | 0.8996 | 11.35 | 17 | 7 | 16.4929 | 26.4286 |
| 0.4843 | 72.0 | 1152 | 1.6814 | 0.5864 | 0.3619 | 0.5426 | 0.5432 | 0.8928 | 0.9006 | 11.4286 | 17 | 7 | 16.5929 | 27.8571 |
| 0.4658 | 73.0 | 1168 | 1.6846 | 0.5888 | 0.3628 | 0.5431 | 0.5437 | 0.8941 | 0.9001 | 11.3214 | 17 | 7 | 16.4429 | 25.7143 |
| 0.4664 | 74.0 | 1184 | 1.6899 | 0.5885 | 0.3692 | 0.5473 | 0.5473 | 0.8949 | 0.9 | 11.1786 | 17 | 7 | 16.3143 | 22.1429 |
| 0.4805 | 75.0 | 1200 | 1.6954 | 0.5915 | 0.3765 | 0.5506 | 0.5511 | 0.8956 | 0.9013 | 11.2286 | 17 | 7 | 16.3643 | 23.5714 |
| 0.4708 | 76.0 | 1216 | 1.6964 | 0.5888 | 0.37 | 0.5479 | 0.5483 | 0.8964 | 0.9004 | 11.0571 | 17 | 7 | 16.1929 | 21.4286 |
| 0.4483 | 77.0 | 1232 | 1.6968 | 0.5881 | 0.3669 | 0.5455 | 0.5457 | 0.8954 | 0.8999 | 11.1214 | 17 | 7 | 16.2857 | 22.8571 |
| 0.4699 | 78.0 | 1248 | 1.6993 | 0.5908 | 0.369 | 0.5477 | 0.5481 | 0.8957 | 0.9015 | 11.1786 | 15 | 7 | 16.3857 | 24.2857 |
| 0.4657 | 79.0 | 1264 | 1.7014 | 0.5927 | 0.3734 | 0.5528 | 0.553 | 0.8971 | 0.9021 | 11.1429 | 15 | 7 | 16.3214 | 22.8571 |
| 0.4616 | 80.0 | 1280 | 1.7063 | 0.5919 | 0.3743 | 0.5531 | 0.5533 | 0.8975 | 0.9009 | 11.0714 | 15 | 7 | 16.25 | 20.7143 |
| 0.4706 | 81.0 | 1296 | 1.7087 | 0.5933 | 0.3728 | 0.5521 | 0.5525 | 0.8976 | 0.9015 | 11.0643 | 15 | 7 | 16.2429 | 21.4286 |
| 0.4557 | 82.0 | 1312 | 1.7109 | 0.5917 | 0.3717 | 0.5517 | 0.5515 | 0.8971 | 0.902 | 11.1429 | 17 | 7 | 16.35 | 22.8571 |
| 0.474 | 83.0 | 1328 | 1.7164 | 0.5918 | 0.3714 | 0.5507 | 0.5509 | 0.8967 | 0.9024 | 11.2357 | 17 | 7 | 16.4143 | 24.2857 |
| 0.4715 | 84.0 | 1344 | 1.7165 | 0.591 | 0.3717 | 0.5522 | 0.5533 | 0.8975 | 0.9025 | 11.1071 | 17 | 7 | 16.2857 | 22.8571 |
| 0.462 | 85.0 | 1360 | 1.7159 | 0.5892 | 0.3708 | 0.5479 | 0.5481 | 0.896 | 0.9021 | 11.2071 | 17 | 7 | 16.3714 | 23.5714 |
| 0.455 | 86.0 | 1376 | 1.7171 | 0.5943 | 0.379 | 0.5551 | 0.5559 | 0.898 | 0.9031 | 11.1929 | 17 | 7 | 16.3429 | 23.5714 |
| 0.4613 | 87.0 | 1392 | 1.7173 | 0.5894 | 0.371 | 0.5501 | 0.5507 | 0.8967 | 0.9018 | 11.2 | 17 | 7 | 16.3571 | 22.8571 |
| 0.4663 | 88.0 | 1408 | 1.7191 | 0.5895 | 0.3705 | 0.5505 | 0.5509 | 0.8968 | 0.9018 | 11.1857 | 17 | 7 | 16.3429 | 22.1429 |
| 0.4662 | 89.0 | 1424 | 1.7213 | 0.5893 | 0.3692 | 0.5498 | 0.5501 | 0.8961 | 0.9012 | 11.2214 | 17 | 7 | 16.3714 | 23.5714 |
| 0.4352 | 90.0 | 1440 | 1.7202 | 0.5886 | 0.3696 | 0.549 | 0.5498 | 0.8963 | 0.9015 | 11.2214 | 17 | 7 | 16.3714 | 23.5714 |
| 0.4567 | 91.0 | 1456 | 1.7193 | 0.5885 | 0.373 | 0.5509 | 0.5516 | 0.8968 | 0.9022 | 11.1929 | 17 | 7 | 16.3429 | 23.5714 |
| 0.4421 | 92.0 | 1472 | 1.7211 | 0.5885 | 0.3734 | 0.5498 | 0.5505 | 0.8962 | 0.9022 | 11.2429 | 17 | 7 | 16.3857 | 24.2857 |
| 0.4655 | 93.0 | 1488 | 1.7230 | 0.5925 | 0.3763 | 0.5537 | 0.5538 | 0.8977 | 0.9029 | 11.1929 | 17 | 7 | 16.35 | 23.5714 |
| 0.4431 | 94.0 | 1504 | 1.7246 | 0.5912 | 0.3765 | 0.5529 | 0.5531 | 0.898 | 0.903 | 11.1929 | 17 | 7 | 16.3286 | 22.8571 |
| 0.4493 | 95.0 | 1520 | 1.7258 | 0.5921 | 0.3756 | 0.5531 | 0.5535 | 0.8979 | 0.903 | 11.2357 | 17 | 7 | 16.3714 | 22.8571 |
| 0.4546 | 96.0 | 1536 | 1.7272 | 0.5918 | 0.375 | 0.5529 | 0.5533 | 0.8978 | 0.9029 | 11.2357 | 17 | 7 | 16.3643 | 23.5714 |
| 0.4558 | 97.0 | 1552 | 1.7279 | 0.5925 | 0.3744 | 0.5536 | 0.554 | 0.8979 | 0.9029 | 11.2071 | 17 | 7 | 16.3357 | 22.8571 |
| 0.4575 | 98.0 | 1568 | 1.7281 | 0.592 | 0.3746 | 0.5532 | 0.5533 | 0.8978 | 0.9029 | 11.2 | 17 | 7 | 16.3357 | 22.8571 |
| 0.441 | 99.0 | 1584 | 1.7285 | 0.5919 | 0.3742 | 0.5529 | 0.5532 | 0.8978 | 0.9029 | 11.1929 | 17 | 7 | 16.3286 | 22.1429 |
| 0.4529 | 100.0 | 1600 | 1.7285 | 0.5919 | 0.3742 | 0.5529 | 0.5532 | 0.8979 | 0.9029 | 11.1929 | 17 | 7 | 16.3286 | 22.1429 |
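
For context on how numbers like these are typically produced: the ROUGE and BERTScore columns are the standard metrics from the `evaluate` library, and the word-count columns are simple whitespace statistics over the generated outputs. A minimal sketch follows; the predictions and references are placeholders, since the actual evaluation set is not documented on this card:

```python
import evaluate

rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")

# Placeholder outputs/targets; the real evaluation data is not documented here.
predictions = ["a short version of the first text", "a short second headline"]
references = ["reference shortening of the first text", "second reference"]

rouge_scores = rouge.compute(predictions=predictions, references=references)
bert = bertscore.compute(predictions=predictions, references=references, lang="en")

word_counts = [len(p.split()) for p in predictions]
print(rouge_scores["rouge1"], rouge_scores["rougeL"])             # ROUGE-1, ROUGE-L
print(sum(bert["precision"]) / len(bert["precision"]))            # BERT precision
print(sum(word_counts) / len(word_counts), max(word_counts))      # avg / max words
print(100 * sum(c > 12 for c in word_counts) / len(word_counts))  # % > 12 words
```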

Framework versions

  • Transformers 4.32.1
  • PyTorch 2.0.1+cu118
  • Datasets 2.14.4
  • Tokenizers 0.13.3

Model tree for ldos/text_shortening_model_v9

  • Base model: google-t5/t5-small