
text_shortening_model_v52

This model is a fine-tuned version of t5-small on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 2.5235
  • ROUGE-1: 0.4999
  • ROUGE-2: 0.2774
  • ROUGE-L: 0.4503
  • ROUGE-Lsum: 0.4506
  • BERTScore precision: 0.8767
  • BERTScore recall: 0.8725
  • Average word count: 8.1138
  • Max word count: 16
  • Min word count: 3
  • Average token count: 12.5
  • Shortened texts with length > 12 words: 5.8201%
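The length statistics above (average, max, and min word count, plus the share of outputs longer than 12 words) can be recomputed from any list of generated texts. A minimal sketch with illustrative inputs (the example texts below are not from the evaluation set):

```python
def word_count_stats(texts, limit=12):
    """Length statistics over a list of shortened texts."""
    counts = [len(t.split()) for t in texts]
    return {
        "avg_words": sum(counts) / len(counts),
        "max_words": max(counts),
        "min_words": min(counts),
        # percentage of texts whose word count exceeds `limit`
        "pct_over_limit": 100.0 * sum(c > limit for c in counts) / len(counts),
    }

stats = word_count_stats([
    "short headline here",                                                     # 3 words
    "a much longer shortened text that still exceeds the twelve word limit set",  # 13 words
])
# stats == {"avg_words": 8.0, "max_words": 13, "min_words": 3, "pct_over_limit": 50.0}
```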

Model description

More information needed

Intended uses & limitations

More information needed
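The card gives no usage details, but since this is a fine-tuned t5-small seq2seq checkpoint, inference would follow the standard transformers pattern. A sketch, assuming the checkpoint is published under ldos/text_shortening_model_v52 (the repo id shown at the bottom of this card); the generation settings are an assumption chosen to roughly match the reported output lengths, not something the card documents:

```python
# Assumed generation settings (not stated on the card): outputs here run
# roughly 3-16 words (~12.5 tokens on average), so a small token budget
# plus beam search is a reasonable starting point.
GEN_KWARGS = {"max_new_tokens": 20, "num_beams": 4}

def shorten(text, model_id="ldos/text_shortening_model_v52"):
    # Imported lazily so the sketch can be read without transformers installed.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
    # Note: the fine-tuning input format (e.g. a "summarize: " prefix,
    # as often used with T5) is not documented on this card.
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    out = model.generate(**inputs, **GEN_KWARGS)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```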

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 250
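Assuming this card was produced by the Hugging Face Trainer (its standard argument names are used below), the hyperparameters above map onto a Seq2SeqTrainingArguments-style configuration. A plain-dict sketch for reference:

```python
# The card's hyperparameters, expressed with the Trainer's field names.
# "Adam with betas=(0.9,0.999) and epsilon=1e-08" corresponds to the
# adam_beta1 / adam_beta2 / adam_epsilon arguments.
training_args = {
    "learning_rate": 1e-4,
    "per_device_train_batch_size": 16,
    "per_device_eval_batch_size": 16,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 250,
}
```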

Training results

Training Loss Epoch Step Validation Loss ROUGE-1 ROUGE-2 ROUGE-L ROUGE-Lsum BERTScore precision BERTScore recall Average word count Max word count Min word count Average token count % shortened texts with length > 12
2.1707 1.0 83 1.8170 0.4878 0.2747 0.4466 0.445 0.8688 0.8685 8.6667 17 3 13.0767 12.963
1.7733 2.0 166 1.6790 0.4915 0.2767 0.4462 0.4458 0.8665 0.8724 9.1825 17 4 13.7778 14.8148
1.6346 3.0 249 1.6275 0.5003 0.2755 0.4523 0.4517 0.8706 0.8723 8.8148 17 4 13.4206 12.963
1.5193 4.0 332 1.5648 0.5179 0.2984 0.4686 0.4674 0.8717 0.8795 9.373 17 4 13.8995 15.6085
1.414 5.0 415 1.5338 0.512 0.2843 0.4591 0.4585 0.8721 0.8747 8.8598 17 4 13.3122 13.7566
1.3401 6.0 498 1.4932 0.5067 0.2814 0.4543 0.4539 0.8712 0.874 8.9074 17 4 13.3942 14.0212
1.2739 7.0 581 1.4780 0.5015 0.2726 0.4498 0.4496 0.8717 0.8715 8.5714 16 4 13.0 10.8466
1.2165 8.0 664 1.4522 0.5188 0.2933 0.469 0.469 0.8746 0.8776 8.9021 16 4 13.3519 14.5503
1.1494 9.0 747 1.4381 0.5105 0.2825 0.4587 0.4584 0.8734 0.8746 8.6878 16 4 13.119 12.963
1.0923 10.0 830 1.4354 0.513 0.2855 0.462 0.4618 0.8725 0.8753 9.0053 17 4 13.3624 14.2857
1.0625 11.0 913 1.4224 0.5132 0.2841 0.4645 0.4643 0.8736 0.8766 8.9497 16 4 13.3704 15.3439
1.0264 12.0 996 1.4196 0.4984 0.2716 0.4458 0.4456 0.8701 0.8712 8.8148 16 4 13.172 14.8148
0.9701 13.0 1079 1.4137 0.5065 0.2765 0.4541 0.4537 0.8753 0.8737 8.545 16 4 12.7434 11.1111
0.9328 14.0 1162 1.4155 0.5073 0.2786 0.4543 0.4541 0.8727 0.873 8.6349 16 4 12.9868 11.6402
0.8992 15.0 1245 1.4097 0.5116 0.28 0.4602 0.4604 0.8758 0.875 8.6402 16 4 12.8598 11.6402
0.8526 16.0 1328 1.4446 0.5055 0.276 0.4526 0.4523 0.8721 0.8732 8.746 16 4 13.0741 11.9048
0.8119 17.0 1411 1.4334 0.5084 0.2772 0.4557 0.4552 0.8744 0.8722 8.5212 16 4 12.8333 9.7884
0.8129 18.0 1494 1.4401 0.5039 0.2719 0.4526 0.4518 0.8743 0.8727 8.5238 17 4 12.7222 10.3175
0.7636 19.0 1577 1.4418 0.5037 0.269 0.4493 0.4492 0.8721 0.8726 8.6455 17 4 13.0159 10.0529
0.7277 20.0 1660 1.4461 0.5064 0.2769 0.452 0.452 0.8733 0.8736 8.5423 17 4 12.9788 10.0529
0.7254 21.0 1743 1.4375 0.5059 0.2761 0.4534 0.453 0.8759 0.8732 8.373 17 3 12.7884 8.9947
0.6898 22.0 1826 1.4799 0.5096 0.2825 0.4594 0.4594 0.876 0.8751 8.5026 17 4 12.9841 9.5238
0.6712 23.0 1909 1.4885 0.5164 0.2874 0.4599 0.4602 0.8743 0.8768 8.8624 17 4 13.3175 13.2275
0.6456 24.0 1992 1.4956 0.5168 0.2876 0.4653 0.465 0.8784 0.8763 8.4788 17 4 12.7381 9.2593
0.619 25.0 2075 1.5044 0.5054 0.2762 0.4569 0.4568 0.8765 0.8728 8.2804 17 4 12.6825 8.7302
0.6044 26.0 2158 1.5089 0.511 0.2902 0.4624 0.463 0.8767 0.8737 8.3915 16 4 12.7275 10.0529
0.58 27.0 2241 1.5231 0.511 0.2814 0.4567 0.4574 0.8756 0.8754 8.6984 17 4 12.9868 11.3757
0.5671 28.0 2324 1.5270 0.5095 0.2766 0.4524 0.4529 0.8745 0.8746 8.672 17 4 13.0291 12.1693
0.5547 29.0 2407 1.5530 0.5019 0.27 0.4482 0.4485 0.8737 0.8719 8.4233 17 4 12.8042 8.2011
0.5371 30.0 2490 1.5537 0.5021 0.2693 0.4491 0.4488 0.8737 0.8711 8.3995 16 4 12.7884 9.5238
0.526 31.0 2573 1.5595 0.5024 0.273 0.4481 0.4482 0.8737 0.8722 8.4868 17 4 12.8836 10.0529
0.5067 32.0 2656 1.5753 0.5048 0.2745 0.4535 0.4537 0.8752 0.8717 8.3413 16 4 12.7513 9.2593
0.4954 33.0 2739 1.5977 0.5011 0.2713 0.4523 0.4523 0.8738 0.8721 8.4815 16 4 12.8545 12.1693
0.4766 34.0 2822 1.6143 0.5083 0.2764 0.4528 0.453 0.8744 0.874 8.6085 17 4 12.955 10.8466
0.4603 35.0 2905 1.6225 0.5058 0.2768 0.4541 0.4541 0.8743 0.8724 8.5397 16 4 12.8624 10.0529
0.449 36.0 2988 1.6171 0.5004 0.2704 0.4484 0.4485 0.8758 0.8717 8.3042 16 4 12.6005 6.8783
0.4299 37.0 3071 1.6431 0.5016 0.2757 0.4508 0.4504 0.8763 0.8705 8.1455 16 4 12.5053 6.8783
0.4177 38.0 3154 1.6731 0.5 0.2695 0.4434 0.4437 0.8736 0.8707 8.2698 16 3 12.664 8.2011
0.4092 39.0 3237 1.6705 0.5041 0.2726 0.4512 0.4506 0.8742 0.8735 8.5106 16 4 12.9524 8.4656
0.3996 40.0 3320 1.6956 0.5066 0.277 0.4538 0.4538 0.876 0.8733 8.2566 16 4 12.6534 7.9365
0.3949 41.0 3403 1.6975 0.5089 0.2773 0.4573 0.4576 0.8759 0.8753 8.4577 16 4 12.9339 8.2011
0.3802 42.0 3486 1.7115 0.4986 0.2676 0.4458 0.4459 0.8739 0.8711 8.3466 17 4 12.8651 8.9947
0.3734 43.0 3569 1.7011 0.5055 0.2793 0.4541 0.454 0.8756 0.8746 8.4127 16 4 12.8413 9.2593
0.356 44.0 3652 1.7285 0.5015 0.2728 0.4469 0.4474 0.8741 0.8727 8.3862 17 4 12.8836 7.9365
0.3438 45.0 3735 1.7326 0.4989 0.2731 0.4426 0.4432 0.8744 0.8709 8.2566 17 4 12.6667 7.9365
0.3412 46.0 3818 1.7328 0.5051 0.2777 0.4519 0.4523 0.8753 0.8731 8.3254 16 4 12.7963 6.6138
0.3404 47.0 3901 1.7651 0.5135 0.2825 0.4584 0.459 0.8746 0.8749 8.6032 16 4 13.0582 10.0529
0.3283 48.0 3984 1.7700 0.5052 0.2763 0.4519 0.453 0.8749 0.873 8.3704 17 4 12.8254 6.8783
0.3206 49.0 4067 1.7754 0.5028 0.2741 0.4481 0.4489 0.8744 0.8728 8.3836 17 4 12.8122 6.0847
0.3116 50.0 4150 1.7851 0.5027 0.2769 0.4491 0.4494 0.8747 0.8727 8.3439 17 4 12.754 6.0847
0.3081 51.0 4233 1.8009 0.4975 0.2673 0.4392 0.4394 0.8727 0.8699 8.3175 17 4 12.746 7.4074
0.2999 52.0 4316 1.8090 0.502 0.2706 0.4464 0.4468 0.8741 0.8716 8.3122 17 4 12.7487 6.3492
0.2925 53.0 4399 1.8183 0.5082 0.2794 0.4535 0.4543 0.8762 0.8734 8.2725 16 4 12.672 7.672
0.2837 54.0 4482 1.8324 0.5101 0.28 0.4525 0.4538 0.8763 0.8741 8.3201 17 4 12.7857 6.6138
0.2763 55.0 4565 1.8444 0.4984 0.2688 0.4437 0.4446 0.8745 0.8703 8.2169 16 4 12.6825 6.0847
0.2663 56.0 4648 1.8723 0.4998 0.2715 0.4451 0.4451 0.874 0.8712 8.3175 16 4 12.7672 7.9365
0.2659 57.0 4731 1.8871 0.5043 0.2818 0.4513 0.4513 0.8754 0.8726 8.2751 16 4 12.7646 6.6138
0.2532 58.0 4814 1.8827 0.5084 0.2848 0.4542 0.4547 0.8769 0.8748 8.254 16 4 12.7672 5.8201
0.2528 59.0 4897 1.8852 0.4997 0.2749 0.4467 0.4472 0.8752 0.8709 8.2037 16 4 12.6376 7.4074
0.2408 60.0 4980 1.9075 0.5026 0.2833 0.452 0.4531 0.8755 0.8726 8.2434 16 4 12.7328 6.8783
0.2405 61.0 5063 1.9030 0.5028 0.2798 0.4508 0.4512 0.8759 0.8725 8.2037 16 4 12.6349 8.4656
0.233 62.0 5146 1.8968 0.5018 0.2775 0.4484 0.4487 0.874 0.8723 8.3598 16 4 12.7989 7.1429
0.2272 63.0 5229 1.9150 0.4976 0.2737 0.4432 0.4437 0.8745 0.8702 8.1243 16 4 12.4788 4.7619
0.2204 64.0 5312 1.9315 0.4971 0.2731 0.443 0.4438 0.8745 0.8698 8.1111 16 4 12.5053 6.8783
0.2222 65.0 5395 1.9321 0.5043 0.2778 0.4455 0.4457 0.8746 0.8719 8.2222 16 4 12.7672 5.8201
0.2179 66.0 5478 1.9581 0.4965 0.2712 0.4403 0.4412 0.8738 0.8691 8.1217 16 4 12.5397 6.6138
0.2083 67.0 5561 1.9525 0.5056 0.2783 0.452 0.4526 0.8755 0.8727 8.2857 16 4 12.7249 6.8783
0.2064 68.0 5644 1.9598 0.5025 0.2715 0.4453 0.4462 0.8746 0.8711 8.2249 16 4 12.5899 7.1429
0.1988 69.0 5727 2.0058 0.5017 0.2793 0.4493 0.4502 0.8757 0.8736 8.2487 16 4 12.8862 6.8783
0.202 70.0 5810 1.9937 0.4954 0.2668 0.4418 0.443 0.8743 0.8697 8.1111 16 4 12.5265 7.9365
0.1886 71.0 5893 2.0146 0.5008 0.2757 0.4477 0.4491 0.8754 0.871 8.1587 16 4 12.5582 7.1429
0.1911 72.0 5976 2.0179 0.4938 0.2703 0.4424 0.4436 0.8741 0.8692 8.119 16 4 12.4603 6.3492
0.1842 73.0 6059 2.0298 0.5016 0.2783 0.4517 0.4523 0.8766 0.8718 8.1058 16 4 12.5397 6.0847
0.1879 74.0 6142 2.0286 0.4974 0.2728 0.4445 0.4446 0.8754 0.8707 8.1958 16 4 12.5529 5.291
0.1848 75.0 6225 2.0334 0.4983 0.2742 0.4447 0.4453 0.875 0.8705 8.1111 16 4 12.5026 5.8201
0.1739 76.0 6308 2.0553 0.4928 0.2695 0.4402 0.4403 0.8749 0.869 8.0026 16 4 12.3836 5.5556
0.1755 77.0 6391 2.0758 0.4987 0.2723 0.4447 0.4449 0.8745 0.8715 8.1852 16 4 12.6693 5.8201
0.1718 78.0 6474 2.0837 0.4978 0.2746 0.4467 0.4473 0.8759 0.8708 8.0503 16 4 12.4101 6.3492
0.1627 79.0 6557 2.0698 0.5026 0.2797 0.4523 0.4522 0.8758 0.8739 8.3016 16 4 12.7143 8.4656
0.173 80.0 6640 2.0829 0.5042 0.2825 0.4542 0.4544 0.876 0.8736 8.2857 16 4 12.7487 7.4074
0.1644 81.0 6723 2.0771 0.4957 0.2718 0.443 0.4438 0.8757 0.871 8.1243 16 4 12.4815 6.0847
0.1613 82.0 6806 2.0779 0.4972 0.2741 0.4486 0.4495 0.876 0.8725 8.2487 16 4 12.6243 6.6138
0.155 83.0 6889 2.1022 0.4997 0.2781 0.4475 0.4481 0.8751 0.8718 8.2884 16 4 12.6958 6.8783
0.1506 84.0 6972 2.1198 0.503 0.2782 0.4495 0.4504 0.8759 0.8728 8.3677 16 4 12.7831 8.4656
0.1526 85.0 7055 2.1269 0.4977 0.2738 0.4472 0.4475 0.8748 0.8721 8.3016 16 4 12.709 7.1429
0.149 86.0 7138 2.1286 0.4966 0.2699 0.4454 0.4458 0.8764 0.8698 8.0053 16 4 12.2751 5.291
0.1472 87.0 7221 2.1412 0.5 0.2788 0.4473 0.4474 0.8758 0.8713 8.1005 16 4 12.4524 6.3492
0.1491 88.0 7304 2.1345 0.4969 0.2782 0.4451 0.4451 0.8753 0.8709 8.119 16 4 12.4947 6.8783
0.1381 89.0 7387 2.1461 0.4964 0.2742 0.4443 0.4442 0.8756 0.8705 8.1296 16 4 12.5053 6.0847
0.1437 90.0 7470 2.1432 0.4992 0.279 0.4451 0.4459 0.8755 0.8712 8.2566 16 4 12.5979 6.3492
0.1404 91.0 7553 2.1540 0.5004 0.2789 0.4486 0.4489 0.8765 0.8719 8.1138 16 4 12.5582 5.291
0.1329 92.0 7636 2.1649 0.5012 0.2804 0.4476 0.4488 0.8765 0.8724 8.1587 16 4 12.5503 5.8201
0.1351 93.0 7719 2.1639 0.4965 0.2757 0.4449 0.4459 0.8752 0.8715 8.1693 15 4 12.4894 7.4074
0.129 94.0 7802 2.1781 0.5063 0.2807 0.4545 0.4552 0.8771 0.8738 8.2646 16 4 12.5688 7.9365
0.1368 95.0 7885 2.1866 0.5007 0.2778 0.4455 0.4456 0.8772 0.8713 8.1058 16 4 12.3704 6.8783
0.1303 96.0 7968 2.1857 0.5038 0.2829 0.4494 0.4496 0.8768 0.8731 8.172 16 3 12.4788 6.8783
0.1258 97.0 8051 2.2108 0.5002 0.2765 0.4461 0.4465 0.8765 0.8724 8.1693 15 4 12.5423 5.8201
0.123 98.0 8134 2.2159 0.5033 0.2838 0.4487 0.4491 0.8771 0.8734 8.2698 16 4 12.6402 7.9365
0.117 99.0 8217 2.2073 0.5101 0.2881 0.4586 0.4584 0.8785 0.875 8.2275 16 4 12.5952 6.8783
0.1231 100.0 8300 2.2047 0.498 0.2775 0.4474 0.4469 0.8756 0.8722 8.328 17 4 12.6905 7.9365
0.12 101.0 8383 2.2128 0.4948 0.2709 0.4437 0.4441 0.8758 0.8714 8.1508 16 4 12.4683 6.6138
0.1135 102.0 8466 2.2282 0.5016 0.2806 0.4505 0.4509 0.8762 0.874 8.3757 17 4 12.7672 8.2011
0.122 103.0 8549 2.2265 0.493 0.2713 0.4437 0.4433 0.8744 0.8697 8.127 17 4 12.5079 5.8201
0.1156 104.0 8632 2.2346 0.4909 0.2684 0.4391 0.439 0.8731 0.8703 8.3519 17 4 12.6931 9.5238
0.1084 105.0 8715 2.2580 0.4891 0.2646 0.437 0.4369 0.8729 0.8697 8.3122 17 4 12.6931 8.2011
0.1122 106.0 8798 2.2623 0.4969 0.2755 0.4443 0.4446 0.8749 0.8719 8.3466 16 4 12.7487 7.4074
0.1131 107.0 8881 2.2554 0.4989 0.2751 0.4481 0.448 0.8758 0.8725 8.2646 16 4 12.7143 6.6138
0.1102 108.0 8964 2.2697 0.5023 0.2787 0.4514 0.4519 0.8768 0.8729 8.2196 16 4 12.6376 7.1429
0.1088 109.0 9047 2.2657 0.4979 0.2778 0.4501 0.4499 0.8746 0.8727 8.4286 16 4 12.8413 8.7302
0.1098 110.0 9130 2.2708 0.4911 0.2698 0.4376 0.4379 0.8749 0.8696 8.0767 16 4 12.4312 6.3492
0.1045 111.0 9213 2.2643 0.5014 0.2778 0.4508 0.4511 0.8758 0.8722 8.1931 16 4 12.5952 6.8783
0.0976 112.0 9296 2.2865 0.5012 0.2802 0.4475 0.4478 0.8761 0.8721 8.1534 16 4 12.5265 7.1429
0.1043 113.0 9379 2.2988 0.5025 0.2858 0.4497 0.4499 0.8767 0.8727 8.2037 16 4 12.5635 7.9365
0.1058 114.0 9462 2.3022 0.5027 0.2833 0.453 0.4535 0.877 0.8734 8.246 16 4 12.6693 8.2011
0.1024 115.0 9545 2.3016 0.498 0.2773 0.4477 0.448 0.8759 0.8721 8.2354 16 4 12.6905 7.9365
0.1049 116.0 9628 2.2934 0.4973 0.2779 0.448 0.4478 0.8758 0.8721 8.1931 16 2 12.6587 7.9365
0.0995 117.0 9711 2.3201 0.4987 0.2766 0.4489 0.4489 0.8766 0.8718 8.1138 16 4 12.5238 6.3492
0.0968 118.0 9794 2.3141 0.492 0.2703 0.4407 0.4403 0.8746 0.8699 8.1138 16 4 12.5079 6.8783
0.095 119.0 9877 2.3291 0.4952 0.2735 0.4456 0.4454 0.8752 0.8703 8.1217 16 2 12.537 6.6138
0.0981 120.0 9960 2.3340 0.4959 0.2765 0.4464 0.446 0.8753 0.8705 8.1429 16 2 12.5556 7.1429
0.0966 121.0 10043 2.3176 0.4953 0.276 0.4464 0.4461 0.875 0.8713 8.1667 16 2 12.5529 6.0847
0.086 122.0 10126 2.3323 0.4999 0.2835 0.4513 0.4516 0.8772 0.8719 8.0688 16 3 12.4577 5.8201
0.092 123.0 10209 2.3340 0.5001 0.2806 0.4512 0.4519 0.8758 0.873 8.2778 16 2 12.7222 7.1429
0.09 124.0 10292 2.3530 0.5054 0.2831 0.4542 0.454 0.8767 0.8743 8.2963 16 2 12.7698 7.1429
0.0935 125.0 10375 2.3389 0.5025 0.2849 0.4526 0.4523 0.8779 0.874 8.1561 16 2 12.5397 7.1429
0.092 126.0 10458 2.3580 0.5057 0.2826 0.4512 0.451 0.8776 0.874 8.2434 16 2 12.5979 7.4074
0.0881 127.0 10541 2.3666 0.5069 0.2839 0.4563 0.4557 0.8775 0.8741 8.2831 16 3 12.6878 7.672
0.0857 128.0 10624 2.3517 0.5041 0.2823 0.4528 0.4526 0.8775 0.8734 8.1746 16 2 12.5661 6.8783
0.0869 129.0 10707 2.3571 0.5009 0.2796 0.4502 0.4502 0.8769 0.8725 8.1217 16 2 12.5 6.3492
0.0818 130.0 10790 2.3604 0.5076 0.2872 0.4591 0.459 0.8792 0.8743 8.0952 16 2 12.4735 6.0847
0.0876 131.0 10873 2.3589 0.4996 0.2815 0.4498 0.4502 0.8766 0.8722 8.1243 16 2 12.5582 5.291
0.0871 132.0 10956 2.3850 0.5 0.2845 0.4512 0.4513 0.877 0.8726 8.1508 16 4 12.5661 6.3492
0.0854 133.0 11039 2.3704 0.4958 0.2757 0.4441 0.4445 0.8766 0.8706 8.0423 15 2 12.4815 5.5556
0.0856 134.0 11122 2.3682 0.4973 0.2754 0.4426 0.4423 0.8746 0.8725 8.2937 16 2 12.7963 7.672
0.0839 135.0 11205 2.3730 0.4978 0.2762 0.4457 0.4455 0.8749 0.8719 8.2566 16 3 12.7196 6.3492
0.0855 136.0 11288 2.3803 0.4942 0.2725 0.4399 0.4401 0.8743 0.8712 8.2381 16 2 12.6878 5.8201
0.0829 137.0 11371 2.3822 0.4959 0.2723 0.4447 0.4445 0.8753 0.8721 8.2566 16 2 12.6587 7.672
0.0832 138.0 11454 2.3868 0.4957 0.2713 0.4438 0.4437 0.8749 0.8714 8.1878 16 3 12.6243 6.3492
0.086 139.0 11537 2.3940 0.4979 0.2744 0.4429 0.4428 0.8754 0.8717 8.1825 16 3 12.582 6.8783
0.0789 140.0 11620 2.3893 0.5057 0.2811 0.4507 0.4511 0.8767 0.8739 8.209 16 3 12.672 6.3492
0.0841 141.0 11703 2.3948 0.5021 0.275 0.4477 0.4476 0.875 0.8725 8.2698 16 3 12.7354 7.1429
0.0784 142.0 11786 2.3889 0.4999 0.2737 0.4464 0.4466 0.8756 0.8727 8.2566 16 2 12.7566 6.3492
0.0752 143.0 11869 2.4188 0.5032 0.2759 0.4477 0.4479 0.8765 0.8729 8.1693 16 2 12.6217 5.291
0.0769 144.0 11952 2.4022 0.4994 0.2743 0.4438 0.4438 0.8753 0.8715 8.2063 16 2 12.6746 6.8783
0.0773 145.0 12035 2.4139 0.5042 0.2819 0.4518 0.4519 0.8772 0.8736 8.209 16 2 12.6243 6.0847
0.0751 146.0 12118 2.4150 0.4909 0.2655 0.4387 0.4386 0.875 0.8694 8.0079 16 2 12.455 3.9683
0.0816 147.0 12201 2.4090 0.4968 0.2711 0.443 0.4437 0.8752 0.8715 8.1667 17 2 12.6058 5.5556
0.0775 148.0 12284 2.4251 0.5002 0.2752 0.4459 0.4463 0.8764 0.8727 8.1878 16 2 12.6693 5.8201
0.0761 149.0 12367 2.4093 0.5032 0.2805 0.4516 0.4512 0.8767 0.8741 8.2011 15 3 12.6243 4.7619
0.078 150.0 12450 2.4119 0.4981 0.2737 0.4484 0.4488 0.8767 0.8726 8.127 16 2 12.5688 5.5556
0.0791 151.0 12533 2.4017 0.4959 0.2723 0.4462 0.4463 0.8756 0.8715 8.1561 16 2 12.5503 5.291
0.0762 152.0 12616 2.4192 0.4995 0.2736 0.4507 0.4503 0.8758 0.8711 8.1243 15 2 12.5688 5.5556
0.0761 153.0 12699 2.4085 0.5022 0.2781 0.4529 0.4524 0.8759 0.8728 8.2804 16 3 12.6667 6.3492
0.0738 154.0 12782 2.4299 0.5019 0.2785 0.452 0.4523 0.8771 0.872 8.0952 16 3 12.4894 5.8201
0.0735 155.0 12865 2.4236 0.5002 0.2792 0.4514 0.4515 0.8771 0.872 8.0767 16 3 12.4947 5.291
0.0722 156.0 12948 2.4271 0.5035 0.279 0.4528 0.4527 0.8765 0.8739 8.3413 16 3 12.7011 7.4074
0.069 157.0 13031 2.4337 0.4978 0.2731 0.4473 0.4475 0.8748 0.8714 8.2937 16 3 12.6402 6.8783
0.0737 158.0 13114 2.4321 0.4962 0.2722 0.4441 0.4444 0.8762 0.8714 8.0926 16 3 12.4815 5.8201
0.0673 159.0 13197 2.4347 0.4967 0.2748 0.4446 0.4438 0.8748 0.8718 8.1905 16 3 12.6111 5.5556
0.0713 160.0 13280 2.4431 0.5017 0.2771 0.4508 0.4509 0.8764 0.8729 8.2011 16 3 12.6058 6.0847
0.0757 161.0 13363 2.4656 0.4969 0.2737 0.4458 0.4459 0.8759 0.8711 8.119 16 3 12.5582 5.5556
0.0756 162.0 13446 2.4435 0.4978 0.2747 0.4454 0.445 0.8749 0.8718 8.2619 16 3 12.6746 6.6138
0.0738 163.0 13529 2.4417 0.5045 0.2824 0.4522 0.4523 0.8765 0.8736 8.2302 16 3 12.6508 5.8201
0.0656 164.0 13612 2.4491 0.5009 0.2757 0.4494 0.4497 0.8763 0.8725 8.164 16 3 12.6032 5.5556
0.0726 165.0 13695 2.4493 0.4994 0.2773 0.4475 0.4478 0.8759 0.8728 8.2407 16 3 12.6508 6.6138
0.0704 166.0 13778 2.4400 0.4982 0.2766 0.4479 0.4478 0.8761 0.8717 8.0952 16 3 12.5344 4.7619
0.0681 167.0 13861 2.4468 0.499 0.2753 0.4469 0.4469 0.8762 0.8717 8.1455 16 3 12.5952 5.0265
0.0646 168.0 13944 2.4546 0.497 0.2754 0.445 0.4447 0.8746 0.8717 8.2116 16 3 12.6878 5.0265
0.0712 169.0 14027 2.4622 0.4959 0.2714 0.442 0.4424 0.8748 0.8713 8.2116 16 3 12.6138 5.5556
0.0724 170.0 14110 2.4731 0.4998 0.2729 0.4445 0.4443 0.8754 0.8715 8.1746 16 3 12.5635 5.0265
0.0689 171.0 14193 2.4743 0.4977 0.273 0.4452 0.4451 0.8749 0.8712 8.2037 16 3 12.6032 5.5556
0.0684 172.0 14276 2.4612 0.5047 0.2798 0.4489 0.4492 0.8761 0.874 8.3333 16 3 12.7857 6.6138
0.069 173.0 14359 2.4644 0.4997 0.2778 0.4492 0.4499 0.8757 0.8726 8.2275 16 3 12.6905 6.0847
0.0703 174.0 14442 2.4594 0.498 0.2722 0.4442 0.4442 0.8748 0.8726 8.2407 16 3 12.6931 5.5556
0.0685 175.0 14525 2.4617 0.4988 0.2762 0.448 0.4484 0.8761 0.8725 8.164 16 3 12.5873 5.8201
0.0647 176.0 14608 2.4674 0.4998 0.2777 0.4481 0.4482 0.8764 0.872 8.1243 16 3 12.5397 5.0265
0.065 177.0 14691 2.4695 0.4998 0.2751 0.4466 0.4467 0.8761 0.8723 8.1746 16 3 12.6005 5.291
0.0622 178.0 14774 2.4708 0.5043 0.283 0.4527 0.4529 0.8773 0.8739 8.1878 16 3 12.5926 5.8201
0.0639 179.0 14857 2.4797 0.4996 0.2745 0.4486 0.4482 0.8756 0.8726 8.164 16 3 12.5952 5.291
0.0646 180.0 14940 2.4685 0.497 0.2757 0.449 0.4486 0.8763 0.8721 8.119 16 3 12.5794 4.4974
0.0625 181.0 15023 2.4856 0.4996 0.2746 0.4486 0.4484 0.8764 0.8723 8.1481 16 3 12.5503 5.0265
0.0653 182.0 15106 2.4901 0.4987 0.2715 0.4473 0.4477 0.8756 0.8723 8.1931 16 3 12.6402 5.8201
0.065 183.0 15189 2.4897 0.5016 0.275 0.4497 0.4494 0.8766 0.8722 8.119 16 2 12.5053 5.291
0.0614 184.0 15272 2.4936 0.4984 0.2738 0.4459 0.4461 0.8758 0.8719 8.1667 16 3 12.5582 5.8201
0.0649 185.0 15355 2.4916 0.4983 0.2731 0.4435 0.4438 0.876 0.8723 8.1931 16 3 12.5926 5.8201
0.0618 186.0 15438 2.4926 0.4992 0.2742 0.4456 0.4456 0.8758 0.8719 8.1958 16 3 12.5899 6.0847
0.0624 187.0 15521 2.4986 0.499 0.2732 0.4452 0.4452 0.8758 0.8722 8.1693 16 3 12.6005 5.291
0.0641 188.0 15604 2.4982 0.5059 0.2806 0.4503 0.4508 0.877 0.8737 8.2063 16 3 12.6005 4.7619
0.0663 189.0 15687 2.4938 0.5031 0.2761 0.4483 0.4488 0.8763 0.8731 8.1614 16 3 12.5847 5.8201
0.0633 190.0 15770 2.4823 0.499 0.278 0.446 0.4457 0.8763 0.8727 8.1481 16 3 12.5608 5.0265
0.0653 191.0 15853 2.4956 0.4983 0.2744 0.4445 0.4446 0.8762 0.8722 8.1217 16 3 12.5741 5.0265
0.0657 192.0 15936 2.4920 0.5034 0.2804 0.4482 0.4476 0.8762 0.8735 8.2566 16 2 12.6693 6.0847
0.0625 193.0 16019 2.4892 0.504 0.2815 0.4499 0.4496 0.8767 0.8735 8.2381 16 2 12.6958 5.8201
0.0621 194.0 16102 2.4890 0.503 0.28 0.45 0.45 0.8765 0.8726 8.1614 16 3 12.5873 5.8201
0.063 195.0 16185 2.4954 0.5013 0.2773 0.4493 0.4487 0.8763 0.8724 8.1984 16 2 12.6323 5.8201
0.0626 196.0 16268 2.4961 0.5001 0.2769 0.4489 0.4493 0.8762 0.872 8.1746 16 2 12.5556 5.8201
0.0623 197.0 16351 2.5007 0.4954 0.271 0.4415 0.4416 0.8749 0.8708 8.2011 16 2 12.537 6.0847
0.0607 198.0 16434 2.5061 0.5023 0.2764 0.4498 0.4497 0.876 0.872 8.1852 16 2 12.5926 6.0847
0.0621 199.0 16517 2.5001 0.4951 0.2746 0.4449 0.4447 0.875 0.8712 8.172 16 2 12.5582 6.0847
0.0581 200.0 16600 2.5074 0.497 0.2725 0.4455 0.4459 0.8757 0.8719 8.2011 15 2 12.5794 5.8201
0.06 201.0 16683 2.5067 0.4948 0.2734 0.4444 0.4445 0.8749 0.8713 8.1958 16 2 12.5661 5.8201
0.0614 202.0 16766 2.5133 0.4941 0.2742 0.445 0.445 0.8752 0.8715 8.1667 16 2 12.5476 6.0847
0.0637 203.0 16849 2.5173 0.4957 0.2739 0.446 0.446 0.8756 0.8717 8.1852 15 2 12.5767 5.291
0.0592 204.0 16932 2.5142 0.4909 0.2726 0.4435 0.4434 0.875 0.871 8.1481 16 2 12.5238 5.5556
0.0594 205.0 17015 2.5157 0.4936 0.2727 0.4458 0.4453 0.8756 0.8715 8.1349 16 2 12.5317 5.5556
0.0584 206.0 17098 2.5139 0.4929 0.2724 0.4439 0.4441 0.8753 0.871 8.172 15 3 12.5688 5.8201
0.0602 207.0 17181 2.5148 0.5 0.2784 0.4503 0.4507 0.8763 0.8726 8.1746 15 3 12.5714 5.5556
0.0623 208.0 17264 2.5022 0.5021 0.2811 0.4545 0.4546 0.8773 0.8733 8.1534 15 3 12.5582 5.291
0.0599 209.0 17347 2.5013 0.4986 0.2768 0.45 0.4502 0.8763 0.8725 8.1825 16 3 12.6032 5.5556
0.0564 210.0 17430 2.5135 0.501 0.2793 0.4518 0.4523 0.8764 0.8728 8.2063 16 3 12.672 5.8201
0.0576 211.0 17513 2.5129 0.4994 0.2762 0.4501 0.4497 0.8754 0.8723 8.2063 15 3 12.6852 5.5556
0.0624 212.0 17596 2.5019 0.4988 0.2755 0.4497 0.4496 0.8753 0.8719 8.1852 15 4 12.619 5.5556
0.0549 213.0 17679 2.5068 0.4993 0.2775 0.4499 0.45 0.8758 0.8724 8.1958 16 4 12.672 6.3492
0.0599 214.0 17762 2.5078 0.4965 0.2741 0.4464 0.4463 0.8753 0.8719 8.1746 17 2 12.582 6.0847
0.0595 215.0 17845 2.5134 0.5021 0.281 0.4527 0.453 0.8766 0.8732 8.1667 16 4 12.6138 5.8201
0.0606 216.0 17928 2.5134 0.5025 0.2804 0.4548 0.4553 0.8768 0.873 8.1376 16 4 12.582 5.5556
0.0593 217.0 18011 2.5090 0.5043 0.2818 0.4547 0.4551 0.8769 0.8734 8.1614 16 4 12.582 5.8201
0.0617 218.0 18094 2.5083 0.4994 0.2755 0.449 0.4491 0.8762 0.8728 8.1852 16 3 12.582 5.8201
0.059 219.0 18177 2.5082 0.4971 0.2731 0.4467 0.4472 0.8759 0.8721 8.1296 16 2 12.5132 5.5556
0.0592 220.0 18260 2.5075 0.4973 0.2754 0.4463 0.4467 0.8761 0.8729 8.1852 16 2 12.5661 5.8201
0.0595 221.0 18343 2.5078 0.4964 0.2738 0.4462 0.4463 0.8758 0.8727 8.1746 16 2 12.537 5.8201
0.0573 222.0 18426 2.5065 0.4929 0.2705 0.4424 0.4425 0.8757 0.8716 8.1323 16 3 12.4709 5.5556
0.0541 223.0 18509 2.5154 0.4937 0.2705 0.443 0.4431 0.8755 0.8715 8.1243 16 3 12.463 5.8201
0.0589 224.0 18592 2.5163 0.4946 0.2718 0.4437 0.4438 0.8755 0.8718 8.164 16 2 12.5185 5.8201
0.0566 225.0 18675 2.5151 0.4922 0.2703 0.4418 0.4423 0.8754 0.8715 8.1508 16 2 12.5053 5.8201
0.0557 226.0 18758 2.5158 0.4926 0.2687 0.4411 0.4415 0.8752 0.8712 8.1481 16 2 12.5185 5.5556
0.053 227.0 18841 2.5210 0.4928 0.2696 0.4418 0.4425 0.8748 0.8712 8.1614 16 2 12.537 6.0847
0.0583 228.0 18924 2.5222 0.4942 0.2702 0.4446 0.4452 0.8751 0.8712 8.1508 16 3 12.5397 5.291
0.0615 229.0 19007 2.5243 0.4927 0.2697 0.4409 0.4413 0.875 0.871 8.1534 16 2 12.5344 5.291
0.0585 230.0 19090 2.5194 0.4916 0.27 0.4404 0.4407 0.875 0.8706 8.1164 16 3 12.4868 5.291
0.0563 231.0 19173 2.5186 0.4901 0.2673 0.4381 0.4389 0.8741 0.8704 8.1614 16 2 12.5794 5.5556
0.0523 232.0 19256 2.5209 0.4913 0.2681 0.4395 0.4402 0.875 0.8703 8.0979 16 2 12.463 5.291
0.0543 233.0 19339 2.5230 0.4932 0.2715 0.4428 0.444 0.8756 0.8707 8.0688 16 2 12.4471 5.291
0.0561 234.0 19422 2.5236 0.4961 0.274 0.4455 0.4459 0.8762 0.8718 8.1005 16 2 12.4868 5.291
0.0551 235.0 19505 2.5214 0.4959 0.2739 0.4456 0.4462 0.876 0.8717 8.1243 16 2 12.4921 5.291
0.0575 236.0 19588 2.5205 0.4984 0.2751 0.4478 0.4485 0.8763 0.8724 8.1349 16 2 12.5159 5.5556
0.0596 237.0 19671 2.5214 0.4957 0.2726 0.4447 0.4455 0.876 0.8716 8.119 16 2 12.4841 5.5556
0.0546 238.0 19754 2.5209 0.4948 0.2733 0.445 0.4455 0.8761 0.8717 8.0926 16 3 12.4868 5.5556
0.0577 239.0 19837 2.5189 0.4962 0.2724 0.446 0.4463 0.8762 0.8717 8.0661 16 3 12.4497 5.5556
0.0577 240.0 19920 2.5190 0.4969 0.2738 0.4465 0.4467 0.8761 0.8722 8.1349 16 3 12.5423 5.8201
0.0547 241.0 20003 2.5187 0.4961 0.2729 0.4459 0.446 0.8759 0.872 8.1217 16 3 12.5265 5.8201
0.0506 242.0 20086 2.5190 0.4965 0.2735 0.4465 0.4469 0.8761 0.8721 8.1217 16 3 12.5132 5.8201
0.0539 243.0 20169 2.5192 0.4968 0.2743 0.4466 0.4469 0.8762 0.8721 8.1164 16 3 12.4974 6.0847
0.0529 244.0 20252 2.5203 0.496 0.2734 0.4461 0.4464 0.8762 0.8718 8.0899 16 3 12.4683 5.5556
0.0567 245.0 20335 2.5208 0.4971 0.2741 0.4476 0.4472 0.8763 0.8723 8.127 16 3 12.5212 5.5556
0.0563 246.0 20418 2.5219 0.4976 0.2749 0.4476 0.4477 0.8764 0.8722 8.1243 16 3 12.5079 6.0847
0.0542 247.0 20501 2.5230 0.4997 0.2766 0.4495 0.4495 0.8766 0.8724 8.119 16 3 12.5079 5.8201
0.0537 248.0 20584 2.5234 0.5001 0.278 0.4505 0.4509 0.8768 0.8725 8.1138 16 3 12.5 5.8201
0.0573 249.0 20667 2.5236 0.4999 0.2774 0.4503 0.4506 0.8767 0.8725 8.1138 16 3 12.5 5.8201
0.0562 250.0 20750 2.5235 0.4999 0.2774 0.4503 0.4506 0.8767 0.8725 8.1138 16 3 12.5 5.8201

Framework versions

  • Transformers 4.33.1
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.5
  • Tokenizers 0.13.3

Model tree for ldos/text_shortening_model_v52

Base model

google-t5/t5-small