
my_awesome_billsum_model_28

This model is a fine-tuned version of google-t5/t5-small on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the results):

  • Loss: 0.3463
  • Rouge1: 0.9844
  • Rouge2: 0.9417
  • Rougel: 0.9576
  • Rougelsum: 0.9576
  • Gen Len: 5.25
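
The card does not include a usage example. The sketch below shows one way to run the model with the transformers summarization pipeline; it assumes the standard t5-small recipe in which inputs are prefixed with "summarize: ", and the example text is a placeholder, not taken from the card. Given the evaluation generation length of roughly 5 tokens, outputs are expected to be very short.

```python
# A minimal usage sketch (not from the card). Assumes the standard T5
# summarization setup where inputs are prefixed with "summarize: ".
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="limaatulya/my_awesome_billsum_model_28",
)

text = (
    "summarize: The committee reviewed the proposed bill, which allocates "
    "funding for rural broadband expansion over the next five years."
)
print(summarizer(text, max_new_tokens=16)[0]["summary_text"])
```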

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
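
The original training script is not part of the card. The sketch below shows how the hyperparameters above map onto Seq2SeqTrainingArguments in the standard Hugging Face summarization recipe; the tokenized datasets and the compute_metrics callback are placeholders (assumptions), since the card does not describe the data preparation.

```python
# A configuration sketch (not the original training script). Placeholders are
# marked; only the hyperparameter values listed above are taken from the card.
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

checkpoint = "google-t5/t5-small"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

training_args = Seq2SeqTrainingArguments(
    output_dir="my_awesome_billsum_model_28",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                        # mixed_precision_training: Native AMP
    eval_strategy="epoch",            # one evaluation row per epoch, as in the table below
    predict_with_generate=True,       # needed for the ROUGE / Gen Len metrics
)

trainer = Seq2SeqTrainer(
    model=model,
    args=training_args,
    train_dataset=tokenized_train,    # placeholder: dataset not described in the card
    eval_dataset=tokenized_eval,      # placeholder: dataset not described in the card
    tokenizer=tokenizer,
    data_collator=DataCollatorForSeq2Seq(tokenizer=tokenizer, model=model),
    compute_metrics=compute_metrics,  # e.g. ROUGE; see the sketch under "Training results"
)
trainer.train()
```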

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| No log | 1.0 | 12 | 0.3001 | 0.9821 | 0.9347 | 0.9494 | 0.9511 | 5.2708 |
| No log | 2.0 | 24 | 0.3040 | 0.979 | 0.8986 | 0.9355 | 0.9368 | 5.25 |
| No log | 3.0 | 36 | 0.3007 | 0.9814 | 0.9208 | 0.9479 | 0.9487 | 5.2292 |
| No log | 4.0 | 48 | 0.3041 | 0.9814 | 0.9208 | 0.9479 | 0.9487 | 5.2292 |
| No log | 5.0 | 60 | 0.3050 | 0.9814 | 0.9208 | 0.9479 | 0.9487 | 5.2292 |
| No log | 6.0 | 72 | 0.3048 | 0.9814 | 0.9208 | 0.9479 | 0.9487 | 5.2292 |
| No log | 7.0 | 84 | 0.2996 | 0.9814 | 0.9208 | 0.9479 | 0.9487 | 5.2292 |
| No log | 8.0 | 96 | 0.2991 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 9.0 | 108 | 0.3005 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 10.0 | 120 | 0.2967 | 0.9866 | 0.9486 | 0.9628 | 0.9628 | 5.2292 |
| No log | 11.0 | 132 | 0.2947 | 0.9866 | 0.9486 | 0.9628 | 0.9628 | 5.2292 |
| No log | 12.0 | 144 | 0.2935 | 0.9866 | 0.9486 | 0.9628 | 0.9628 | 5.2292 |
| No log | 13.0 | 156 | 0.2947 | 0.9866 | 0.9486 | 0.9628 | 0.9628 | 5.2292 |
| No log | 14.0 | 168 | 0.2950 | 0.9866 | 0.9486 | 0.9628 | 0.9628 | 5.2292 |
| No log | 15.0 | 180 | 0.2873 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 16.0 | 192 | 0.2813 | 0.9866 | 0.9486 | 0.9628 | 0.9628 | 5.2292 |
| No log | 17.0 | 204 | 0.2861 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 18.0 | 216 | 0.2947 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 19.0 | 228 | 0.3042 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 20.0 | 240 | 0.3125 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 21.0 | 252 | 0.3223 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 22.0 | 264 | 0.3225 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 23.0 | 276 | 0.3132 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 24.0 | 288 | 0.3082 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 25.0 | 300 | 0.3109 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 26.0 | 312 | 0.3193 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 27.0 | 324 | 0.3314 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 28.0 | 336 | 0.3288 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 29.0 | 348 | 0.3214 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 30.0 | 360 | 0.3261 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 31.0 | 372 | 0.3247 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 32.0 | 384 | 0.3286 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 33.0 | 396 | 0.3209 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 34.0 | 408 | 0.3167 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 35.0 | 420 | 0.3226 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 36.0 | 432 | 0.3304 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 37.0 | 444 | 0.3320 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 38.0 | 456 | 0.3258 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| No log | 39.0 | 468 | 0.3298 | 0.9844 | 0.9278 | 0.9472 | 0.9479 | 5.25 |
| No log | 40.0 | 480 | 0.3278 | 0.9844 | 0.9278 | 0.9472 | 0.9479 | 5.25 |
| No log | 41.0 | 492 | 0.3314 | 0.9844 | 0.9278 | 0.9472 | 0.9479 | 5.25 |
| 0.0342 | 42.0 | 504 | 0.3370 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 43.0 | 516 | 0.3360 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 44.0 | 528 | 0.3416 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 45.0 | 540 | 0.3348 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 46.0 | 552 | 0.3350 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 47.0 | 564 | 0.3394 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 48.0 | 576 | 0.3381 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 49.0 | 588 | 0.3427 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 50.0 | 600 | 0.3385 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 51.0 | 612 | 0.3376 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 52.0 | 624 | 0.3377 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 53.0 | 636 | 0.3372 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 54.0 | 648 | 0.3492 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 55.0 | 660 | 0.3564 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 56.0 | 672 | 0.3556 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 57.0 | 684 | 0.3441 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 58.0 | 696 | 0.3406 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 59.0 | 708 | 0.3341 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 60.0 | 720 | 0.3333 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 61.0 | 732 | 0.3367 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 62.0 | 744 | 0.3379 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 63.0 | 756 | 0.3366 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 64.0 | 768 | 0.3376 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 65.0 | 780 | 0.3384 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 66.0 | 792 | 0.3444 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 67.0 | 804 | 0.3422 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 68.0 | 816 | 0.3444 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 69.0 | 828 | 0.3407 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 70.0 | 840 | 0.3380 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 71.0 | 852 | 0.3376 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 72.0 | 864 | 0.3442 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 73.0 | 876 | 0.3493 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 74.0 | 888 | 0.3550 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 75.0 | 900 | 0.3600 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 76.0 | 912 | 0.3592 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 77.0 | 924 | 0.3571 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 78.0 | 936 | 0.3584 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 79.0 | 948 | 0.3601 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 80.0 | 960 | 0.3585 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 81.0 | 972 | 0.3552 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 82.0 | 984 | 0.3561 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0342 | 83.0 | 996 | 0.3555 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 84.0 | 1008 | 0.3533 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 85.0 | 1020 | 0.3491 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 86.0 | 1032 | 0.3482 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 87.0 | 1044 | 0.3477 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 88.0 | 1056 | 0.3475 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 89.0 | 1068 | 0.3482 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 90.0 | 1080 | 0.3479 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 91.0 | 1092 | 0.3475 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 92.0 | 1104 | 0.3467 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 93.0 | 1116 | 0.3464 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 94.0 | 1128 | 0.3456 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 95.0 | 1140 | 0.3452 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 96.0 | 1152 | 0.3446 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 97.0 | 1164 | 0.3455 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 98.0 | 1176 | 0.3460 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 99.0 | 1188 | 0.3465 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
| 0.0138 | 100.0 | 1200 | 0.3463 | 0.9844 | 0.9417 | 0.9576 | 0.9576 | 5.25 |
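
The card does not show how these metrics were computed. Below is a sketch of a compute_metrics callback that would produce the Rouge1/Rouge2/Rougel/Rougelsum and Gen Len columns, following the standard Hugging Face summarization example with the evaluate library; it is an assumption about the setup, not code taken from this repository.

```python
# A sketch (assumed, not from the card) of the metric computation behind the
# table above, mirroring the standard summarization fine-tuning example.
import numpy as np
import evaluate
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-small")
rouge = evaluate.load("rouge")

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    decoded_preds = tokenizer.batch_decode(predictions, skip_special_tokens=True)
    # Labels are padded with -100 by the data collator; restore pad tokens before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)

    result = rouge.compute(
        predictions=decoded_preds, references=decoded_labels, use_stemmer=True
    )
    # "Gen Len" is the mean number of non-pad tokens in the generated sequences.
    prediction_lens = [
        np.count_nonzero(pred != tokenizer.pad_token_id) for pred in predictions
    ]
    result["gen_len"] = np.mean(prediction_lens)
    return {k: round(float(v), 4) for k, v in result.items()}
```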

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1

Model size: 60.5M params (F32, Safetensors)

Model tree for limaatulya/my_awesome_billsum_model_28

Base model: google-t5/t5-small (fine-tuned to produce this model)