
t5-small-finetuned-rahul-summariza

This model is a fine-tuned version of t5-small on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7002
  • Rouge1: 29.5043
  • Rouge2: 23.832
  • RougeL: 27.5786
  • RougeLsum: 28.404
  • Gen Len: 19.0

Model description

More information needed

Intended uses & limitations

More information needed
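
No intended uses or limitations are documented for this checkpoint. As a hedged illustration only, the sketch below loads it through the transformers summarization pipeline; the model path is a placeholder, and the T5-style `summarize:` prefix is an assumption about how the model was fine-tuned.

```python
# Minimal inference sketch. The model path is a placeholder for wherever this
# checkpoint is stored (a local directory or a Hub repo id).
from transformers import pipeline

summarizer = pipeline("summarization", model="t5-small-finetuned-rahul-summariza")

# T5 models are often trained with a task prefix; whether this one needs it is an assumption.
text = "summarize: <long input document goes here>"

# max_length is chosen to roughly match the reported Gen Len of ~19 tokens.
summary = summarizer(text, max_length=20, min_length=5)[0]["summary_text"]
print(summary)
```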

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent Seq2SeqTrainingArguments sketch is shown after the list):

  • learning_rate: 5.6e-05
  • train_batch_size: 8
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
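
As referenced above, the sketch below maps these settings onto Seq2SeqTrainingArguments. Only the hyperparameters themselves are reported, so the output directory, evaluation strategy, and generation flag are assumptions.

```python
# Hedged reconstruction of the training configuration from the values above.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-finetuned-rahul-summariza",  # assumed output directory
    learning_rate=5.6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    # The Adam betas/epsilon listed above are the Trainer defaults
    # (adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8), so no override is needed.
    evaluation_strategy="epoch",   # assumed: metrics are reported once per epoch
    predict_with_generate=True,    # assumed: required to compute ROUGE / Gen Len
)
```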

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| 1.123         | 1.0   | 16   | 0.8258          | 27.2788 | 21.3634 | 25.7114 | 26.7324   | 19.0    |
| 0.9067        | 2.0   | 32   | 0.7539          | 28.873  | 23.5401 | 27.2337 | 27.939    | 19.0    |
| 0.8137        | 3.0   | 48   | 0.7280          | 29.1767 | 23.6599 | 27.7065 | 28.3569   | 19.0    |
| 0.7872        | 4.0   | 64   | 0.7230          | 29.0451 | 23.4597 | 27.2762 | 28.1324   | 19.0    |
| 0.7338        | 5.0   | 80   | 0.7133          | 29.4821 | 23.8113 | 27.4912 | 28.326    | 19.0    |
| 0.6913        | 6.0   | 96   | 0.7101          | 29.4237 | 23.8523 | 27.4109 | 28.2418   | 19.0    |
| 0.6679        | 7.0   | 112  | 0.7097          | 29.4237 | 23.8523 | 27.4109 | 28.2418   | 19.0    |
| 0.6963        | 8.0   | 128  | 0.7046          | 29.4237 | 23.8523 | 27.4109 | 28.2418   | 19.0    |
| 0.6223        | 9.0   | 144  | 0.7052          | 29.4237 | 23.7633 | 27.493  | 28.3362   | 19.0    |
| 0.6494        | 10.0  | 160  | 0.7019          | 29.4237 | 23.7633 | 27.493  | 28.3362   | 19.0    |
| 0.616         | 11.0  | 176  | 0.7010          | 29.4237 | 23.7633 | 27.493  | 28.3362   | 19.0    |
| 0.6058        | 12.0  | 192  | 0.7028          | 29.4237 | 23.7633 | 27.493  | 28.3362   | 19.0    |
| 0.5964        | 13.0  | 208  | 0.6996          | 29.4237 | 23.7633 | 27.493  | 28.3362   | 19.0    |
| 0.5958        | 14.0  | 224  | 0.6997          | 29.4237 | 23.7633 | 27.493  | 28.3362   | 19.0    |
| 0.57          | 15.0  | 240  | 0.6996          | 29.5043 | 23.832  | 27.5786 | 28.404    | 19.0    |
| 0.5714        | 16.0  | 256  | 0.6998          | 29.5043 | 23.832  | 27.5786 | 28.404    | 19.0    |
| 0.5648        | 17.0  | 272  | 0.6999          | 29.5043 | 23.832  | 27.5786 | 28.404    | 19.0    |
| 0.5258        | 18.0  | 288  | 0.7005          | 29.5043 | 23.832  | 27.5786 | 28.404    | 19.0    |
| 0.5692        | 19.0  | 304  | 0.7001          | 29.5043 | 23.832  | 27.5786 | 28.404    | 19.0    |
| 0.5708        | 20.0  | 320  | 0.7002          | 29.5043 | 23.832  | 27.5786 | 28.404    | 19.0    |
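
The ROUGE and Gen Len columns above are typically produced by a compute_metrics callback such as the sketch below. The tokenizer choice and the scoring options are assumptions based on the standard transformers summarization setup, not details taken from this card.

```python
# Hedged sketch of how the ROUGE / Gen Len columns are commonly computed.
import numpy as np
import evaluate
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")  # base checkpoint (assumed)
rouge = evaluate.load("rouge")

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    decoded_preds = tokenizer.batch_decode(predictions, skip_special_tokens=True)
    # Labels use -100 for loss masking; swap it for the pad token before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)

    # ROUGE scores are reported on a 0-100 scale, as in the table above.
    result = rouge.compute(
        predictions=decoded_preds, references=decoded_labels, use_stemmer=True
    )
    result = {k: round(v * 100, 4) for k, v in result.items()}

    # "Gen Len" is the mean number of non-padding tokens in the generations.
    gen_lens = [
        np.count_nonzero(pred != tokenizer.pad_token_id) for pred in predictions
    ]
    result["gen_len"] = round(float(np.mean(gen_lens)), 4)
    return result
```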

Framework versions

  • Transformers 4.23.1
  • Pytorch 1.12.1+cu113
  • Datasets 2.7.1
  • Tokenizers 0.13.2