
t5-small-samsum

This model is a fine-tuned version of google-t5/t5-small on the SAMSum dialogue-summarization dataset. It achieves the following results on the evaluation set (a sketch of how such scores are computed follows the list):

  • Loss: 1.6707
  • ROUGE-1: 43.8206
  • ROUGE-2: 19.9652
  • ROUGE-L: 36.0416
  • ROUGE-Lsum: 40.0887
  • Gen Len: 17.0305
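These are the standard ROUGE F-measures scaled to 0-100, as reported by the Hugging Face Trainer. Below is a minimal sketch of computing such scores with the `evaluate` library; the card does not document the exact post-processing used during training, so the stemming option is an assumption.

```python
# Hedged sketch: computing ROUGE scores like those above with the
# `evaluate` library. The training script's exact post-processing is not
# documented in this card, so `use_stemmer=True` is an assumption.
import evaluate

rouge = evaluate.load("rouge")

predictions = ["amanda will bring jerry some cookies tomorrow."]
references = ["Amanda baked cookies and will bring Jerry some tomorrow."]

scores = rouge.compute(predictions=predictions, references=references,
                       use_stemmer=True)
# `evaluate` returns fractions in [0, 1]; the card's values are scaled by 100.
print({k: round(v * 100, 4) for k, v in scores.items()})
```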

Model description

t5-small-samsum is a T5-small encoder-decoder (roughly 60M parameters, stored as F32 safetensors) fine-tuned for abstractive summarization of short, chat-style dialogues.

Intended uses & limitations

The model is meant for summarizing short, informal, multi-turn English conversations such as messenger chats. It is not intended for long documents, and as a small model it can omit or distort details, so summaries should be checked before downstream use. A minimal inference sketch follows.
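A usage sketch with the transformers summarization pipeline; the dialogue text is illustrative, and generation parameters such as `max_length` are assumptions rather than values from the training script:

```python
# Minimal usage sketch for dialogue summarization with this checkpoint.
# The example dialogue and max_length are illustrative assumptions.
from transformers import pipeline

summarizer = pipeline("summarization", model="vignesh-spericorn/t5-small-samsum")

dialogue = (
    "Amanda: I baked cookies. Do you want some?\n"
    "Jerry: Sure!\n"
    "Amanda: I'll bring you some tomorrow :-)"
)
print(summarizer(dialogue, max_length=50)[0]["summary_text"])
```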

Training and evaluation data

The model was fine-tuned on SAMSum, a corpus of roughly 16k messenger-like dialogues paired with human-written summaries; the evaluation results above are reported on its validation split. A loading sketch follows.
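A hedged sketch of loading the SAMSum splits with the `datasets` library; the dataset is not named explicitly in the card, so the `"samsum"` identifier is inferred from the model name:

```python
# Sketch: loading SAMSum with the `datasets` library. The "samsum" id is
# an inference from the model name; newer `datasets` releases may require
# trust_remote_code=True for this script-based dataset.
from datasets import load_dataset

samsum = load_dataset("samsum")
print(samsum)  # train / validation / test splits
print(samsum["train"][0]["dialogue"])
print(samsum["train"][0]["summary"])
```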

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hypothetical reconstruction as Seq2SeqTrainingArguments follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
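A sketch of how these hyperparameters map onto Seq2SeqTrainingArguments. The original training script is not included in this card, so any argument beyond those listed above (output_dir, evaluation strategy, generate-based eval) is an assumption:

```python
# Hypothetical reconstruction of the training configuration from the
# hyperparameters listed above; unlisted arguments are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-samsum",     # assumption
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    evaluation_strategy="epoch",      # assumption: matches the per-epoch table below
    predict_with_generate=True,       # assumption: required for ROUGE during eval
)
```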

Training results

| Training Loss | Epoch | Step  | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:----------:|:-------:|
| 2.016         | 1.0   | 1842  | 1.7872          | 40.6656 | 17.0772 | 33.6487 | 37.3124    | 16.9829 |
| 1.8798        | 2.0   | 3684  | 1.7375          | 42.1059 | 18.6064 | 35.0368 | 38.6458    | 16.7045 |
| 1.8219        | 3.0   | 5526  | 1.7062          | 43.2636 | 19.4321 | 35.6415 | 39.5613    | 16.8266 |
| 1.77          | 4.0   | 7368  | 1.6990          | 43.2211 | 19.5021 | 35.5155 | 39.6933    | 17.1905 |
| 1.7408        | 5.0   | 9210  | 1.6878          | 43.9084 | 19.8501 | 36.2255 | 40.2666    | 16.7766 |
| 1.7113        | 6.0   | 11052 | 1.6816          | 44.0573 | 20.1359 | 36.426  | 40.4933    | 16.9829 |
| 1.692         | 7.0   | 12894 | 1.6771          | 43.9234 | 19.9018 | 36.0759 | 40.1654    | 16.9158 |
| 1.6771        | 8.0   | 14736 | 1.6723          | 43.5824 | 19.8023 | 35.9709 | 39.963     | 16.9731 |
| 1.6604        | 9.0   | 16578 | 1.6718          | 43.8502 | 19.9263 | 36.157  | 40.1653    | 17.0134 |
| 1.6575        | 10.0  | 18420 | 1.6707          | 43.8206 | 19.9652 | 36.0416 | 40.0887    | 17.0305 |

Framework versions

  • Transformers 4.38.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.17.1
  • Tokenizers 0.15.2
