t5-base-cnn-dm-on-2000-news

This model is a fine-tuned version of flax-community/t5-base-cnn-dm on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6098
  • Rouge1: 40.1108
  • Rouge2: 35.0247
  • RougeL: 39.2043
  • RougeLsum: 39.4272
  • Gen Len: 19.0

Model description

More information needed

Intended uses & limitations

More information needed
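Pending fuller documentation, a minimal usage sketch for news summarization (the `summarize: ` prefix follows the usual T5 convention; the input length, beam count, and output length below are illustrative assumptions, not values stated in this card):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_ID = "imsumit18/t5-base-cnn-dm-on-2000-news"

def build_input(article: str) -> str:
    # T5 checkpoints are conventionally prompted with a task prefix.
    return "summarize: " + article

def summarize(article: str, max_summary_length: int = 20) -> str:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(
        build_input(article), return_tensors="pt",
        truncation=True, max_length=512,
    )
    # The card's Gen Len of 19.0 suggests short summaries, hence the
    # small max_length; num_beams=4 is an assumption.
    output_ids = model.generate(
        **inputs, max_length=max_summary_length, num_beams=4
    )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```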

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
  • mixed_precision_training: Native AMP
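The settings above map onto `Seq2SeqTrainingArguments` roughly as follows (a sketch against Transformers 4.38; `output_dir` and the per-epoch evaluation strategy are assumptions, and the Adam betas/epsilon listed above are the optimizer's defaults):

```python
from transformers import Seq2SeqTrainingArguments

# output_dir is a placeholder; adjust to taste.
args = Seq2SeqTrainingArguments(
    output_dir="t5-base-cnn-dm-on-2000-news",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,  # "Native AMP" mixed precision
    evaluation_strategy="epoch",  # assumption: matches the per-epoch results table
    predict_with_generate=True,   # needed to compute ROUGE during evaluation
)
```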

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log        | 1.0   | 211  | 0.8550          | 33.8522 | 24.4634 | 31.5244 | 32.1795   | 19.0    |
| No log        | 2.0   | 422  | 0.6559          | 36.5992 | 28.76   | 34.8508 | 35.3186   | 19.0    |
| 0.9852        | 3.0   | 633  | 0.5826          | 38.1153 | 31.7278 | 36.8636 | 37.0955   | 19.0    |
| 0.9852        | 4.0   | 844  | 0.5637          | 38.3865 | 32.5346 | 37.1463 | 37.5175   | 19.0    |
| 0.3875        | 5.0   | 1055 | 0.5694          | 39.6094 | 33.9914 | 38.4403 | 38.6466   | 19.0    |
| 0.3875        | 6.0   | 1266 | 0.5727          | 39.7837 | 34.2137 | 38.6669 | 38.9124   | 19.0    |
| 0.3875        | 7.0   | 1477 | 0.5813          | 39.9572 | 34.6307 | 38.885  | 39.1153   | 19.0    |
| 0.207         | 8.0   | 1688 | 0.5934          | 39.867  | 34.7336 | 38.941  | 39.1977   | 19.0    |
| 0.207         | 9.0   | 1899 | 0.6053          | 40.0643 | 34.9731 | 39.1287 | 39.3602   | 19.0    |
| 0.138         | 10.0  | 2110 | 0.6098          | 40.1108 | 35.0247 | 39.2043 | 39.4272   | 19.0    |

Framework versions

  • Transformers 4.38.1
  • Pytorch 2.1.2
  • Datasets 2.1.0
  • Tokenizers 0.15.2