---
license: apache-2.0
base_model: t5-small
tags:
  - generated_from_trainer
datasets:
  - cnn_dailymail
metrics:
  - rouge
model-index:
  - name: cnn_dailymail_t5_small
    results:
      - task:
          name: Sequence-to-sequence Language Modeling
          type: text2text-generation
        dataset:
          name: cnn_dailymail
          type: cnn_dailymail
          config: default
          split: train
          args: default
        metrics:
          - name: Rouge1
            type: rouge
            value: 0.2321
---

# cnn_dailymail_t5_small

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the cnn_dailymail dataset. It achieves the following results on the evaluation set (a short reproduction sketch follows the list):

- Loss: 1.7271
- Rouge1: 0.2321
- Rouge2: 0.0955
- Rougel: 0.1887
- Rougelsum: 0.1887
- Gen Len: 18.9998
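
These ROUGE values are on a 0–1 scale. Below is a minimal sketch of how such scores can be computed with the `evaluate` library; the toy predictions and references are illustrative only, not outputs of this model.

```python
import evaluate

# Illustrative only: toy strings stand in for generated summaries and
# CNN/DailyMail reference highlights.
rouge = evaluate.load("rouge")
predictions = ["the cat sat on the mat"]
references = ["a cat was sitting on the mat"]
scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # dict with rouge1, rouge2, rougeL, rougeLsum
```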

## Model description

T5 (Text-To-Text Transfer Transformer) frames NLP tasks as text-to-text problems. T5-Small is the checkpoint with 60 million parameters.

## Intended uses & limitations

This model is an exercise in fine-tuning the pretrained t5-small model for abstractive summarization; a minimal inference sketch is shown below.
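
The sketch uses the `transformers` summarization pipeline. The repository id below is an assumption inferred from the model name; adjust it to the actual checkpoint location if it differs.

```python
from transformers import pipeline

# Assumed repo id; replace with the actual checkpoint path if it differs.
summarizer = pipeline("summarization", model="chunwoolee0/cnn_dailymail_t5_small")

article = (
    "The tower is 324 metres tall, about the same height as an 81-storey building, "
    "and is the tallest structure in Paris."
)
print(summarizer(article, max_length=60, min_length=5)[0]["summary_text"])
```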

## Training and evaluation data

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after the list):

- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
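
A sketch of how these hyperparameters map onto `Seq2SeqTrainingArguments`; `output_dir`, `evaluation_strategy`, and `predict_with_generate` are assumptions not listed above, and the optimizer values correspond to the Adam defaults.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="cnn_dailymail_t5_small",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    evaluation_strategy="epoch",   # assumption: eval once per epoch, matching the table below
    predict_with_generate=True,    # assumption: generate summaries to compute ROUGE during eval
)
```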

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| 1.9158        | 1.0   | 10000 | 1.7333          | 0.2313 | 0.0948 | 0.1879 | 0.1879    | 18.9998 |
| 1.9316        | 2.0   | 20000 | 1.7271          | 0.2321 | 0.0955 | 0.1887 | 0.1887    | 18.9998 |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3