
KoT5-test-add-data-from5ep

This model is a fine-tuned version of hyorea1/KoT5-test on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1737
  • Rouge1: 11.8294
  • Rouge2: 3.2314
  • Rougel: 11.7891
  • Rougelsum: 11.8237
  • Gen Len: 35.2824
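The ROUGE scores above are F-measures over unigram overlap (Rouge1), bigram overlap (Rouge2), and longest common subsequence (Rougel/Rougelsum), reported on a 0–100 scale; Gen Len is the mean generated sequence length. For intuition only, here is a minimal pure-Python sketch of ROUGE-1 F1 (the `rouge1_f1` helper is illustrative and is not the standard ROUGE implementation that produced the numbers above):

```python
from collections import Counter

def rouge1_f1(prediction: str, reference: str) -> float:
    """ROUGE-1 F1: F-measure over unigram overlap between prediction and reference."""
    pred_tokens = Counter(prediction.split())
    ref_tokens = Counter(reference.split())
    # Clipped overlap: each unigram counts at most as often as it appears in both
    overlap = sum((pred_tokens & ref_tokens).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(pred_tokens.values())
    recall = overlap / sum(ref_tokens.values())
    return 2 * precision * recall / (precision + recall)

# Partial overlap: precision 3/3, recall 3/6 -> F1 = 2/3
print(rouge1_f1("the cat sat", "the cat sat on the mat"))
```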

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 100
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 10
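The listed hyperparameters imply an effective batch size of 8 (per-device batch of 4 × 2 gradient-accumulation steps) and a learning-rate schedule that warms up linearly over the first 10% of steps, then decays along a cosine curve. A plain-Python sketch of that schedule (illustrative only, not the transformers Trainer implementation; `total_steps` is a hypothetical value, since the real step count depends on dataset size):

```python
import math

# Hyperparameters as listed above
learning_rate = 2e-05
train_batch_size = 4
gradient_accumulation_steps = 2

# Effective (total) train batch size: per-device batch * accumulation steps
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 8

def lr_at(step, total_steps, base_lr=learning_rate, warmup_ratio=0.1):
    """Linear warmup over the first warmup_ratio of steps, then cosine decay to 0."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

total_steps = 10_000  # hypothetical; actual value depends on dataset size and num_epochs
print(lr_at(1_000, total_steps))  # peak LR (2e-05) at the end of warmup
print(lr_at(10_000, total_steps))  # decays to 0.0 at the final step
```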

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2 | Rougel  | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:-------:|:---------:|:-------:|
| 1.9029        | 0.16  | 400  | 1.1695          | 12.8243 | 3.2659 | 12.7542 | 12.8276   | 35.5743 |
| 1.7971        | 0.32  | 800  | 1.1646          | 12.259  | 3.0668 | 12.1254 | 12.1927   | 35.2353 |
| 1.4396        | 0.48  | 1200 | 1.1681          | 12.1151 | 3.1908 | 11.9507 | 12.0305   | 35.3125 |
| 1.0945        | 0.64  | 1600 | 1.1703          | 12.0576 | 2.9688 | 11.9292 | 11.9792   | 35.0926 |
| 1.1924        | 0.8   | 2000 | 1.1667          | 11.7835 | 2.9605 | 11.6755 | 11.7318   | 35.3596 |
| 1.3711        | 0.97  | 2400 | 1.1668          | 11.9873 | 3.1107 | 11.9369 | 12.0207   | 34.5309 |
| 1.6031        | 1.13  | 2800 | 1.1673          | 11.6049 | 3.1121 | 11.5527 | 11.5976   | 34.6551 |
| 1.5254        | 1.29  | 3200 | 1.1693          | 11.6803 | 2.8527 | 11.6116 | 11.6829   | 34.8066 |
| 1.641         | 1.45  | 3600 | 1.1737          | 11.8294 | 3.2314 | 11.7891 | 11.8237   | 35.2824 |

Framework versions

  • Transformers 4.25.1
  • Pytorch 1.12.1+cu113
  • Datasets 2.7.1
  • Tokenizers 0.13.2