DanSumT5-baseV_38821

This model is a fine-tuned version of Danish-summarisation/DanSumT5-base on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 2.2026
  • Rouge1: 34.9358
  • Rouge2: 11.6813
  • RougeL: 21.4935
  • RougeLsum: 27.4979
  • Gen Len: 126.3262
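
Since the card does not include usage details, here is a minimal inference sketch using the Transformers summarization pipeline. The model ID is the one this card is published under; the generation parameters are illustrative assumptions, not settings from the card.

```python
# Minimal inference sketch; generation parameters are illustrative assumptions.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="emilstabil/DanSumT5-baseV_38821",
)

article = "Indsæt den danske artikeltekst her."  # placeholder Danish input text
summary = summarizer(
    article,
    max_length=128,  # roughly matches the reported Gen Len of ~126 tokens
    do_sample=False,
)
print(summary[0]["summary_text"])
```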

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 8
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
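
A hedged sketch of this configuration expressed as Seq2SeqTrainingArguments (matching the Transformers 4.32.1 API listed below); the output directory and the predict_with_generate flag are assumptions not stated on the card:

```python
# Sketch of the listed hyperparameters as Seq2SeqTrainingArguments.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="DanSumT5-baseV_38821",  # assumed output directory
    learning_rate=2e-05,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    gradient_accumulation_steps=4,  # total train batch size: 2 * 4 = 8
    lr_scheduler_type="linear",
    num_train_epochs=20,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default,
    # which matches the optimizer listed above.
    predict_with_generate=True,  # assumption: needed to compute ROUGE at eval time
)
```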

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len  |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:--------:|
| No log        | 1.0   | 232  | 2.4684          | 33.3966 | 9.9982  | 19.6472 | 27.3865   | 126.8712 |
| No log        | 2.0   | 465  | 2.3905          | 34.2228 | 10.5192 | 20.3584 | 27.4209   | 126.8712 |
| 2.8064        | 3.0   | 697  | 2.3486          | 34.5949 | 11.0682 | 20.8844 | 27.3403   | 126.6738 |
| 2.8064        | 4.0   | 930  | 2.3193          | 34.6865 | 11.0996 | 20.9574 | 27.337    | 126.2318 |
| 2.5767        | 5.0   | 1162 | 2.2963          | 34.3101 | 11.0183 | 20.8461 | 27.155    | 126.721  |
| 2.5767        | 6.0   | 1395 | 2.2774          | 34.9299 | 11.5927 | 21.3549 | 27.7805   | 126.4249 |
| 2.483         | 7.0   | 1627 | 2.2646          | 34.4741 | 11.1383 | 21.2722 | 27.3822   | 126.3004 |
| 2.483         | 8.0   | 1860 | 2.2521          | 34.9384 | 11.2651 | 21.3153 | 27.5792   | 126.9828 |
| 2.4134        | 9.0   | 2092 | 2.2410          | 34.9546 | 11.424  | 21.1427 | 27.6608   | 126.7854 |
| 2.4134        | 10.0  | 2325 | 2.2326          | 34.7566 | 11.5721 | 21.4418 | 27.5167   | 126.7425 |
| 2.3576        | 11.0  | 2557 | 2.2263          | 34.5968 | 11.623  | 21.2384 | 27.365    | 126.4506 |
| 2.3576        | 12.0  | 2790 | 2.2194          | 34.7363 | 11.5612 | 21.47   | 27.6572   | 126.5665 |
| 2.3288        | 13.0  | 3022 | 2.2142          | 34.971  | 11.7203 | 21.49   | 27.7418   | 126.5665 |
| 2.3288        | 14.0  | 3255 | 2.2114          | 34.761  | 11.6621 | 21.3963 | 27.568    | 126.6266 |
| 2.3288        | 15.0  | 3487 | 2.2064          | 34.9197 | 11.5475 | 21.4017 | 27.6388   | 126.3305 |
| 2.2951        | 16.0  | 3720 | 2.2067          | 34.8124 | 11.615  | 21.5177 | 27.605    | 126.3605 |
| 2.2951        | 17.0  | 3952 | 2.2042          | 34.7608 | 11.4738 | 21.3464 | 27.379    | 126.4034 |
| 2.2832        | 18.0  | 4185 | 2.2032          | 34.7593 | 11.6239 | 21.4029 | 27.4669   | 126.2489 |
| 2.2832        | 19.0  | 4417 | 2.2029          | 34.8386 | 11.5919 | 21.4719 | 27.5147   | 126.2318 |
| 2.2571        | 19.96 | 4640 | 2.2026          | 34.9358 | 11.6813 | 21.4935 | 27.4979   | 126.3262 |
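
For reference, here is a minimal sketch of how ROUGE scores like those above can be computed with the `evaluate` library; the predictions and references are placeholders, and the table's values appear to follow the common convention of scaling scores by 100.

```python
# Requires: pip install evaluate rouge_score
import evaluate

rouge = evaluate.load("rouge")
predictions = ["genereret resumé ..."]  # model outputs (placeholders)
references = ["reference-resumé ..."]   # gold summaries (placeholders)

scores = rouge.compute(predictions=predictions, references=references)
# Keys: rouge1, rouge2, rougeL, rougeLsum; scaled by 100 to match the table.
print({k: round(v * 100, 4) for k, v in scores.items()})
```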

Framework versions

  • Transformers 4.32.1
  • PyTorch 2.1.0
  • Datasets 2.12.0
  • Tokenizers 0.13.3