
H2-keywordextractor-finetuned-scope-summarization

This model is a fine-tuned version of transformer3/H2-keywordextractor on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2073
  • Rouge1: 13.0222
  • Rouge2: 10.4851
  • Rougel: 13.0872
  • Rougelsum: 13.1095

Model description

More information needed

Intended uses & limitations

More information needed
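
Pending documented usage, the sketch below shows one plausible way to run inference. The base model is a seq2seq checkpoint, so the standard transformers summarization pipeline should apply; the checkpoint path is a placeholder, not a published Hub id.

```python
from transformers import pipeline

# Placeholder path: point this at the directory (or Hub id) holding this
# fine-tuned checkpoint's weights and tokenizer.
MODEL_PATH = "./H2-keywordextractor-finetuned-scope-summarization"

# The base model (transformer3/H2-keywordextractor) is a seq2seq model,
# so the generic summarization pipeline is assumed to apply here.
summarizer = pipeline("summarization", model=MODEL_PATH)

text = "Replace this with the scope text to be summarized."
print(summarizer(text, max_length=64, min_length=8, do_sample=False))
```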

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5.6e-05
  • train_batch_size: 15
  • eval_batch_size: 15
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
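
For reference, these settings map roughly onto Seq2SeqTrainingArguments as sketched below. The evaluation_strategy and predict_with_generate values are assumptions inferred from the per-epoch ROUGE reporting in the results table, not documented settings.

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative reconstruction of the reported hyperparameters; the exact
# training script is not published, so treat this as a sketch.
training_args = Seq2SeqTrainingArguments(
    output_dir="H2-keywordextractor-finetuned-scope-summarization",
    learning_rate=5.6e-5,
    per_device_train_batch_size=15,
    per_device_eval_batch_size=15,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",   # assumed: metrics below are reported once per epoch
    predict_with_generate=True,    # assumed: ROUGE requires generated summaries
)
```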

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
| 0.8852        | 1.0   | 23   | 0.3103          | 10.3278 | 6.2988  | 10.3528 | 10.3293   |
| 0.2901        | 2.0   | 46   | 0.2825          | 10.8308 | 7.5214  | 10.8428 | 10.8103   |
| 0.2625        | 3.0   | 69   | 0.2711          | 12.0182 | 8.6415  | 12.0115 | 12.0537   |
| 0.2453        | 4.0   | 92   | 0.2550          | 12.9535 | 9.6936  | 12.9952 | 13.0384   |
| 0.2353        | 5.0   | 115  | 0.2464          | 11.2808 | 7.8603  | 11.3196 | 11.281    |
| 0.2338        | 6.0   | 138  | 0.2389          | 12.6604 | 9.6355  | 12.6519 | 12.6377   |
| 0.2183        | 7.0   | 161  | 0.2307          | 13.2591 | 10.6628 | 13.2399 | 13.2554   |
| 0.2143        | 8.0   | 184  | 0.2252          | 13.537  | 11.1632 | 13.5668 | 13.5957   |
| 0.2055        | 9.0   | 207  | 0.2206          | 13.7032 | 11.6575 | 13.7226 | 13.774    |
| 0.2022        | 10.0  | 230  | 0.2158          | 13.7727 | 11.5365 | 13.7404 | 13.8018   |
| 0.1961        | 11.0  | 253  | 0.2166          | 13.4062 | 11.2919 | 13.4698 | 13.4854   |
| 0.2018        | 12.0  | 276  | 0.2116          | 13.8406 | 11.852  | 13.8309 | 13.8995   |
| 0.1946        | 13.0  | 299  | 0.2131          | 12.5757 | 9.5775  | 12.5738 | 12.6535   |
| 0.1943        | 14.0  | 322  | 0.2142          | 11.617  | 9.0291  | 11.5311 | 11.7201   |
| 0.2068        | 15.0  | 345  | 0.2080          | 12.9136 | 10.2865 | 12.9659 | 12.9787   |
| 0.2051        | 16.0  | 368  | 0.2041          | 13.6492 | 11.6388 | 13.6506 | 13.7041   |
| 0.1887        | 17.0  | 391  | 0.2119          | 11.4317 | 8.2482  | 11.386  | 11.4313   |
| 0.1886        | 18.0  | 414  | 0.2097          | 13.0287 | 10.6547 | 13.0829 | 13.118    |
| 0.1887        | 19.0  | 437  | 0.2079          | 13.0073 | 10.5381 | 13.0514 | 13.1089   |
| 0.186         | 20.0  | 460  | 0.2073          | 13.0222 | 10.4851 | 13.0872 | 13.1095   |
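
The ROUGE columns above use the 0-100 convention of Trainer-generated model cards. Below is a minimal sketch of how such scores are typically computed with the evaluate library; the actual compute_metrics function used for this run is not published.

```python
import evaluate

rouge = evaluate.load("rouge")

# Toy inputs; in practice these are the model's generated summaries and the
# reference summaries from the (undocumented) evaluation set.
predictions = ["a generated scope summary"]
references = ["the reference scope summary"]

scores = rouge.compute(predictions=predictions, references=references, use_stemmer=True)

# evaluate returns fractions in [0, 1]; Trainer cards report them scaled by 100,
# e.g. 0.1302 -> Rouge1 = 13.02.
print({k: round(v * 100, 4) for k, v in scores.items()})
```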

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.2.2+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2