
# HealthScienceBART

This model is a fine-tuned version of [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn) on an unknown dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

- Loss: 3.7248
- Rouge1: 59.8432
- Rouge2: 25.926
- RougeL: 44.3683
- RougeLsum: 56.3382
- Bertscore Precision: 84.199
- Bertscore Recall: 85.5429
- Bertscore F1: 84.8633
- Bleu: 0.2087
- Gen Len: 234.8216
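
Since the base checkpoint is `facebook/bart-large-cnn`, the model can be loaded with the standard `transformers` summarization pipeline. A minimal sketch, assuming a hypothetical repo id `your-username/HealthScienceBART` (substitute the actual checkpoint path or a local directory):

```python
from transformers import pipeline

# "your-username/HealthScienceBART" is a placeholder repo id; point this at
# the published checkpoint or a local directory containing the weights.
summarizer = pipeline("summarization", model="your-username/HealthScienceBART")

article = (
    "Regular physical activity has been shown to reduce the risk of "
    "cardiovascular disease, type 2 diabetes, and several cancers..."
)

# The average generation length on the evaluation set was ~235 tokens, so a
# generous max_length keeps outputs comparable to the reported metrics.
summary = summarizer(article, max_length=256, min_length=64, do_sample=False)
print(summary[0]["summary_text"])
```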

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 1
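
For reference, these settings map directly onto `transformers` training arguments. A minimal sketch, assuming the usual `Seq2SeqTrainer` setup (the dataset, tokenizer, and model wiring are not published and are omitted here; `output_dir` is a placeholder):

```python
from transformers import Seq2SeqTrainingArguments

# Mirrors the hyperparameters listed above; the Adam beta/epsilon values are
# the library defaults, which match the card.
training_args = Seq2SeqTrainingArguments(
    output_dir="health-science-bart",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=16,  # total train batch size: 1 x 16 = 16
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=1,
    seed=42,
)
```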

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | Bertscore Precision | Bertscore Recall | Bertscore F1 | Bleu | Gen Len |
|:-------------:|:------:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------------------:|:----------------:|:------------:|:------:|:--------:|
| 5.662 | 0.0826 | 100 | 5.4864 | 49.8946 | 18.6145 | 35.6824 | 47.1811 | 80.6966 | 82.5402 | 81.6048 | 0.1476 | 234.8216 |
| 5.2036 | 0.1653 | 200 | 4.9823 | 52.1848 | 20.4176 | 37.3029 | 48.9924 | 81.1422 | 83.2665 | 82.1871 | 0.1634 | 234.8216 |
| 4.7061 | 0.2479 | 300 | 4.6422 | 54.5492 | 21.4905 | 38.8501 | 51.1097 | 82.0428 | 83.8584 | 82.9376 | 0.1730 | 234.8216 |
| 4.657 | 0.3305 | 400 | 4.4252 | 54.072 | 22.1609 | 39.6324 | 50.5966 | 81.9494 | 84.1622 | 83.0371 | 0.1793 | 234.8216 |
| 4.3613 | 0.4131 | 500 | 4.2631 | 56.8149 | 23.0471 | 40.9892 | 53.0419 | 83.0301 | 84.669 | 83.8388 | 0.1871 | 234.8216 |
| 4.2804 | 0.4958 | 600 | 4.1142 | 56.8254 | 23.7321 | 41.7326 | 52.8585 | 82.8372 | 84.8241 | 83.8154 | 0.1915 | 234.8216 |
| 4.2477 | 0.5784 | 700 | 3.9926 | 57.2046 | 23.9303 | 42.3439 | 53.6018 | 83.216 | 84.9845 | 84.0878 | 0.1929 | 234.8216 |
| 4.1188 | 0.6610 | 800 | 3.9193 | 57.9987 | 24.8441 | 43.1811 | 54.4399 | 83.6075 | 85.2031 | 84.395 | 0.1999 | 234.8216 |
| 3.8678 | 0.7436 | 900 | 3.8320 | 59.1683 | 25.1465 | 43.4643 | 55.6762 | 83.9212 | 85.315 | 84.6099 | 0.2019 | 234.8216 |
| 3.8831 | 0.8263 | 1000 | 3.7889 | 59.3948 | 25.4051 | 43.821 | 55.8124 | 84.0802 | 85.4569 | 84.7606 | 0.2044 | 234.8216 |
| 3.7856 | 0.9089 | 1100 | 3.7498 | 59.535 | 25.6124 | 44.1831 | 56.071 | 84.0653 | 85.4796 | 84.7641 | 0.2063 | 234.8216 |
| 3.8875 | 0.9915 | 1200 | 3.7248 | 59.8432 | 25.926 | 44.3683 | 56.3382 | 84.199 | 85.5429 | 84.8633 | 0.2087 | 234.8216 |
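
The metric columns can be reproduced in form with the `evaluate` library (the evaluation dataset itself is not published). A sketch, assuming `predictions` and `references` are lists of decoded summary strings; note that the ROUGE and BERTScore values above are scaled by 100, while `evaluate` returns fractions:

```python
import evaluate

rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")
bleu = evaluate.load("bleu")

# Toy example; in practice these come from model generations and gold summaries.
predictions = ["regular exercise lowers cardiovascular risk"]
references = ["physical activity reduces the risk of cardiovascular disease"]

rouge_scores = rouge.compute(predictions=predictions, references=references)
bert_scores = bertscore.compute(predictions=predictions, references=references, lang="en")
bleu_score = bleu.compute(predictions=predictions, references=references)

print({k: round(v * 100, 4) for k, v in rouge_scores.items()})  # Rouge1/2/L/Lsum
print(sum(bert_scores["f1"]) / len(bert_scores["f1"]))          # Bertscore F1
print(bleu_score["bleu"])                                       # Bleu
```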

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1