
bart-paraphrase-finetuned-xsum-v2

This model is a fine-tuned version of eugenesiow/bart-paraphrase on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2329
  • Rouge1: 100.0
  • Rouge2: 100.0
  • RougeL: 100.0
  • RougeLsum: 100.0
  • Gen Len: 9.2619
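
The checkpoint can be loaded with the standard transformers seq2seq API. A minimal usage sketch follows; the hub id is a placeholder (substitute the namespace this model is actually published under):

```python
# pip install transformers==4.19.1 torch
from transformers import BartForConditionalGeneration, BartTokenizer

# Hypothetical hub id: replace <namespace> with the actual owner of this checkpoint.
model_id = "<namespace>/bart-paraphrase-finetuned-xsum-v2"
tokenizer = BartTokenizer.from_pretrained(model_id)
model = BartForConditionalGeneration.from_pretrained(model_id)

inputs = tokenizer("How can I become a better programmer?", return_tensors="pt")
outputs = model.generate(**inputs, num_beams=4, max_length=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```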

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent Seq2SeqTrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
  • mixed_precision_training: Native AMP
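
For reference, a sketch of these settings as Seq2SeqTrainingArguments (transformers 4.19 argument names; evaluation_strategy="epoch" is an assumption inferred from the per-epoch validation results below):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="bart-paraphrase-finetuned-xsum-v2",
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,                     # native AMP mixed-precision training
    evaluation_strategy="epoch",   # assumption: metrics in the results table are per epoch
    predict_with_generate=True,    # required to compute ROUGE / Gen Len during evaluation
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default optimizer.
```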

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len |
|---------------|-------|------|-----------------|---------|---------|---------|-----------|---------|
| No log        | 1.0   | 21   | 1.2954          | 66.7012 | 60.8612 | 66.5163 | 66.4352   | 13.2857 |
| No log        | 2.0   | 42   | 0.6866          | 86.8284 | 82.7835 | 86.7208 | 86.784    | 9.5238  |
| No log        | 3.0   | 63   | 0.4652          | 95.1892 | 93.5619 | 95.2567 | 95.1657   | 10.3095 |
| No log        | 4.0   | 84   | 0.4280          | 97.7463 | 97.1782 | 97.8708 | 97.718    | 9.5     |
| No log        | 5.0   | 105  | 0.3712          | 99.6435 | 99.5767 | 99.6435 | 99.6435   | 9.3571  |
| No log        | 6.0   | 126  | 0.4451          | 99.2695 | 98.9418 | 99.1883 | 99.3506   | 9.3095  |
| No log        | 7.0   | 147  | 0.3169          | 99.246  | 99.0232 | 99.246  | 99.4048   | 9.619   |
| No log        | 8.0   | 168  | 0.2942          | 100.0   | 100.0   | 100.0   | 100.0     | 9.4048  |
| No log        | 9.0   | 189  | 0.3105          | 100.0   | 100.0   | 100.0   | 100.0     | 9.1667  |
| No log        | 10.0  | 210  | 0.3035          | 100.0   | 100.0   | 100.0   | 100.0     | 9.2619  |
| No log        | 11.0  | 231  | 0.2983          | 100.0   | 100.0   | 100.0   | 100.0     | 10.5714 |
| No log        | 12.0  | 252  | 0.2497          | 100.0   | 100.0   | 100.0   | 100.0     | 9.4286  |
| No log        | 13.0  | 273  | 0.2911          | 100.0   | 100.0   | 100.0   | 100.0     | 9.1667  |
| No log        | 14.0  | 294  | 0.2619          | 100.0   | 100.0   | 100.0   | 100.0     | 9.2143  |
| No log        | 15.0  | 315  | 0.2510          | 100.0   | 100.0   | 100.0   | 100.0     | 9.2381  |
| No log        | 16.0  | 336  | 0.2647          | 100.0   | 100.0   | 100.0   | 100.0     | 9.9048  |
| No log        | 17.0  | 357  | 0.2438          | 100.0   | 100.0   | 100.0   | 100.0     | 9.2143  |
| No log        | 18.0  | 378  | 0.2324          | 100.0   | 100.0   | 100.0   | 100.0     | 9.3095  |
| No log        | 19.0  | 399  | 0.2296          | 100.0   | 100.0   | 100.0   | 100.0     | 9.3095  |
| No log        | 20.0  | 420  | 0.2329          | 100.0   | 100.0   | 100.0   | 100.0     | 9.2619  |
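
The ROUGE and Gen Len columns above are the standard seq2seq evaluation metrics. A minimal sketch of the kind of compute_metrics function that produces them, using load_metric from Datasets 2.2.1 (the base-model tokenizer is used here purely for illustration):

```python
import numpy as np
from datasets import load_metric   # the "rouge" metric also requires the rouge_score package
from transformers import BartTokenizer

tokenizer = BartTokenizer.from_pretrained("eugenesiow/bart-paraphrase")
rouge = load_metric("rouge")

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    # Replace -100 (positions ignored by the loss) with the pad token id before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    result = rouge.compute(predictions=decoded_preds, references=decoded_labels, use_stemmer=True)
    # Mid F-measure, scaled to 0-100 as in the table above.
    result = {k: v.mid.fmeasure * 100 for k, v in result.items()}
    # Gen Len: mean number of non-padding tokens in the generated sequences.
    result["gen_len"] = float(np.mean(
        [np.count_nonzero(p != tokenizer.pad_token_id) for p in preds]
    ))
    return {k: round(v, 4) for k, v in result.items()}
```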

Framework versions

  • Transformers 4.19.1
  • Pytorch 1.11.0+cu113
  • Datasets 2.2.1
  • Tokenizers 0.12.1