
bart-base-wsd-finetuned-cve-reason-2

This model is a fine-tuned version of mgkamalesh7/bart-base-wsd-finetuned-cve-reason on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4159
  • Rouge1: 91.1806
  • Rouge2: 87.5256
  • Rougel: 91.1424
  • Rougelsum: 91.1124
  • Gen Len: 8.6492
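
A minimal inference sketch (not part of the original card): it assumes the checkpoint is published under the repo id mgkamalesh7/bart-base-wsd-finetuned-cve-reason-2 and that inputs are short CVE-related texts; the example input and generation settings are placeholders.

```python
# Minimal inference sketch for this checkpoint. The repo id is inferred from the
# card title, and the example input is a placeholder, not taken from the training data.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "mgkamalesh7/bart-base-wsd-finetuned-cve-reason-2"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "A buffer overflow in the XML parser allows remote attackers to execute arbitrary code."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
output_ids = model.generate(**inputs, max_new_tokens=32)  # Gen Len ~8.6 suggests short outputs
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```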

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 200
  • mixed_precision_training: Native AMP
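
As a reproducibility aid, here is a sketch of how these settings map onto Seq2SeqTrainingArguments. The base model id is taken from the card; the output directory name, evaluation strategy, dataset objects, and metric function are assumptions and placeholders, not taken from the original training script.

```python
# Configuration sketch matching the hyperparameters listed above.
# Adam betas/epsilon above match the TrainingArguments defaults, so they are not set explicitly.
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

base_id = "mgkamalesh7/bart-base-wsd-finetuned-cve-reason"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForSeq2SeqLM.from_pretrained(base_id)

args = Seq2SeqTrainingArguments(
    output_dir="bart-base-wsd-finetuned-cve-reason-2",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=200,
    fp16=True,                   # "Native AMP" mixed precision
    eval_strategy="epoch",       # assumption: per-epoch evaluation, as the results table suggests
    predict_with_generate=True,  # compute ROUGE on generated text
)

# trainer = Seq2SeqTrainer(
#     model=model,
#     args=args,
#     train_dataset=train_ds,            # placeholder dataset objects
#     eval_dataset=eval_ds,
#     data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
#     compute_metrics=compute_rouge,     # placeholder metric function
# )
# trainer.train()
```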

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log        | 1.0   | 56   | 0.4140          | 89.658  | 85.4214 | 89.5042 | 89.4542   | 8.7377  |
| No log        | 2.0   | 112  | 0.3450          | 89.6585 | 86.3159 | 89.5529 | 89.537    | 8.7344  |
| No log        | 3.0   | 168  | 0.4252          | 89.2531 | 85.5599 | 89.1241 | 89.1115   | 8.7311  |
| No log        | 4.0   | 224  | 0.4278          | 89.4207 | 85.3854 | 89.1996 | 89.2622   | 8.7443  |
| No log        | 5.0   | 280  | 0.4023          | 90.0865 | 86.5253 | 89.8973 | 89.9373   | 8.7475  |
| No log        | 6.0   | 336  | 0.3831          | 89.6788 | 86.8093 | 89.5032 | 89.5405   | 8.6557  |
| No log        | 7.0   | 392  | 0.4100          | 90.7802 | 87.4674 | 90.6112 | 90.6915   | 8.6721  |
| No log        | 8.0   | 448  | 0.4425          | 90.4749 | 87.3615 | 90.4286 | 90.3849   | 8.6459  |
| 0.0544        | 9.0   | 504  | 0.4098          | 90.3948 | 86.9721 | 90.4295 | 90.3163   | 8.5541  |
| 0.0544        | 10.0  | 560  | 0.4289          | 89.8096 | 85.6718 | 89.6744 | 89.6666   | 8.6787  |
| 0.0544        | 11.0  | 616  | 0.4338          | 90.9849 | 87.3715 | 90.9871 | 90.9153   | 8.6787  |
| 0.0544        | 12.0  | 672  | 0.4159          | 91.1806 | 87.5256 | 91.1424 | 91.1124   | 8.6492  |

Framework versions

  • Transformers 4.42.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1
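
For reproducibility, a small environment check against the versions listed above (an illustrative sketch, not part of the original card):

```python
# Print installed library versions alongside the ones reported on this card.
import datasets
import tokenizers
import torch
import transformers

reported = {
    "transformers": "4.42.3",
    "torch": "2.3.0+cu121",
    "datasets": "2.20.0",
    "tokenizers": "0.19.1",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name in reported:
    print(f"{name}: installed {installed[name]}, card reports {reported[name]}")
```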