
nl+no_processing

This model is a fine-tuned version of facebook/mbart-large-cc25 on an unspecified dataset. It achieves the following results on the evaluation set (see the F-score sanity check after the list):

  • Loss: 0.6038
  • Smatch Precision: 73.7
  • Smatch Recall: 76.48
  • Smatch Fscore: 75.06
  • Smatch Unparsable: 0
  • Percent Not Recoverable: 0.2323
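
The Smatch F-score is the harmonic mean of the Smatch precision and recall. A quick sanity check (a minimal sketch, plain arithmetic only) reproduces the reported value from the two reported components:

```python
# F-score is the harmonic mean of precision and recall.
precision, recall = 73.7, 76.48
f_score = 2 * precision * recall / (precision + recall)
print(round(f_score, 2))  # 75.06, matching the reported Smatch Fscore
```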

Model description

More information needed

Intended uses & limitations

More information needed
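
The card does not yet document usage. As a starting point, here is a minimal sketch using the standard transformers seq2seq API. The repo id is taken from this card; treating the input as a Dutch sentence and the output as a linearized AMR graph is an assumption based on the model name (mbart-large-cc25-ft-amr30-nl), not something the card states, and the model's own pre/post-processing may differ:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "BramVanroy/mbart-large-cc25-ft-amr30-nl"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Assumption: Dutch source text, so we set the mBART source language code.
tokenizer.src_lang = "nl_XX"
inputs = tokenizer("De jongen wil gaan.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)

# Assumption: the decoded string is a linearized AMR graph.
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```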

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (mirrored in the configuration sketch after the list):

  • learning_rate: 5e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 25
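
The auto-generated format of this card indicates the transformers Trainer was used; below is a minimal sketch mapping the listed values onto Seq2SeqTrainingArguments (the parameter names are the standard Trainer ones; output_dir is a placeholder):

```python
from transformers import Seq2SeqTrainingArguments

# Effective batch size: train_batch_size * gradient_accumulation_steps = 2 * 8 = 16.
training_args = Seq2SeqTrainingArguments(
    output_dir="mbart-large-cc25-ft-amr30-nl",  # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=25,
)
```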

Training results

| Training Loss | Epoch | Step | Validation Loss | Smatch Precision | Smatch Recall | Smatch Fscore | Smatch Unparsable | Percent Not Recoverable |
|---------------|-------|------|-----------------|------------------|---------------|---------------|-------------------|-------------------------|
| 0.8025 | 1.0 | 3477 | 1.3793 | 18.51 | 65.71 | 28.88 | 0 | 0.0 |
| 0.13 | 2.0 | 6954 | 0.9377 | 27.0 | 71.3 | 39.16 | 0 | 0.1161 |
| 0.0953 | 3.0 | 10431 | 0.7509 | 34.09 | 72.74 | 46.42 | 0 | 0.1161 |
| 0.1386 | 4.0 | 13908 | 0.8524 | 33.38 | 73.32 | 45.87 | 2 | 0.0 |
| 0.0974 | 5.0 | 17385 | 0.6957 | 41.69 | 73.92 | 53.31 | 0 | 0.0 |
| 0.0705 | 6.0 | 20862 | 0.6145 | 47.98 | 75.12 | 58.55 | 0 | 0.0 |
| 0.2265 | 7.0 | 24339 | 0.6439 | 47.06 | 75.53 | 57.99 | 0 | 0.0 |
| 0.0506 | 8.0 | 27817 | 0.5974 | 53.0 | 76.95 | 62.77 | 0 | 0.0 |
| 0.064 | 9.0 | 31294 | 0.6387 | 51.83 | 77.47 | 62.11 | 0 | 0.0 |
| 0.0112 | 10.0 | 34771 | 0.6066 | 54.82 | 76.98 | 64.03 | 0 | 0.0 |
| 0.047 | 11.0 | 38248 | 0.5970 | 60.36 | 77.04 | 67.69 | 0 | 0.0 |
| 0.0134 | 12.0 | 41725 | 0.5675 | 61.72 | 77.15 | 68.58 | 0 | 0.0 |
| 0.0656 | 13.0 | 45202 | 0.6210 | 62.8 | 76.92 | 69.15 | 0 | 0.0581 |
| 0.015 | 14.0 | 48679 | 0.6257 | 62.8 | 77.32 | 69.31 | 0 | 0.0 |
| 0.0134 | 15.0 | 52156 | 0.5635 | 66.7 | 77.34 | 71.63 | 0 | 0.1161 |
| 0.0265 | 16.0 | 55634 | 0.5839 | 67.61 | 76.76 | 71.89 | 0 | 0.0581 |
| 0.0219 | 17.0 | 59111 | 0.5894 | 68.66 | 77.43 | 72.78 | 0 | 0.1161 |
| 0.0008 | 18.0 | 62588 | 0.5981 | 68.44 | 77.57 | 72.72 | 0 | 0.0 |
| 0.0157 | 19.0 | 66065 | 0.6184 | 69.88 | 77.42 | 73.46 | 0 | 0.0581 |
| 0.0334 | 20.0 | 69542 | 0.6026 | 70.76 | 77.37 | 73.92 | 0 | 0.2323 |
| 0.0619 | 21.0 | 73019 | 0.6021 | 72.03 | 77.0 | 74.44 | 0 | 0.1742 |
| 0.0075 | 22.0 | 76496 | 0.6166 | 72.33 | 76.74 | 74.47 | 0 | 0.0581 |
| 0.0164 | 23.0 | 79973 | 0.6100 | 72.75 | 77.03 | 74.83 | 0 | 0.2323 |
| 0.0011 | 24.0 | 83451 | 0.6037 | 73.7 | 76.51 | 75.08 | 0 | 0.2323 |
| 0.0865 | 25.0 | 86925 | 0.6038 | 73.7 | 76.48 | 75.06 | 0 | 0.2323 |
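
The Smatch columns above come from aligning predicted and gold AMR graphs. Below is a minimal sketch of how such scores are computed with the smatch package (assuming `pip install smatch`; `get_amr_match` and `compute_f` are assumed to be its module-level helpers, and the two graphs here are toy examples, not output of this model):

```python
import smatch

# Toy predicted and gold AMR graphs in PENMAN notation.
pred = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-01 :ARG0 b))"
gold = "(w / want-01 :ARG0 (b / boy) :ARG1 (l / leave-01 :ARG0 b))"

# get_amr_match returns matched/predicted/gold triple counts;
# compute_f turns those counts into precision, recall and F-score.
match_num, test_num, gold_num = smatch.get_amr_match(pred, gold)
precision, recall, f_score = smatch.compute_f(match_num, test_num, gold_num)
print(precision, recall, f_score)
```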

Framework versions

  • Transformers 4.34.0.dev0
  • Pytorch 2.0.1+cu117
  • Datasets 2.14.2
  • Tokenizers 0.13.3