
EN, ES and NL to AMR parsing

This model is a fine-tuned version of facebook/mbart-large-cc25 for parsing English (EN), Spanish (ES), and Dutch (NL) text into Abstract Meaning Representation (AMR) graphs. It achieves the following results on the evaluation set (a hedged usage sketch follows the list):

  • Loss: 0.6363
  • Smatch Precision: 75.39
  • Smatch Recall: 77.67
  • Smatch Fscore: 76.51
  • Smatch Unparsable: 0
  • Percent Not Recoverable: 0.2129
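The sketch below shows one way to run the model. It assumes the checkpoint loads with the standard Transformers seq2seq classes and that the model emits a linearized AMR string; the exact linearization format and any de-linearization back into a Penman graph come from the original training pipeline and are not documented here, so treat this as an illustration rather than the reference usage.

```python
# Minimal usage sketch (assumption: the checkpoint works with the
# standard Transformers seq2seq API; the original project may wrap
# the tokenizer/model with AMR-specific pre/post-processing).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "BramVanroy/mbart-large-cc25-ft-amr30-en_es_nl"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# mBART expects a source-language code; EN, ES, and NL were seen in training.
tokenizer.src_lang = "en_XX"  # use "es_XX" or "nl_XX" for Spanish/Dutch
inputs = tokenizer("The boy wants to go.", return_tensors="pt")

generated = model.generate(**inputs, max_length=512, num_beams=5)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```

The printed string is the model's linearized AMR; turning it back into a proper graph requires the de-linearization step from the training pipeline.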

Model description

This is facebook/mbart-large-cc25 fine-tuned to generate linearized AMR graphs from English, Spanish, and Dutch input text; mBART's multilingual pretraining allows a single checkpoint to cover all three source languages.

Intended uses & limitations

The model is intended for parsing English, Spanish, and Dutch text into AMR. Other languages are not covered by the reported evaluation.

Training and evaluation data

The model was fine-tuned and evaluated on AMR 3.0 data (the `amr30` in the model name) with English, Spanish, and Dutch source sentences.

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto Transformers training arguments follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 25
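
For reference, here is how that list maps onto `Seq2SeqTrainingArguments`. The `output_dir` is a placeholder, and the actual training script may have set additional options not recorded above.

```python
# Mapping of the listed hyperparameters onto Seq2SeqTrainingArguments.
# output_dir is hypothetical; anything not in the list above keeps its
# library default.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mbart-large-cc25-ft-amr30-en_es_nl",  # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    gradient_accumulation_steps=8,  # 2 x 8 = total train batch size 16
    lr_scheduler_type="linear",
    num_train_epochs=25,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
)
```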

Training results

| Training Loss | Epoch | Step | Validation Loss | Smatch Precision | Smatch Recall | Smatch Fscore | Smatch Unparsable | Percent Not Recoverable |
|---|---|---|---|---|---|---|---|---|
| 0.3131 | 1.0 | 10431 | 1.5867 | 25.55 | 66.9 | 36.97 | 0 | 0.0194 |
| 0.0897 | 2.0 | 20862 | 1.0779 | 36.21 | 72.1 | 48.2 | 0 | 0.0968 |
| 0.1392 | 3.0 | 31294 | 0.7726 | 42.78 | 75.64 | 54.65 | 0 | 0.1936 |
| 0.085 | 4.0 | 41725 | 0.7040 | 46.38 | 76.85 | 57.85 | 0 | 0.0774 |
| 0.0008 | 5.0 | 52156 | 0.6874 | 47.47 | 76.12 | 58.47 | 0 | 0.1161 |
| 0.003 | 6.0 | 62588 | 0.6477 | 53.05 | 77.36 | 62.94 | 0 | 0.1742 |
| 0.0306 | 7.0 | 73019 | 0.6230 | 52.01 | 78.19 | 62.47 | 0 | 0.0968 |
| 0.0176 | 8.0 | 83451 | 0.6139 | 52.78 | 78.53 | 63.13 | 0 | 0.2129 |
| 0.0004 | 9.0 | 93882 | 0.6737 | 58.01 | 77.55 | 66.37 | 0 | 0.1355 |
| 0.0018 | 10.0 | 104313 | 0.6187 | 58.99 | 77.99 | 67.17 | 0 | 0.1161 |
| 0.0188 | 11.0 | 114745 | 0.6119 | 62.35 | 78.01 | 69.31 | 0 | 0.1161 |
| 0.0055 | 12.0 | 125176 | 0.6455 | 62.08 | 79.07 | 69.55 | 0 | 0.0774 |
| 0.0555 | 13.0 | 135607 | 0.6502 | 62.35 | 78.17 | 69.37 | 0 | 0.1355 |
| 0.0041 | 14.0 | 146039 | 0.6509 | 65.88 | 78.31 | 71.56 | 0 | 0.1742 |
| 0.0064 | 15.0 | 156470 | 0.6771 | 66.98 | 78.33 | 72.21 | 0 | 0.1355 |
| 0.0031 | 16.0 | 166902 | 0.6361 | 68.12 | 78.66 | 73.01 | 0 | 0.0774 |
| 0.0131 | 17.0 | 177333 | 0.6390 | 69.49 | 78.66 | 73.79 | 0 | 0.0968 |
| 0.0067 | 18.0 | 187764 | 0.6933 | 69.67 | 78.4 | 73.77 | 0 | 0.1549 |
| 0.0267 | 19.0 | 198196 | 0.6558 | 70.64 | 78.71 | 74.46 | 0 | 0.0774 |
| 0.0146 | 20.0 | 208627 | 0.6574 | 71.23 | 78.93 | 74.88 | 0 | 0.1161 |
| 0.0025 | 21.0 | 219058 | 0.6781 | 71.88 | 78.28 | 74.94 | 0 | 0.1936 |
| 0.0044 | 22.0 | 229490 | 0.6491 | 73.08 | 78.57 | 75.72 | 0 | 0.1161 |
| 0.0234 | 23.0 | 239921 | 0.6458 | 74.02 | 78.33 | 76.12 | 0 | 0.1549 |
| 0.0001 | 24.0 | 250353 | 0.6485 | 74.58 | 77.98 | 76.24 | 0 | 0.2129 |
| 0.0448 | 25.0 | 260775 | 0.6363 | 75.39 | 77.67 | 76.51 | 0 | 0.2129 |
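
The Smatch columns are percentages obtained by comparing predicted and gold AMR graphs. Below is a minimal sketch of scoring a single pair with the `smatch` package; the function names match the PyPI `smatch` module, but verify them against your installed version.

```python
# Hedged sketch: score one predicted AMR against a gold AMR with the
# `smatch` package (pip install smatch). Multiply by 100 to get the
# percentage-style numbers reported in the table above.
import smatch

pred = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-02 :ARG0 b))"
gold = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-01 :ARG0 b))"

best_match, test_total, gold_total = smatch.get_amr_match(pred, gold)
precision, recall, f_score = smatch.compute_f(best_match, test_total, gold_total)
print(f"P={precision:.4f} R={recall:.4f} F={f_score:.4f}")
```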

Framework versions

  • Transformers 4.34.0.dev0
  • Pytorch 2.0.1+cu117
  • Datasets 2.14.2
  • Tokenizers 0.13.3