# xlm-roberta-base-finetuning-wrime-random4000-epoch6-test01
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.3173
- Jaccard (sample): [0.5912, 0.3438, 0.6059, 0.0360, 0.0, 0.1440, 0.1871, 0.0]
- Jaccard (macro): 0.2385
- Macro-precision: 0.4548
- Macro-recall: 0.2914
- Macro F1: 0.3307
- Micro F1: 0.5504
- Accuracy: 0.3125
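
The evaluation script is not included in this card. Below is a minimal sketch of how these multi-label metrics could be computed with scikit-learn, assuming sigmoid outputs over 8 binary emotion labels and a 0.5 decision threshold (both are inferred, not documented). Note that despite its name, "Jaccard (sample)" above is a per-label vector, so it corresponds to `average=None`:

```python
import numpy as np
from sklearn.metrics import (
    accuracy_score, f1_score, jaccard_score, precision_score, recall_score,
)

def compute_metrics(logits: np.ndarray, labels: np.ndarray) -> dict:
    """Multi-label metrics for 8 binary emotion labels (assumed setup)."""
    probs = 1.0 / (1.0 + np.exp(-logits))   # sigmoid over raw logits
    preds = (probs >= 0.5).astype(int)      # hypothetical 0.5 threshold per label
    return {
        # one Jaccard score per emotion label, as reported above
        "jaccard_sample": jaccard_score(labels, preds, average=None, zero_division=0).tolist(),
        "jaccard_macro": jaccard_score(labels, preds, average="macro", zero_division=0),
        "macro_precision": precision_score(labels, preds, average="macro", zero_division=0),
        "macro_recall": recall_score(labels, preds, average="macro", zero_division=0),
        "macro_f1": f1_score(labels, preds, average="macro", zero_division=0),
        "micro_f1": f1_score(labels, preds, average="micro", zero_division=0),
        # subset accuracy: all 8 labels must match exactly
        "accuracy": accuracy_score(labels, preds),
    }
```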
## Model description
More information needed
## Intended uses & limitations
More information needed
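
Usage is not documented in this card. The following is a minimal inference sketch, assuming the checkpoint is an 8-label multi-label emotion classifier (the per-label Jaccard vector above has 8 entries, and the model name suggests the Japanese WRIME emotion dataset) with a hypothetical 0.5 sigmoid threshold; the repo id below omits any namespace:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "xlm-roberta-base-finetuning-wrime-random4000-epoch6-test01"  # hub id assumed
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("今日はとても楽しかった！", return_tensors="pt")  # "Today was a lot of fun!"
with torch.no_grad():
    logits = model(**inputs).logits      # shape (1, 8): one logit per emotion label

probs = torch.sigmoid(logits)            # independent per-label probabilities
preds = (probs >= 0.5).int()             # hypothetical 0.5 decision threshold
print(probs.tolist(), preds.tolist())
```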
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 6
- mixed_precision_training: Native AMP
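
The training script itself is not included; a minimal sketch of how these hyperparameters map onto Hugging Face `TrainingArguments` (the output directory is hypothetical, and the Adam settings listed above match the `Trainer` default AdamW optimizer):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlm-roberta-base-finetuning-wrime-random4000-epoch6-test01",  # assumed path
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,             # betas=(0.9, 0.999), epsilon=1e-08 as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=6,
    fp16=True,                  # "Native AMP" mixed-precision training
)
```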
### Training results
| Training Loss | Epoch | Step | Validation Loss | Jaccard (sample) | Jaccard (macro) | Macro-precision | Macro-recall | Macro F1 | Micro F1 | Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|
| No log | 1.0 | 125 | 0.4209 | [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0] | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0775 |
| No log | 2.0 | 250 | 0.3810 | [0.0141, 0.0382, 0.0, 0.0, 0.0, 0.0047, 0.0, 0.0] | 0.0071 | 0.2821 | 0.0074 | 0.0138 | 0.0194 | 0.0792 |
| No log | 3.0 | 375 | 0.3545 | [0.5387, 0.2715, 0.3476, 0.0, 0.0, 0.0047, 0.0163, 0.0] | 0.1474 | 0.4623 | 0.1790 | 0.2106 | 0.4353 | 0.2475 |
| 0.4439 | 4.0 | 500 | 0.3286 | [0.5520, 0.3493, 0.5914, 0.0, 0.0, 0.1559, 0.1310, 0.0] | 0.2225 | 0.3752 | 0.2767 | 0.3092 | 0.5339 | 0.2983 |
| 0.4439 | 5.0 | 625 | 0.3213 | [0.5993, 0.3089, 0.5961, 0.1038, 0.0, 0.0969, 0.1286, 0.0] | 0.2292 | 0.4834 | 0.2749 | 0.3201 | 0.5519 | 0.3233 |
| 0.4439 | 6.0 | 750 | 0.3173 | [0.5912, 0.3438, 0.6059, 0.0360, 0.0, 0.1440, 0.1871, 0.0] | 0.2385 | 0.4548 | 0.2914 | 0.3307 | 0.5504 | 0.3125 |
### Framework versions
- Transformers 4.35.2
- PyTorch 1.13.0
- Datasets 2.15.0
- Tokenizers 0.15.0