scenario-TCR-XLMV-XCOPA-2_data-xcopa_all

This model is a fine-tuned version of facebook/xlm-v-base, apparently on the xcopa_all data (per the model name; the card's dataset field is unset). It achieves the following results on the evaluation set:

  • Loss: 0.6931
  • Accuracy: 0.5
  • F1: 0.4671
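
Since the card does not state the pipeline type, the sketch below assumes the checkpoint carries a multiple-choice head (XCOPA is a two-choice, COPA-style task); the model id comes from this card's title, while the example premise and choices are illustrative only, not taken from the dataset.

```python
# Minimal loading sketch. The head type (multiple choice) is an assumption;
# swap in the appropriate Auto* class if the checkpoint was saved differently.
import torch
from transformers import AutoTokenizer, AutoModelForMultipleChoice

model_id = "haryoaw/scenario-TCR-XLMV-XCOPA-2_data-xcopa_all"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMultipleChoice.from_pretrained(model_id)
model.eval()

# Illustrative COPA-style instance (not from the evaluation data).
premise = "The man turned on the faucet."
choices = ["The toilet filled with water.", "Water flowed from the spout."]

# Encode the premise against each choice, then add a batch dimension so the
# inputs have shape (batch_size, num_choices, seq_len).
enc = tokenizer([premise] * len(choices), choices, padding=True, return_tensors="pt")
inputs = {k: v.unsqueeze(0) for k, v in enc.items()}

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_choices)
print("predicted choice:", logits.argmax(dim=-1).item())
```

The unsqueeze step is needed because multiple-choice models in transformers expect inputs shaped (batch_size, num_choices, seq_len) rather than a flat batch of pairs.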

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged TrainingArguments sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 34
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 500
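
The same values expressed as a transformers TrainingArguments object, as a sketch for reference; the output directory is a placeholder and only the listed values come from this card, since the actual training script is not included.

```python
# Hedged reconstruction of the hyperparameters above as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./scenario-TCR-XLMV-XCOPA-2_data-xcopa_all",  # placeholder path
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=34,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=500,
)
```

The results table below shows evaluation every 5 optimization steps, which would correspond to evaluation_strategy="steps" with eval_steps=5, but those settings are not recorded on the card.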

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log        | 0.38  | 5    | 0.6932          | 0.4858   | 0.4767 |
| No log        | 0.77  | 10   | 0.6931          | 0.515    | 0.5134 |
| No log        | 1.15  | 15   | 0.6931          | 0.5158   | 0.5038 |
| No log        | 1.54  | 20   | 0.6931          | 0.5108   | 0.5021 |
| No log        | 1.92  | 25   | 0.6931          | 0.5217   | 0.5035 |
| No log        | 2.31  | 30   | 0.6931          | 0.525    | 0.5069 |
| No log        | 2.69  | 35   | 0.6931          | 0.5283   | 0.5070 |
| No log        | 3.08  | 40   | 0.6931          | 0.5292   | 0.5125 |
| No log        | 3.46  | 45   | 0.6931          | 0.5333   | 0.5122 |
| No log        | 3.85  | 50   | 0.6930          | 0.5125   | 0.4970 |
| No log        | 4.23  | 55   | 0.6930          | 0.5342   | 0.5251 |
| No log        | 4.62  | 60   | 0.6931          | 0.5417   | 0.5217 |
| No log        | 5.0   | 65   | 0.6931          | 0.5592   | 0.5482 |
| No log        | 5.38  | 70   | 0.6931          | 0.5667   | 0.5517 |
| No log        | 5.77  | 75   | 0.6931          | 0.5458   | 0.5362 |
| No log        | 6.15  | 80   | 0.6931          | 0.535    | 0.5311 |
| No log        | 6.54  | 85   | 0.6930          | 0.5433   | 0.5276 |
| No log        | 6.92  | 90   | 0.6931          | 0.5025   | 0.4731 |
| No log        | 7.31  | 95   | 0.6931          | 0.505    | 0.4715 |
| No log        | 7.69  | 100  | 0.6931          | 0.5017   | 0.4514 |
| No log        | 8.08  | 105  | 0.6931          | 0.5042   | 0.4831 |
| No log        | 8.46  | 110  | 0.6931          | 0.5058   | 0.4785 |
| No log        | 8.85  | 115  | 0.6931          | 0.5158   | 0.4872 |
| No log        | 9.23  | 120  | 0.6931          | 0.5158   | 0.4890 |
| No log        | 9.62  | 125  | 0.6931          | 0.5075   | 0.4829 |
| No log        | 10.0  | 130  | 0.6931          | 0.505    | 0.4780 |
| No log        | 10.38 | 135  | 0.6931          | 0.5      | 0.4709 |
| No log        | 10.77 | 140  | 0.6931          | 0.485    | 0.4579 |
| No log        | 11.15 | 145  | 0.6931          | 0.4858   | 0.4592 |
| No log        | 11.54 | 150  | 0.6931          | 0.485    | 0.4569 |
| No log        | 11.92 | 155  | 0.6931          | 0.4917   | 0.4611 |
| No log        | 12.31 | 160  | 0.6931          | 0.4908   | 0.4664 |
| No log        | 12.69 | 165  | 0.6931          | 0.4858   | 0.4602 |
| No log        | 13.08 | 170  | 0.6931          | 0.4983   | 0.4756 |
| No log        | 13.46 | 175  | 0.6931          | 0.4992   | 0.4788 |
| No log        | 13.85 | 180  | 0.6931          | 0.4942   | 0.4717 |
| No log        | 14.23 | 185  | 0.6931          | 0.4958   | 0.4735 |
| No log        | 14.62 | 190  | 0.6931          | 0.5017   | 0.48   |
| No log        | 15.0  | 195  | 0.6931          | 0.4942   | 0.4633 |
| No log        | 15.38 | 200  | 0.6931          | 0.4942   | 0.4527 |
| No log        | 15.77 | 205  | 0.6931          | 0.4925   | 0.4509 |
| No log        | 16.15 | 210  | 0.6931          | 0.495    | 0.4570 |
| No log        | 16.54 | 215  | 0.6931          | 0.4933   | 0.4581 |
| No log        | 16.92 | 220  | 0.6931          | 0.5      | 0.4671 |

Framework versions

  • Transformers 4.33.3
  • Pytorch 2.1.1+cu121
  • Datasets 2.14.5
  • Tokenizers 0.13.3
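
A small sanity check that the installed libraries match the versions listed above; the package names and expected version strings are taken directly from this list, nothing else is assumed.

```python
# Compare installed library versions against the ones recorded on this card.
import datasets, tokenizers, torch, transformers

expected = {
    "transformers": (transformers.__version__, "4.33.3"),
    "torch": (torch.__version__, "2.1.1+cu121"),
    "datasets": (datasets.__version__, "2.14.5"),
    "tokenizers": (tokenizers.__version__, "0.13.3"),
}
for name, (got, want) in expected.items():
    status = "OK" if got == want else f"differs (card lists {want})"
    print(f"{name}: {got} -> {status}")
```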