---
license: mit
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: detect-femicide-news-xlmr-nl-fft-freeze2
  results: []
---

# detect-femicide-news-xlmr-nl-fft-freeze2

This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on an unknown dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

- Loss: 0.4112
- Accuracy: 0.8571
- Precision Neg: 0.85
- Precision Pos: 0.875
- Recall Neg: 0.9444
- Recall Pos: 0.7
- F1 Score Neg: 0.8947
- F1 Score Pos: 0.7778

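For quick experimentation, the model can be loaded with the `transformers` text-classification pipeline. This is a minimal sketch only: the repo id `gossminn/detect-femicide-news-xlmr-nl-fft-freeze2` is inferred from the model name and uploader, the example sentence is invented, and the label names (`LABEL_0`/`LABEL_1` by default) depend on this repository's config.

```python
from transformers import pipeline

# Repo id is an assumption inferred from the model name; adjust if needed.
classifier = pipeline(
    "text-classification",
    model="gossminn/detect-femicide-news-xlmr-nl-fft-freeze2",
)

# Hypothetical Dutch news snippet ("Woman killed by her ex-partner.").
result = classifier("Vrouw om het leven gebracht door haar ex-partner.")
print(result)  # e.g. [{'label': 'LABEL_1', 'score': ...}]; label semantics depend on the config
```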
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 24
- eval_batch_size: 8
- seed: 1996
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100

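The list above maps directly onto `transformers.TrainingArguments`; a hedged sketch follows. The `output_dir` is a placeholder, `evaluation_strategy="epoch"` is inferred from the per-epoch results table below, and any layer freezing implied by the `freeze2` suffix happened outside these arguments.

```python
from transformers import TrainingArguments

# A sketch only: output_dir is hypothetical, and evaluation_strategy is
# inferred from the per-epoch validation table in this card.
training_args = TrainingArguments(
    output_dir="detect-femicide-news-xlmr-nl-fft-freeze2",
    learning_rate=1e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=8,
    seed=1996,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,      # Adam betas as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",
)
```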
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision Neg | Precision Pos | Recall Neg | Recall Pos | F1 Score Neg | F1 Score Pos |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:-------------:|:----------:|:----------:|:------------:|:------------:|
| 1.2929 | 1.0 | 23 | 1.0782 | 0.75 | 0.7391 | 0.8 | 0.9444 | 0.4 | 0.8293 | 0.5333 |
| 1.1345 | 2.0 | 46 | 0.8942 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.9799 | 3.0 | 69 | 0.7418 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.7871 | 4.0 | 92 | 0.5905 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.6852 | 5.0 | 115 | 0.4981 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.5988 | 6.0 | 138 | 0.4501 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.5976 | 7.0 | 161 | 0.4441 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.5877 | 8.0 | 184 | 0.4501 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.5621 | 9.0 | 207 | 0.4503 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.5658 | 10.0 | 230 | 0.4514 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.5648 | 11.0 | 253 | 0.4505 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.559 | 12.0 | 276 | 0.4499 | 0.8214 | 0.8095 | 0.8571 | 0.9444 | 0.6 | 0.8718 | 0.7059 |
| 0.5668 | 13.0 | 299 | 0.4449 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5542 | 14.0 | 322 | 0.4448 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5496 | 15.0 | 345 | 0.4406 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.555 | 16.0 | 368 | 0.4392 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5479 | 17.0 | 391 | 0.4389 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5487 | 18.0 | 414 | 0.4345 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5529 | 19.0 | 437 | 0.4312 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5439 | 20.0 | 460 | 0.4314 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5444 | 21.0 | 483 | 0.4317 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5322 | 22.0 | 506 | 0.4299 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5314 | 23.0 | 529 | 0.4265 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5286 | 24.0 | 552 | 0.4245 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5395 | 25.0 | 575 | 0.4256 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5419 | 26.0 | 598 | 0.4253 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.55 | 27.0 | 621 | 0.4264 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5525 | 28.0 | 644 | 0.4261 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5465 | 29.0 | 667 | 0.4251 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5304 | 30.0 | 690 | 0.4277 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.541 | 31.0 | 713 | 0.4268 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5344 | 32.0 | 736 | 0.4262 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5316 | 33.0 | 759 | 0.4219 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5415 | 34.0 | 782 | 0.4244 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5419 | 35.0 | 805 | 0.4221 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5284 | 36.0 | 828 | 0.4206 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5472 | 37.0 | 851 | 0.4193 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5172 | 38.0 | 874 | 0.4185 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.522 | 39.0 | 897 | 0.4168 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5261 | 40.0 | 920 | 0.4172 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5246 | 41.0 | 943 | 0.4167 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5249 | 42.0 | 966 | 0.4164 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5229 | 43.0 | 989 | 0.4155 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5144 | 44.0 | 1012 | 0.4155 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.527 | 45.0 | 1035 | 0.4181 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.525 | 46.0 | 1058 | 0.4184 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5258 | 47.0 | 1081 | 0.4167 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5297 | 48.0 | 1104 | 0.4156 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5299 | 49.0 | 1127 | 0.4167 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5273 | 50.0 | 1150 | 0.4167 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5296 | 51.0 | 1173 | 0.4170 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5094 | 52.0 | 1196 | 0.4168 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5171 | 53.0 | 1219 | 0.4167 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5179 | 54.0 | 1242 | 0.4161 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5144 | 55.0 | 1265 | 0.4158 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5452 | 56.0 | 1288 | 0.4141 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5193 | 57.0 | 1311 | 0.4155 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5288 | 58.0 | 1334 | 0.4146 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5343 | 59.0 | 1357 | 0.4139 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5224 | 60.0 | 1380 | 0.4132 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5177 | 61.0 | 1403 | 0.4137 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5299 | 62.0 | 1426 | 0.4146 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5293 | 63.0 | 1449 | 0.4139 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5183 | 64.0 | 1472 | 0.4131 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5124 | 65.0 | 1495 | 0.4132 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5102 | 66.0 | 1518 | 0.4131 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.523 | 67.0 | 1541 | 0.4137 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5142 | 68.0 | 1564 | 0.4135 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5122 | 69.0 | 1587 | 0.4131 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5198 | 70.0 | 1610 | 0.4132 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5195 | 71.0 | 1633 | 0.4133 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5228 | 72.0 | 1656 | 0.4131 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5065 | 73.0 | 1679 | 0.4129 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5174 | 74.0 | 1702 | 0.4118 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5189 | 75.0 | 1725 | 0.4119 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5243 | 76.0 | 1748 | 0.4119 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5102 | 77.0 | 1771 | 0.4128 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5318 | 78.0 | 1794 | 0.4130 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5214 | 79.0 | 1817 | 0.4128 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5163 | 80.0 | 1840 | 0.4133 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5235 | 81.0 | 1863 | 0.4128 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5393 | 82.0 | 1886 | 0.4131 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5284 | 83.0 | 1909 | 0.4128 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5242 | 84.0 | 1932 | 0.4122 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.505 | 85.0 | 1955 | 0.4120 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5211 | 86.0 | 1978 | 0.4120 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5367 | 87.0 | 2001 | 0.4122 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5068 | 88.0 | 2024 | 0.4122 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.51 | 89.0 | 2047 | 0.4119 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5302 | 90.0 | 2070 | 0.4118 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5208 | 91.0 | 2093 | 0.4119 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5199 | 92.0 | 2116 | 0.4119 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5142 | 93.0 | 2139 | 0.4116 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5225 | 94.0 | 2162 | 0.4116 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5123 | 95.0 | 2185 | 0.4116 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5123 | 96.0 | 2208 | 0.4113 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.4929 | 97.0 | 2231 | 0.4114 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5067 | 98.0 | 2254 | 0.4112 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5254 | 99.0 | 2277 | 0.4112 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |
| 0.5131 | 100.0 | 2300 | 0.4112 | 0.8571 | 0.85 | 0.875 | 0.9444 | 0.7 | 0.8947 | 0.7778 |

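The per-class columns in the table are consistent with a `compute_metrics` callback along the following lines. This is a reconstruction, not the original code: the mapping Neg = class 0, Pos = class 1 is an assumption.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # Assumes label 0 = Neg and label 1 = Pos, which the card does not confirm.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, labels=[0, 1], zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision_neg": precision[0],
        "precision_pos": precision[1],
        "recall_neg": recall[0],
        "recall_pos": recall[1],
        "f1_score_neg": f1[0],
        "f1_score_pos": f1[1],
    }
```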
## Framework versions

- Transformers 4.16.2
- Pytorch 1.10.2+cu113
- Datasets 1.18.3
- Tokenizers 0.11.0