# detect-femicide-news-bert-nl-None

This model is a fine-tuned version of [GroNLP/bert-base-dutch-cased](https://huggingface.co/GroNLP/bert-base-dutch-cased) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.8162
- Accuracy: 0.75
- Precision Neg: 0.8235
- Precision Pos: 0.6364
- Recall Neg: 0.7778
- Recall Pos: 0.7
- F1 Score Neg: 0.8000
- F1 Score Pos: 0.6667
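
Per-class metrics like these can be computed with scikit-learn's `precision_recall_fscore_support`; the sketch below uses placeholder labels and assumes the convention 0 = negative, 1 = positive:

```python
# Minimal sketch: per-class precision/recall/F1 as reported above.
# y_true / y_pred are placeholder arrays, not the model's actual eval data.
from sklearn.metrics import precision_recall_fscore_support

y_true = [0, 0, 0, 1, 1, 1]  # 0 = negative class, 1 = positive class
y_pred = [0, 0, 1, 1, 1, 0]

prec, rec, f1, _ = precision_recall_fscore_support(y_true, y_pred, labels=[0, 1])
print(f"Precision Neg: {prec[0]:.4f}  Precision Pos: {prec[1]:.4f}")
print(f"Recall Neg:    {rec[0]:.4f}  Recall Pos:    {rec[1]:.4f}")
print(f"F1 Score Neg:  {f1[0]:.4f}  F1 Score Pos:  {f1[1]:.4f}")
```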

## Model description

More information needed

## Intended uses & limitations

More information needed
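
Although the card does not document usage, the checkpoint is a standard BERT sequence-classification model, so inference with the Transformers `pipeline` API should look roughly like the sketch below. The repo id is an assumption; substitute the actual Hub path or a local checkpoint directory.

```python
# Hypothetical inference sketch; the model path below is an assumption.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="path/to/detect-femicide-news-bert-nl-None",  # replace with the real Hub id
)

# Example Dutch headline (illustrative input, not from the training data).
result = classifier("Vrouw om het leven gebracht in haar woning.")
print(result)  # e.g. [{'label': 'LABEL_1', 'score': ...}] -- labels illustrative
```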

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 24
- eval_batch_size: 8
- seed: 1996
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
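
These settings map directly onto the `transformers.TrainingArguments` API of the era (4.16.x). A sketch of the equivalent configuration follows; the output directory and per-epoch evaluation strategy are assumptions, and dataset/model setup is omitted.

```python
# Sketch: TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detect-femicide-news-bert-nl",  # hypothetical output path
    learning_rate=1e-05,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=8,
    seed=1996,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # assumption: the table below logs metrics per epoch
)
```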

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision Neg | Precision Pos | Recall Neg | Recall Pos | F1 Score Neg | F1 Score Pos |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:-------------:|:----------:|:----------:|:------------:|:------------:|
| 0.6636 | 1.0 | 23 | 0.6474 | 0.6429 | 0.8333 | 0.5 | 0.5556 | 0.8 | 0.6667 | 0.6154 |
| 0.572 | 2.0 | 46 | 0.5653 | 0.6071 | 0.6842 | 0.4444 | 0.7222 | 0.4 | 0.7027 | 0.4211 |
| 0.502 | 3.0 | 69 | 0.5601 | 0.6786 | 0.8462 | 0.5333 | 0.6111 | 0.8 | 0.7097 | 0.64 |
| 0.4576 | 4.0 | 92 | 0.5199 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.3803 | 5.0 | 115 | 0.5219 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.3466 | 6.0 | 138 | 0.5125 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.3325 | 7.0 | 161 | 0.4930 | 0.75 | 0.7895 | 0.6667 | 0.8333 | 0.6 | 0.8108 | 0.6316 |
| 0.3022 | 8.0 | 184 | 0.5144 | 0.75 | 0.7895 | 0.6667 | 0.8333 | 0.6 | 0.8108 | 0.6316 |
| 0.2854 | 9.0 | 207 | 0.5588 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2797 | 10.0 | 230 | 0.5700 | 0.6786 | 0.7647 | 0.5455 | 0.7222 | 0.6 | 0.7429 | 0.5714 |
| 0.2645 | 11.0 | 253 | 0.5806 | 0.6786 | 0.7647 | 0.5455 | 0.7222 | 0.6 | 0.7429 | 0.5714 |
| 0.2411 | 12.0 | 276 | 0.5642 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2554 | 13.0 | 299 | 0.6364 | 0.6786 | 0.8 | 0.5385 | 0.6667 | 0.7 | 0.7273 | 0.6087 |
| 0.2682 | 14.0 | 322 | 0.5656 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2429 | 15.0 | 345 | 0.6249 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2368 | 16.0 | 368 | 0.5914 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2398 | 17.0 | 391 | 0.7456 | 0.6786 | 0.8462 | 0.5333 | 0.6111 | 0.8 | 0.7097 | 0.64 |
| 0.251 | 18.0 | 414 | 0.5602 | 0.75 | 0.7895 | 0.6667 | 0.8333 | 0.6 | 0.8108 | 0.6316 |
| 0.2403 | 19.0 | 437 | 0.5803 | 0.75 | 0.7895 | 0.6667 | 0.8333 | 0.6 | 0.8108 | 0.6316 |
| 0.2237 | 20.0 | 460 | 0.8165 | 0.6786 | 0.9091 | 0.5294 | 0.5556 | 0.9 | 0.6897 | 0.6667 |
| 0.2481 | 21.0 | 483 | 0.6195 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2357 | 22.0 | 506 | 0.7081 | 0.6429 | 0.75 | 0.5 | 0.6667 | 0.6 | 0.7059 | 0.5455 |
| 0.2227 | 23.0 | 529 | 0.6786 | 0.6786 | 0.8 | 0.5385 | 0.6667 | 0.7 | 0.7273 | 0.6087 |
| 0.2137 | 24.0 | 552 | 0.6567 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.2216 | 25.0 | 575 | 0.7286 | 0.7143 | 0.8571 | 0.5714 | 0.6667 | 0.8 | 0.75 | 0.6667 |
| 0.2289 | 26.0 | 598 | 0.6146 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2268 | 27.0 | 621 | 0.6721 | 0.6786 | 0.8 | 0.5385 | 0.6667 | 0.7 | 0.7273 | 0.6087 |
| 0.2208 | 28.0 | 644 | 0.6894 | 0.6786 | 0.8 | 0.5385 | 0.6667 | 0.7 | 0.7273 | 0.6087 |
| 0.2252 | 29.0 | 667 | 0.5986 | 0.7857 | 0.8 | 0.75 | 0.8889 | 0.6 | 0.8421 | 0.6667 |
| 0.2127 | 30.0 | 690 | 0.6868 | 0.6429 | 0.75 | 0.5 | 0.6667 | 0.6 | 0.7059 | 0.5455 |
| 0.2259 | 31.0 | 713 | 0.6682 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2253 | 32.0 | 736 | 0.8906 | 0.6786 | 0.9091 | 0.5294 | 0.5556 | 0.9 | 0.6897 | 0.6667 |
| 0.2421 | 33.0 | 759 | 0.6461 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2181 | 34.0 | 782 | 0.7014 | 0.6786 | 0.7647 | 0.5455 | 0.7222 | 0.6 | 0.7429 | 0.5714 |
| 0.2199 | 35.0 | 805 | 0.7655 | 0.6786 | 0.8 | 0.5385 | 0.6667 | 0.7 | 0.7273 | 0.6087 |
| 0.201 | 36.0 | 828 | 0.7356 | 0.6429 | 0.75 | 0.5 | 0.6667 | 0.6 | 0.7059 | 0.5455 |
| 0.2192 | 37.0 | 851 | 0.6958 | 0.6786 | 0.7647 | 0.5455 | 0.7222 | 0.6 | 0.7429 | 0.5714 |
| 0.2164 | 38.0 | 874 | 0.7475 | 0.6429 | 0.75 | 0.5 | 0.6667 | 0.6 | 0.7059 | 0.5455 |
| 0.22 | 39.0 | 897 | 0.6847 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2177 | 40.0 | 920 | 0.6463 | 0.7857 | 0.8 | 0.75 | 0.8889 | 0.6 | 0.8421 | 0.6667 |
| 0.2126 | 41.0 | 943 | 0.6793 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2069 | 42.0 | 966 | 0.7303 | 0.7143 | 0.8125 | 0.5833 | 0.7222 | 0.7 | 0.7647 | 0.6364 |
| 0.2099 | 43.0 | 989 | 0.6598 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2104 | 44.0 | 1012 | 0.7276 | 0.6786 | 0.7647 | 0.5455 | 0.7222 | 0.6 | 0.7429 | 0.5714 |
| 0.213 | 45.0 | 1035 | 0.7099 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2083 | 46.0 | 1058 | 0.7545 | 0.6786 | 0.8 | 0.5385 | 0.6667 | 0.7 | 0.7273 | 0.6087 |
| 0.1958 | 47.0 | 1081 | 0.6533 | 0.75 | 0.7895 | 0.6667 | 0.8333 | 0.6 | 0.8108 | 0.6316 |
| 0.2096 | 48.0 | 1104 | 0.7141 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2134 | 49.0 | 1127 | 0.7008 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.203 | 50.0 | 1150 | 0.6557 | 0.75 | 0.7895 | 0.6667 | 0.8333 | 0.6 | 0.8108 | 0.6316 |
| 0.2024 | 51.0 | 1173 | 0.7348 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.2095 | 52.0 | 1196 | 0.7708 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1997 | 53.0 | 1219 | 0.7106 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2048 | 54.0 | 1242 | 0.7530 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1963 | 55.0 | 1265 | 0.7520 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2039 | 56.0 | 1288 | 0.7230 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2023 | 57.0 | 1311 | 0.7644 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.2022 | 58.0 | 1334 | 0.7666 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.1898 | 59.0 | 1357 | 0.7961 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.2155 | 60.0 | 1380 | 0.7763 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1948 | 61.0 | 1403 | 0.7545 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.2124 | 62.0 | 1426 | 0.7344 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.1979 | 63.0 | 1449 | 0.7676 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1958 | 64.0 | 1472 | 0.7567 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1946 | 65.0 | 1495 | 0.7349 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.1888 | 66.0 | 1518 | 0.7472 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.1889 | 67.0 | 1541 | 0.7202 | 0.7857 | 0.8 | 0.75 | 0.8889 | 0.6 | 0.8421 | 0.6667 |
| 0.2077 | 68.0 | 1564 | 0.7193 | 0.7857 | 0.8 | 0.75 | 0.8889 | 0.6 | 0.8421 | 0.6667 |
| 0.1882 | 69.0 | 1587 | 0.7541 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.1903 | 70.0 | 1610 | 0.8058 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.2017 | 71.0 | 1633 | 0.7862 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.1929 | 72.0 | 1656 | 0.8000 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.192 | 73.0 | 1679 | 0.8199 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1903 | 74.0 | 1702 | 0.8044 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1953 | 75.0 | 1725 | 0.7943 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1908 | 76.0 | 1748 | 0.7805 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1975 | 77.0 | 1771 | 0.7595 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.1943 | 78.0 | 1794 | 0.7908 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.192 | 79.0 | 1817 | 0.8389 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1879 | 80.0 | 1840 | 0.7925 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.1933 | 81.0 | 1863 | 0.8149 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1867 | 82.0 | 1886 | 0.7925 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.1906 | 83.0 | 1909 | 0.8118 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1895 | 84.0 | 1932 | 0.8108 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1925 | 85.0 | 1955 | 0.7962 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.1851 | 86.0 | 1978 | 0.7942 | 0.7143 | 0.7778 | 0.6 | 0.7778 | 0.6 | 0.7778 | 0.6 |
| 0.1952 | 87.0 | 2001 | 0.8104 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1821 | 88.0 | 2024 | 0.8187 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1946 | 89.0 | 2047 | 0.8378 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1904 | 90.0 | 2070 | 0.8407 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1931 | 91.0 | 2093 | 0.8351 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1883 | 92.0 | 2116 | 0.8269 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1845 | 93.0 | 2139 | 0.8110 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1883 | 94.0 | 2162 | 0.8209 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1991 | 95.0 | 2185 | 0.8194 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.187 | 96.0 | 2208 | 0.8182 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1842 | 97.0 | 2231 | 0.8168 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1837 | 98.0 | 2254 | 0.8164 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.1849 | 99.0 | 2277 | 0.8173 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |
| 0.189 | 100.0 | 2300 | 0.8162 | 0.75 | 0.8235 | 0.6364 | 0.7778 | 0.7 | 0.8000 | 0.6667 |

### Framework versions

- Transformers 4.16.2
- Pytorch 1.10.2+cu113
- Datasets 1.18.3
- Tokenizers 0.11.0