# MyPoliBERT-ver03

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.2439
- Democracy F1: 0.9216
- Democracy Accuracy: 0.9287
- Economy F1: 0.9057
- Economy Accuracy: 0.9084
- Race F1: 0.9429
- Race Accuracy: 0.9458
- Leadership F1: 0.8377
- Leadership Accuracy: 0.8396
- Development F1: 0.8682
- Development Accuracy: 0.8778
- Corruption F1: 0.9283
- Corruption Accuracy: 0.9326
- Instability F1: 0.9105
- Instability Accuracy: 0.9181
- Safety F1: 0.9073
- Safety Accuracy: 0.9092
- Administration F1: 0.8761
- Administration Accuracy: 0.8875
- Education F1: 0.9559
- Education Accuracy: 0.9578
- Religion F1: 0.9464
- Religion Accuracy: 0.9482
- Environment F1: 0.9753
- Environment Accuracy: 0.9760
- Overall F1: 0.9147
- Overall Accuracy: 0.9191
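The card does not document the classification head, but the per-topic F1/accuracy pairs above suggest one output per topic. Below is a minimal inference sketch, assuming the checkpoint loads as a standard multi-label `AutoModelForSequenceClassification` with the 12 topics above as labels; the repo id, label order, and 0.5 threshold are assumptions, so check the checkpoint's `config.json` before relying on them.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed repo id and label order -- verify against the checkpoint's config.json.
MODEL_ID = "YagiASAFAS/MyPoliBERT-ver03"
TOPICS = ["Democracy", "Economy", "Race", "Leadership", "Development",
          "Corruption", "Instability", "Safety", "Administration",
          "Education", "Religion", "Environment"]

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_ID, problem_type="multi_label_classification"
)
model.eval()

text = "The new budget prioritises rural development and school funding."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits

# Sigmoid per topic; 0.5 is a conventional multi-label threshold.
probs = torch.sigmoid(logits).squeeze(0)
predicted = [topic for topic, p in zip(TOPICS, probs) if p > 0.5]
print(predicted)
```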
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 16
- mixed_precision_training: Native AMP
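A sketch of `TrainingArguments` that pins the values above; the dataset, model wiring, and output directory are not documented in this card, so those parts are placeholders:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="MyPoliBERT-ver03",
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=2,   # effective train batch size: 64 * 2 = 128
    num_train_epochs=16,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
    optim="adamw_torch",             # AdamW; betas=(0.9, 0.999), eps=1e-8 are defaults
)
```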
### Training results
Training Loss | Epoch | Step | Validation Loss | Democracy F1 | Democracy Accuracy | Economy F1 | Economy Accuracy | Race F1 | Race Accuracy | Leadership F1 | Leadership Accuracy | Development F1 | Development Accuracy | Corruption F1 | Corruption Accuracy | Instability F1 | Instability Accuracy | Safety F1 | Safety Accuracy | Administration F1 | Administration Accuracy | Education F1 | Education Accuracy | Religion F1 | Religion Accuracy | Environment F1 | Environment Accuracy | Overall F1 | Overall Accuracy |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 169 | 0.3705 | 0.8469 | 0.8947 | 0.8597 | 0.8797 | 0.8721 | 0.9001 | 0.7362 | 0.7790 | 0.7856 | 0.8293 | 0.8657 | 0.8953 | 0.8488 | 0.8821 | 0.8559 | 0.8693 | 0.7863 | 0.8490 | 0.9008 | 0.9252 | 0.8963 | 0.9144 | 0.9307 | 0.9471 | 0.8488 | 0.8804 |
No log | 2.0 | 338 | 0.3281 | 0.8524 | 0.8968 | 0.8675 | 0.8869 | 0.8893 | 0.9107 | 0.7786 | 0.8048 | 0.8049 | 0.8425 | 0.8685 | 0.9006 | 0.8550 | 0.8897 | 0.8701 | 0.8847 | 0.8019 | 0.8539 | 0.9177 | 0.9357 | 0.9166 | 0.9287 | 0.9455 | 0.9556 | 0.8640 | 0.8909 |
0.3523 | 3.0 | 507 | 0.3011 | 0.8783 | 0.9077 | 0.8832 | 0.8955 | 0.9178 | 0.9281 | 0.8126 | 0.8238 | 0.8165 | 0.8487 | 0.8894 | 0.9105 | 0.8755 | 0.8988 | 0.8824 | 0.8934 | 0.8208 | 0.8630 | 0.9239 | 0.9389 | 0.9292 | 0.9365 | 0.9423 | 0.9536 | 0.8810 | 0.8999 |
0.3523 | 4.0 | 676 | 0.2837 | 0.8798 | 0.9090 | 0.8885 | 0.8986 | 0.9226 | 0.9322 | 0.8167 | 0.8293 | 0.8342 | 0.8613 | 0.9001 | 0.9164 | 0.8776 | 0.9012 | 0.8839 | 0.8956 | 0.8245 | 0.8659 | 0.9403 | 0.9478 | 0.9357 | 0.9411 | 0.9522 | 0.9593 | 0.8880 | 0.9048 |
0.3523 | 5.0 | 845 | 0.2709 | 0.8934 | 0.9155 | 0.8972 | 0.9051 | 0.9316 | 0.9387 | 0.8271 | 0.8355 | 0.8471 | 0.8665 | 0.9085 | 0.9224 | 0.8891 | 0.9071 | 0.8910 | 0.9003 | 0.8468 | 0.8769 | 0.9439 | 0.9500 | 0.9400 | 0.9441 | 0.9668 | 0.9695 | 0.8985 | 0.9110 |
0.2577 | 6.0 | 1014 | 0.2641 | 0.9023 | 0.9192 | 0.8994 | 0.9047 | 0.9357 | 0.9413 | 0.8300 | 0.8344 | 0.8498 | 0.8689 | 0.9155 | 0.9253 | 0.8995 | 0.9112 | 0.8975 | 0.9027 | 0.8659 | 0.8869 | 0.9450 | 0.9510 | 0.9394 | 0.9432 | 0.9701 | 0.9721 | 0.9042 | 0.9134 |
0.2577 | 7.0 | 1183 | 0.2573 | 0.9088 | 0.9233 | 0.8999 | 0.9038 | 0.9387 | 0.9434 | 0.8316 | 0.8351 | 0.8608 | 0.8724 | 0.9202 | 0.9255 | 0.9047 | 0.9133 | 0.9021 | 0.9051 | 0.8685 | 0.8806 | 0.9499 | 0.9541 | 0.9422 | 0.9454 | 0.9719 | 0.9733 | 0.9083 | 0.9146 |
0.2577 | 8.0 | 1352 | 0.2520 | 0.9118 | 0.9239 | 0.9052 | 0.9101 | 0.9420 | 0.9454 | 0.8304 | 0.8358 | 0.8590 | 0.8735 | 0.9223 | 0.9287 | 0.9046 | 0.9146 | 0.9021 | 0.9068 | 0.8713 | 0.8860 | 0.9518 | 0.9558 | 0.9444 | 0.9467 | 0.9736 | 0.9747 | 0.9099 | 0.9168 |
0.2107 | 9.0 | 1521 | 0.2506 | 0.9137 | 0.9244 | 0.9026 | 0.9075 | 0.9417 | 0.9456 | 0.8345 | 0.8373 | 0.8644 | 0.8774 | 0.9242 | 0.9307 | 0.9040 | 0.9151 | 0.9031 | 0.9058 | 0.8753 | 0.8875 | 0.9528 | 0.9560 | 0.9442 | 0.9467 | 0.9739 | 0.9751 | 0.9112 | 0.9174 |
0.2107 | 10.0 | 1690 | 0.2467 | 0.9170 | 0.9276 | 0.9027 | 0.9062 | 0.9430 | 0.9458 | 0.8376 | 0.8386 | 0.8625 | 0.8754 | 0.9228 | 0.9289 | 0.9106 | 0.9181 | 0.9063 | 0.9099 | 0.8760 | 0.8871 | 0.9542 | 0.9558 | 0.9429 | 0.9452 | 0.9749 | 0.9759 | 0.9125 | 0.9179 |
0.2107 | 11.0 | 1859 | 0.2450 | 0.9175 | 0.9278 | 0.9049 | 0.9083 | 0.9427 | 0.9460 | 0.8340 | 0.8383 | 0.8635 | 0.8765 | 0.9251 | 0.9300 | 0.9082 | 0.9166 | 0.9068 | 0.9099 | 0.8729 | 0.8882 | 0.9532 | 0.9562 | 0.9444 | 0.9469 | 0.9741 | 0.9753 | 0.9123 | 0.9183 |
0.1842 | 12.0 | 2028 | 0.2464 | 0.9215 | 0.9285 | 0.9044 | 0.9079 | 0.9428 | 0.9458 | 0.8375 | 0.8410 | 0.8645 | 0.8745 | 0.9276 | 0.9324 | 0.9105 | 0.9166 | 0.9053 | 0.9064 | 0.8737 | 0.8860 | 0.9558 | 0.9575 | 0.9448 | 0.9465 | 0.9744 | 0.9755 | 0.9136 | 0.9182 |
0.1842 | 13.0 | 2197 | 0.2440 | 0.9195 | 0.9278 | 0.9058 | 0.9088 | 0.9422 | 0.9456 | 0.8369 | 0.8405 | 0.8681 | 0.8778 | 0.9284 | 0.9328 | 0.9091 | 0.9164 | 0.9073 | 0.9092 | 0.8743 | 0.8869 | 0.9564 | 0.9584 | 0.9448 | 0.9465 | 0.9751 | 0.9759 | 0.9140 | 0.9189 |
0.1842 | 14.0 | 2366 | 0.2443 | 0.9220 | 0.9285 | 0.9053 | 0.9084 | 0.9425 | 0.9454 | 0.8369 | 0.8397 | 0.8652 | 0.8761 | 0.9281 | 0.9331 | 0.9100 | 0.9174 | 0.9066 | 0.9083 | 0.8735 | 0.8865 | 0.9561 | 0.9578 | 0.9452 | 0.9469 | 0.9750 | 0.9759 | 0.9138 | 0.9187 |
0.169 | 15.0 | 2535 | 0.2441 | 0.9217 | 0.9289 | 0.9057 | 0.9083 | 0.9428 | 0.9458 | 0.8364 | 0.8375 | 0.8657 | 0.8761 | 0.9277 | 0.9320 | 0.9103 | 0.9179 | 0.9072 | 0.9090 | 0.8758 | 0.8875 | 0.9551 | 0.9571 | 0.9458 | 0.9478 | 0.9754 | 0.9762 | 0.9141 | 0.9187 |
0.169 | 15.9080 | 2688 | 0.2439 | 0.9216 | 0.9287 | 0.9057 | 0.9084 | 0.9429 | 0.9458 | 0.8377 | 0.8396 | 0.8682 | 0.8778 | 0.9283 | 0.9326 | 0.9105 | 0.9181 | 0.9073 | 0.9092 | 0.8761 | 0.8875 | 0.9559 | 0.9578 | 0.9464 | 0.9482 | 0.9753 | 0.9760 | 0.9147 | 0.9191 |
### Framework versions
- Transformers 4.48.2
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
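To reproduce this environment, something like the following should work; the CUDA build tag (`cu124`) is specific to the training setup above and may need adjusting for your platform:

```bash
pip install transformers==4.48.2 datasets==3.2.0 tokenizers==0.21.0
pip install torch==2.5.1 --index-url https://download.pytorch.org/whl/cu124
```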