---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- crows_pairs
metrics:
- accuracy
model-index:
- name: bert-base-uncased_crows_pairs_finetuned
  results:
  - task:
      name: Text Classification
      type: text-classification
    dataset:
      name: crows_pairs
      type: crows_pairs
      config: crows_pairs
      split: test
      args: crows_pairs
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.7649006622516556
---
# bert-base-uncased_crows_pairs_finetuned
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the crows_pairs dataset. It achieves the following results on the evaluation set (Tp, Tn, Fp, and Fn are reported as fractions of the evaluation set; see the metric sketch after this list):
- Loss: 2.1731
- Accuracy: 0.7649
- Tp: 0.3344
- Tn: 0.4305
- Fp: 0.1126
- Fn: 0.1225
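The four confusion-matrix entries sum to 1.0 and Accuracy = Tp + Tn (0.3344 + 0.4305 = 0.7649), which suggests they are reported as fractions of the evaluation set rather than raw counts. Below is a minimal sketch of a `compute_metrics` function that would produce numbers in this form, assuming a binary label scheme with 1 as the positive class; this is a reconstruction, not the author's published training code.

```python
import numpy as np

def compute_metrics(eval_pred):
    # Hypothetical reconstruction: confusion-matrix cells as fractions of the eval set.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    n = len(labels)
    tp = np.sum((preds == 1) & (labels == 1)) / n
    tn = np.sum((preds == 0) & (labels == 0)) / n
    fp = np.sum((preds == 1) & (labels == 0)) / n
    fn = np.sum((preds == 0) & (labels == 1)) / n
    return {"accuracy": tp + tn, "tp": tp, "tn": tn, "fp": fp, "fn": fn}
```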
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
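For reference, a minimal `TrainingArguments` sketch mirroring the values above; the output directory is a placeholder, and the Adam betas/epsilon listed are the `Trainer` defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-base-uncased_crows_pairs_finetuned",  # placeholder path
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```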
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Tp | Tn | Fp | Fn |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--:|:--:|:--:|:--:|
0.703 | 1.05 | 20 | 0.6889 | 0.5430 | 0.0 | 0.5430 | 0.0 | 0.4570 |
0.6884 | 2.11 | 40 | 0.6886 | 0.5 | 0.3013 | 0.1987 | 0.3444 | 0.1556 |
0.5388 | 3.16 | 60 | 0.6347 | 0.7053 | 0.1821 | 0.5232 | 0.0199 | 0.2748 |
0.2228 | 4.21 | 80 | 0.9802 | 0.6987 | 0.1887 | 0.5099 | 0.0331 | 0.2682 |
0.1644 | 5.26 | 100 | 0.7523 | 0.7583 | 0.3675 | 0.3907 | 0.1523 | 0.0894 |
0.0478 | 6.32 | 120 | 1.5712 | 0.6954 | 0.2053 | 0.4901 | 0.0530 | 0.2517 |
0.0465 | 7.37 | 140 | 1.2587 | 0.7351 | 0.2781 | 0.4570 | 0.0861 | 0.1788 |
0.0313 | 8.42 | 160 | 1.5825 | 0.7450 | 0.3775 | 0.3675 | 0.1755 | 0.0795 |
0.0137 | 9.47 | 180 | 1.3570 | 0.7318 | 0.2815 | 0.4503 | 0.0927 | 0.1755 |
0.04 | 10.53 | 200 | 2.1377 | 0.6921 | 0.1887 | 0.5033 | 0.0397 | 0.2682 |
0.0041 | 11.58 | 220 | 1.6776 | 0.7351 | 0.3278 | 0.4073 | 0.1358 | 0.1291 |
0.0042 | 12.63 | 240 | 1.8873 | 0.7086 | 0.2980 | 0.4106 | 0.1325 | 0.1589 |
0.0009 | 13.68 | 260 | 2.2464 | 0.6987 | 0.3543 | 0.3444 | 0.1987 | 0.1026 |
0.014 | 14.74 | 280 | 1.9753 | 0.7252 | 0.3245 | 0.4007 | 0.1424 | 0.1325 |
0.0026 | 15.79 | 300 | 1.8852 | 0.7417 | 0.2914 | 0.4503 | 0.0927 | 0.1656 |
0.0147 | 16.84 | 320 | 2.0273 | 0.7351 | 0.3113 | 0.4238 | 0.1192 | 0.1457 |
0.0009 | 17.89 | 340 | 1.7328 | 0.7483 | 0.3278 | 0.4205 | 0.1225 | 0.1291 |
0.0085 | 18.95 | 360 | 2.0146 | 0.7450 | 0.2815 | 0.4636 | 0.0795 | 0.1755 |
0.0001 | 20.0 | 380 | 2.0808 | 0.7450 | 0.3113 | 0.4338 | 0.1093 | 0.1457 |
0.0001 | 21.05 | 400 | 2.2655 | 0.7417 | 0.3609 | 0.3808 | 0.1623 | 0.0960 |
0.0034 | 22.11 | 420 | 2.0298 | 0.7583 | 0.3079 | 0.4503 | 0.0927 | 0.1490 |
0.0082 | 23.16 | 440 | 2.0650 | 0.7550 | 0.3344 | 0.4205 | 0.1225 | 0.1225 |
0.0001 | 24.21 | 460 | 2.2472 | 0.7450 | 0.2748 | 0.4702 | 0.0728 | 0.1821 |
0.0001 | 25.26 | 480 | 2.3655 | 0.7351 | 0.3709 | 0.3642 | 0.1788 | 0.0861 |
0.0004 | 26.32 | 500 | 2.1407 | 0.7550 | 0.3510 | 0.4040 | 0.1391 | 0.1060 |
0.0001 | 27.37 | 520 | 2.1168 | 0.7450 | 0.3642 | 0.3808 | 0.1623 | 0.0927 |
0.0002 | 28.42 | 540 | 2.2050 | 0.7517 | 0.3775 | 0.3742 | 0.1689 | 0.0795 |
0.0 | 29.47 | 560 | 2.0560 | 0.7682 | 0.3212 | 0.4470 | 0.0960 | 0.1358 |
0.0 | 30.53 | 580 | 2.0859 | 0.7715 | 0.3179 | 0.4536 | 0.0894 | 0.1391 |
0.0 | 31.58 | 600 | 2.0958 | 0.7715 | 0.3179 | 0.4536 | 0.0894 | 0.1391 |
0.0 | 32.63 | 620 | 2.1039 | 0.7715 | 0.3179 | 0.4536 | 0.0894 | 0.1391 |
0.0 | 33.68 | 640 | 2.1113 | 0.7715 | 0.3179 | 0.4536 | 0.0894 | 0.1391 |
0.0 | 34.74 | 660 | 2.1180 | 0.7715 | 0.3179 | 0.4536 | 0.0894 | 0.1391 |
0.0 | 35.79 | 680 | 2.1127 | 0.7715 | 0.3278 | 0.4437 | 0.0993 | 0.1291 |
0.0 | 36.84 | 700 | 2.1376 | 0.7682 | 0.3377 | 0.4305 | 0.1126 | 0.1192 |
0.0 | 37.89 | 720 | 2.1460 | 0.7616 | 0.3377 | 0.4238 | 0.1192 | 0.1192 |
0.0 | 38.95 | 740 | 2.1507 | 0.7649 | 0.3377 | 0.4272 | 0.1159 | 0.1192 |
0.0 | 40.0 | 760 | 2.1548 | 0.7682 | 0.3377 | 0.4305 | 0.1126 | 0.1192 |
0.0 | 41.05 | 780 | 2.1586 | 0.7682 | 0.3377 | 0.4305 | 0.1126 | 0.1192 |
0.0 | 42.11 | 800 | 2.1620 | 0.7682 | 0.3377 | 0.4305 | 0.1126 | 0.1192 |
0.0 | 43.16 | 820 | 2.1649 | 0.7682 | 0.3377 | 0.4305 | 0.1126 | 0.1192 |
0.0 | 44.21 | 840 | 2.1674 | 0.7682 | 0.3377 | 0.4305 | 0.1126 | 0.1192 |
0.0 | 45.26 | 860 | 2.1690 | 0.7682 | 0.3377 | 0.4305 | 0.1126 | 0.1192 |
0.0 | 46.32 | 880 | 2.1705 | 0.7682 | 0.3377 | 0.4305 | 0.1126 | 0.1192 |
0.0 | 47.37 | 900 | 2.1717 | 0.7649 | 0.3344 | 0.4305 | 0.1126 | 0.1225 |
0.0 | 48.42 | 920 | 2.1726 | 0.7649 | 0.3344 | 0.4305 | 0.1126 | 0.1225 |
0.0 | 49.47 | 940 | 2.1731 | 0.7649 | 0.3344 | 0.4305 | 0.1126 | 0.1225 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1
- Datasets 2.10.1
- Tokenizers 0.13.2