---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- crows_pairs
metrics:
- accuracy
model-index:
- name: bert-base-uncased_crows_pairs_classifieronly
  results:
  - task:
      name: Text Classification
      type: text-classification
    dataset:
      name: crows_pairs
      type: crows_pairs
      config: crows_pairs
      split: test
      args: crows_pairs
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.4735099337748344
---

# bert-base-uncased_crows_pairs_classifieronly

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the crows_pairs dataset. It achieves the following results on the evaluation set (Tp, Tn, Fp, and Fn are the true-positive, true-negative, false-positive, and false-negative fractions of the evaluation set, so they sum to 1 and Tp + Tn equals the accuracy; a minimal loading sketch follows the metrics):

- Loss: 0.6928
- Accuracy: 0.4735
- Tp: 0.3278
- Tn: 0.1457
- Fp: 0.3179
- Fn: 0.2086
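
The card itself provides no usage instructions; below is a minimal loading sketch. The repository id is an assumption based on this card's title and uploader, and may need adjustment.

```python
# Minimal loading sketch; the repo id is an assumption, not confirmed by the card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "asun17904/bert-base-uncased_crows_pairs_classifieronly"  # assumed
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

inputs = tokenizer("An example sentence to classify.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # predicted class id
```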

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
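
These settings map onto `transformers.TrainingArguments` as sketched below. The encoder-freezing step is an assumption inferred from the `_classifieronly` suffix in the model name; the actual training script is not included in the card.

```python
# Reconstruction sketch of the hyperparameters above, not the author's script.
from transformers import AutoModelForSequenceClassification, TrainingArguments

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

# Assumption: "_classifieronly" suggests the BERT encoder was frozen and only
# the classification head was trained.
for param in model.bert.parameters():
    param.requires_grad = False

training_args = TrainingArguments(
    output_dir="bert-base-uncased_crows_pairs_classifieronly",
    learning_rate=5e-5,              # Adam betas=(0.9, 0.999) and eps=1e-08 are the defaults
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```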

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Tp     | Tn     | Fp     | Fn     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:|:------:|:------:|
| 0.7033        | 1.05  | 20   | 0.6951          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.7008        | 2.11  | 40   | 0.6951          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.6998        | 3.16  | 60   | 0.6939          | 0.4570   | 0.0    | 0.4570 | 0.0066 | 0.5364 |
| 0.6958        | 4.21  | 80   | 0.6948          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.7025        | 5.26  | 100  | 0.6981          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.7083        | 6.32  | 120  | 0.6917          | 0.5397   | 0.5364 | 0.0033 | 0.4603 | 0.0    |
| 0.7016        | 7.37  | 140  | 0.6939          | 0.4570   | 0.0    | 0.4570 | 0.0066 | 0.5364 |
| 0.7061        | 8.42  | 160  | 0.6984          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.698         | 9.47  | 180  | 0.6948          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.7009        | 10.53 | 200  | 0.6931          | 0.4967   | 0.1623 | 0.3344 | 0.1291 | 0.3742 |
| 0.7047        | 11.58 | 220  | 0.6998          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.6945        | 12.63 | 240  | 0.6935          | 0.4868   | 0.0364 | 0.4503 | 0.0132 | 0.5    |
| 0.708         | 13.68 | 260  | 0.6924          | 0.5364   | 0.5033 | 0.0331 | 0.4305 | 0.0331 |
| 0.7007        | 14.74 | 280  | 0.6935          | 0.4834   | 0.0331 | 0.4503 | 0.0132 | 0.5033 |
| 0.6999        | 15.79 | 300  | 0.6938          | 0.4636   | 0.0066 | 0.4570 | 0.0066 | 0.5298 |
| 0.6993        | 16.84 | 320  | 0.6939          | 0.4570   | 0.0    | 0.4570 | 0.0066 | 0.5364 |
| 0.7002        | 17.89 | 340  | 0.6953          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.7025        | 18.95 | 360  | 0.6956          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.7008        | 20.0  | 380  | 0.6905          | 0.5397   | 0.5364 | 0.0033 | 0.4603 | 0.0    |
| 0.7065        | 21.05 | 400  | 0.6970          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.6996        | 22.11 | 420  | 0.6954          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.7           | 23.16 | 440  | 0.6962          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.7028        | 24.21 | 460  | 0.6948          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.6924        | 25.26 | 480  | 0.6930          | 0.4834   | 0.2020 | 0.2815 | 0.1821 | 0.3344 |
| 0.6973        | 26.32 | 500  | 0.6941          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.6953        | 27.37 | 520  | 0.6938          | 0.4603   | 0.0033 | 0.4570 | 0.0066 | 0.5331 |
| 0.6971        | 28.42 | 540  | 0.6928          | 0.4735   | 0.3411 | 0.1325 | 0.3311 | 0.1954 |
| 0.7086        | 29.47 | 560  | 0.6924          | 0.5199   | 0.5066 | 0.0132 | 0.4503 | 0.0298 |
| 0.6959        | 30.53 | 580  | 0.6925          | 0.5066   | 0.4636 | 0.0430 | 0.4205 | 0.0728 |
| 0.7103        | 31.58 | 600  | 0.6919          | 0.5397   | 0.5364 | 0.0033 | 0.4603 | 0.0    |
| 0.7019        | 32.63 | 620  | 0.6916          | 0.5397   | 0.5364 | 0.0033 | 0.4603 | 0.0    |
| 0.6941        | 33.68 | 640  | 0.6935          | 0.4868   | 0.0364 | 0.4503 | 0.0132 | 0.5    |
| 0.6878        | 34.74 | 660  | 0.6959          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.6995        | 35.79 | 680  | 0.6954          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.6968        | 36.84 | 700  | 0.6916          | 0.5397   | 0.5364 | 0.0033 | 0.4603 | 0.0    |
| 0.6997        | 37.89 | 720  | 0.6921          | 0.5331   | 0.5265 | 0.0066 | 0.4570 | 0.0099 |
| 0.6975        | 38.95 | 740  | 0.6964          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.7026        | 40.0  | 760  | 0.6956          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.7057        | 41.05 | 780  | 0.6943          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.7028        | 42.11 | 800  | 0.6953          | 0.4636   | 0.0    | 0.4636 | 0.0    | 0.5364 |
| 0.6987        | 43.16 | 820  | 0.6938          | 0.4603   | 0.0033 | 0.4570 | 0.0066 | 0.5331 |
| 0.6973        | 44.21 | 840  | 0.6933          | 0.4868   | 0.0497 | 0.4371 | 0.0265 | 0.4868 |
| 0.7119        | 45.26 | 860  | 0.6930          | 0.4801   | 0.1788 | 0.3013 | 0.1623 | 0.3576 |
| 0.7041        | 46.32 | 880  | 0.6928          | 0.4967   | 0.3179 | 0.1788 | 0.2848 | 0.2185 |
| 0.7114        | 47.37 | 900  | 0.6926          | 0.4967   | 0.4139 | 0.0828 | 0.3808 | 0.1225 |
| 0.702         | 48.42 | 920  | 0.6929          | 0.4735   | 0.2318 | 0.2417 | 0.2219 | 0.3046 |
| 0.6945        | 49.47 | 940  | 0.6928          | 0.4735   | 0.3278 | 0.1457 | 0.3179 | 0.2086 |
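
In each row, Tp, Tn, Fp, and Fn sum to 1.0 and Accuracy = Tp + Tn, so the four columns are fractions of the evaluation set rather than raw counts. A `compute_metrics` function consistent with these numbers might look like the sketch below; this is inferred from the values, not taken from the author's code.

```python
# Sketch of a compute_metrics function consistent with the reported columns.
import numpy as np

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    n = len(labels)
    tp = np.sum((preds == 1) & (labels == 1)) / n  # true-positive fraction
    tn = np.sum((preds == 0) & (labels == 0)) / n  # true-negative fraction
    fp = np.sum((preds == 1) & (labels == 0)) / n  # false-positive fraction
    fn = np.sum((preds == 0) & (labels == 1)) / n  # false-negative fraction
    return {"accuracy": tp + tn, "tp": tp, "tn": tn, "fp": fp, "fn": fn}
```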

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1
- Datasets 2.10.1
- Tokenizers 0.13.2