---
license: apache-2.0
tags:
  - generated_from_trainer
datasets:
  - crows_pairs
metrics:
  - accuracy
base_model: google/multiberts-seed_0
model-index:
  - name: multiberts-seed_0_crows_pairs_classifieronly
    results:
      - task:
          type: text-classification
          name: Text Classification
        dataset:
          name: crows_pairs
          type: crows_pairs
          config: crows_pairs
          split: test
          args: crows_pairs
        metrics:
          - type: accuracy
            value: 0.5132450331125827
            name: Accuracy
---

multiberts-seed_0_crows_pairs_classifieronly

This model is a fine-tuned version of google/multiberts-seed_0 on the crows_pairs dataset. It achieves the following results on the evaluation set (see the note after this list on how Tp/Tn/Fp/Fn are reported):

  • Loss: 0.6940
  • Accuracy: 0.5132
  • Tp: 0.0695
  • Tn: 0.4437
  • Fp: 0.0199
  • Fn: 0.4669
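
Tp, Tn, Fp, and Fn above are reported as fractions of the evaluation set rather than raw counts: the four values sum to 1.0, and accuracy is exactly Tp + Tn (0.0695 + 0.4437 = 0.5132). A minimal sketch of that bookkeeping, using made-up labels and predictions since the run's actual outputs are not published with this card:

```python
import numpy as np

# Hypothetical binary labels/predictions standing in for the real eval outputs.
labels = np.array([1, 0, 0, 1, 0, 1])
preds = np.array([1, 0, 1, 0, 0, 1])

n = len(labels)
tp = np.sum((preds == 1) & (labels == 1)) / n  # fractions of the eval set,
tn = np.sum((preds == 0) & (labels == 0)) / n  # not raw counts
fp = np.sum((preds == 1) & (labels == 0)) / n
fn = np.sum((preds == 0) & (labels == 1)) / n

assert abs(tp + tn + fp + fn - 1.0) < 1e-9
accuracy = tp + tn  # matches the card: 0.0695 + 0.4437 = 0.5132
print(accuracy)
```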

Model description

More information needed

Intended uses & limitations

More information needed
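
No intended uses are documented, and the near-chance evaluation accuracy (0.5132 on a binary task, with a loss close to ln 2 ≈ 0.693) is itself a practical limitation to weigh. For completeness, a minimal inference sketch assuming the standard Transformers sequence-classification API; the checkpoint path is a placeholder, and the label semantics are not documented in this card:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder: substitute the actual Hub id or local directory of this
# checkpoint. What classes 0/1 mean is undocumented here.
model_name = "path/to/multiberts-seed_0_crows_pairs_classifieronly"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

inputs = tokenizer("Example sentence to classify.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))
```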

Training and evaluation data

More information needed
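
The card does not describe preprocessing, but the metadata above points at the crows_pairs dataset and its test split. A minimal loading sketch; how the sentence pairs were turned into binary classification examples remains undocumented:

```python
from datasets import load_dataset

# crows_pairs on the Hugging Face Hub ships a single "test" split, which the
# metadata above also uses for evaluation.
dataset = load_dataset("crows_pairs", split="test")
print(dataset)
print(dataset[0])
```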

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
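
Assuming the standard Hugging Face Trainer was used (the generated_from_trainer tag suggests so), these values map onto a TrainingArguments configuration roughly as sketched below; output_dir is a placeholder, and any option not listed above is left at its default:

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters only; everything else stays default.
training_args = TrainingArguments(
    output_dir="multiberts-seed_0_crows_pairs_classifieronly",  # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```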

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Tp | Tn | Fp | Fn |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.7145 | 1.05 | 20 | 0.6927 | 0.5166 | 0.4172 | 0.0993 | 0.3642 | 0.1192 |
| 0.7091 | 2.11 | 40 | 0.6916 | 0.5166 | 0.4967 | 0.0199 | 0.4437 | 0.0397 |
| 0.7005 | 3.16 | 60 | 0.6943 | 0.4702 | 0.1854 | 0.2848 | 0.1788 | 0.3510 |
| 0.7106 | 4.21 | 80 | 0.6976 | 0.4669 | 0.0033 | 0.4636 | 0.0 | 0.5331 |
| 0.7038 | 5.26 | 100 | 0.6933 | 0.5132 | 0.3411 | 0.1722 | 0.2914 | 0.1954 |
| 0.7012 | 6.32 | 120 | 0.6974 | 0.4669 | 0.0033 | 0.4636 | 0.0 | 0.5331 |
| 0.7 | 7.37 | 140 | 0.6918 | 0.5199 | 0.4768 | 0.0430 | 0.4205 | 0.0596 |
| 0.7082 | 8.42 | 160 | 0.6960 | 0.4702 | 0.0066 | 0.4636 | 0.0 | 0.5298 |
| 0.699 | 9.47 | 180 | 0.6936 | 0.5199 | 0.2781 | 0.2417 | 0.2219 | 0.2583 |
| 0.7004 | 10.53 | 200 | 0.7043 | 0.4636 | 0.0 | 0.4636 | 0.0 | 0.5364 |
| 0.6948 | 11.58 | 220 | 0.6907 | 0.5232 | 0.5199 | 0.0033 | 0.4603 | 0.0166 |
| 0.7132 | 12.63 | 240 | 0.6979 | 0.4669 | 0.0033 | 0.4636 | 0.0 | 0.5331 |
| 0.7062 | 13.68 | 260 | 0.6930 | 0.5232 | 0.3444 | 0.1788 | 0.2848 | 0.1921 |
| 0.6983 | 14.74 | 280 | 0.6966 | 0.4669 | 0.0033 | 0.4636 | 0.0 | 0.5331 |
| 0.6996 | 15.79 | 300 | 0.6927 | 0.5265 | 0.3742 | 0.1523 | 0.3113 | 0.1623 |
| 0.7039 | 16.84 | 320 | 0.6972 | 0.4669 | 0.0033 | 0.4636 | 0.0 | 0.5331 |
| 0.6862 | 17.89 | 340 | 0.6914 | 0.5232 | 0.4967 | 0.0265 | 0.4371 | 0.0397 |
| 0.6943 | 18.95 | 360 | 0.6947 | 0.4934 | 0.0430 | 0.4503 | 0.0132 | 0.4934 |
| 0.7063 | 20.0 | 380 | 0.6907 | 0.5232 | 0.5199 | 0.0033 | 0.4603 | 0.0166 |
| 0.7087 | 21.05 | 400 | 0.6947 | 0.4834 | 0.0331 | 0.4503 | 0.0132 | 0.5033 |
| 0.7033 | 22.11 | 420 | 0.6945 | 0.4934 | 0.0464 | 0.4470 | 0.0166 | 0.4901 |
| 0.7025 | 23.16 | 440 | 0.6934 | 0.4967 | 0.2185 | 0.2781 | 0.1854 | 0.3179 |
| 0.7035 | 24.21 | 460 | 0.6944 | 0.5033 | 0.0563 | 0.4470 | 0.0166 | 0.4801 |
| 0.6958 | 25.26 | 480 | 0.6941 | 0.5199 | 0.0795 | 0.4404 | 0.0232 | 0.4570 |
| 0.6955 | 26.32 | 500 | 0.6946 | 0.4868 | 0.0364 | 0.4503 | 0.0132 | 0.5 |
| 0.7046 | 27.37 | 520 | 0.6932 | 0.5199 | 0.2715 | 0.2483 | 0.2152 | 0.2649 |
| 0.6955 | 28.42 | 540 | 0.6954 | 0.4702 | 0.0066 | 0.4636 | 0.0 | 0.5298 |
| 0.7095 | 29.47 | 560 | 0.6961 | 0.4702 | 0.0066 | 0.4636 | 0.0 | 0.5298 |
| 0.6953 | 30.53 | 580 | 0.6919 | 0.5132 | 0.4371 | 0.0762 | 0.3874 | 0.0993 |
| 0.7124 | 31.58 | 600 | 0.6946 | 0.4834 | 0.0331 | 0.4503 | 0.0132 | 0.5033 |
| 0.6929 | 32.63 | 620 | 0.6936 | 0.5265 | 0.1457 | 0.3808 | 0.0828 | 0.3907 |
| 0.7103 | 33.68 | 640 | 0.6926 | 0.5265 | 0.3477 | 0.1788 | 0.2848 | 0.1887 |
| 0.6993 | 34.74 | 660 | 0.6937 | 0.5166 | 0.1192 | 0.3974 | 0.0662 | 0.4172 |
| 0.6975 | 35.79 | 680 | 0.6936 | 0.5199 | 0.1291 | 0.3907 | 0.0728 | 0.4073 |
| 0.6935 | 36.84 | 700 | 0.6937 | 0.5199 | 0.1026 | 0.4172 | 0.0464 | 0.4338 |
| 0.7039 | 37.89 | 720 | 0.6925 | 0.5232 | 0.3642 | 0.1589 | 0.3046 | 0.1722 |
| 0.6999 | 38.95 | 740 | 0.6941 | 0.5099 | 0.0662 | 0.4437 | 0.0199 | 0.4702 |
| 0.6965 | 40.0 | 760 | 0.6948 | 0.4735 | 0.0166 | 0.4570 | 0.0066 | 0.5199 |
| 0.7039 | 41.05 | 780 | 0.6944 | 0.4934 | 0.0430 | 0.4503 | 0.0132 | 0.4934 |
| 0.7026 | 42.11 | 800 | 0.6939 | 0.5199 | 0.0795 | 0.4404 | 0.0232 | 0.4570 |
| 0.7072 | 43.16 | 820 | 0.6930 | 0.5199 | 0.2781 | 0.2417 | 0.2219 | 0.2583 |
| 0.7064 | 44.21 | 840 | 0.6930 | 0.5199 | 0.2781 | 0.2417 | 0.2219 | 0.2583 |
| 0.6982 | 45.26 | 860 | 0.6933 | 0.5199 | 0.2119 | 0.3079 | 0.1556 | 0.3245 |
| 0.6921 | 46.32 | 880 | 0.6935 | 0.5232 | 0.1490 | 0.3742 | 0.0894 | 0.3874 |
| 0.6983 | 47.37 | 900 | 0.6939 | 0.5199 | 0.0762 | 0.4437 | 0.0199 | 0.4603 |
| 0.6938 | 48.42 | 920 | 0.6940 | 0.5132 | 0.0695 | 0.4437 | 0.0199 | 0.4669 |
| 0.6984 | 49.47 | 940 | 0.6940 | 0.5132 | 0.0695 | 0.4437 | 0.0199 | 0.4669 |

Framework versions

  • Transformers 4.26.1
  • Pytorch 1.13.1
  • Datasets 2.10.1
  • Tokenizers 0.13.2
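
When reproducing this run, it can help to sanity-check installed dependencies against the versions above; a small sketch:

```python
import datasets
import tokenizers
import torch
import transformers

# Versions reported in this card; mismatches may change training behavior.
expected = {
    transformers: "4.26.1",
    torch: "1.13.1",
    datasets: "2.10.1",
    tokenizers: "0.13.2",
}
for module, version in expected.items():
    # startswith tolerates local build suffixes such as "1.13.1+cu117".
    assert module.__version__.startswith(version), (
        f"{module.__name__}: expected {version}, got {module.__version__}"
    )
```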