---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- crows_pairs
metrics:
- accuracy
model-index:
- name: t5-small_crows_pairs_finetuned
results:
- task:
name: Sequence-to-sequence Language Modeling
type: text2text-generation
dataset:
name: crows_pairs
type: crows_pairs
config: crows_pairs
split: test
args: crows_pairs
metrics:
- name: Accuracy
type: accuracy
value: 0.6390728476821192
---
# t5-small_crows_pairs_finetuned
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the crows_pairs dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7111
- Accuracy: 0.6391
- Tp (true positives): 0.4934
- Tn (true negatives): 0.1457
- Fp (false positives): 0.3510
- Fn (false negatives): 0.0099

Tp, Tn, Fp, and Fn are reported as fractions of the evaluation set (the four values sum to 1.0), so Accuracy = Tp + Tn.
## Model description
This is [t5-small](https://huggingface.co/t5-small), a small encoder-decoder Transformer, fine-tuned on [CrowS-Pairs](https://huggingface.co/datasets/crows_pairs), a crowdsourced benchmark of minimally different sentence pairs designed to measure social biases in language models. No details of the fine-tuning setup were provided beyond the hyperparameters listed below.
## Intended uses & limitations
No intended uses or limitations were documented. Given that CrowS-Pairs is a bias-measurement benchmark, this model is presumably a research artifact for studying social bias in language models rather than a model for production use; a minimal loading sketch follows.
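The snippet below is a hedged loading sketch, not documented usage: the Hub repository id is a placeholder, and because the card does not specify the input/output format used during fine-tuning, the prompt is purely illustrative.

```python
# Minimal loading sketch. The repository id is a placeholder assumption;
# replace it with the actual Hub id. The input format used during
# fine-tuning is undocumented, so this prompt is illustrative only.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "<user>/t5-small_crows_pairs_finetuned"  # hypothetical id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("An example input sentence.", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```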
## Training and evaluation data
The model was fine-tuned and evaluated on the `crows_pairs` dataset; per the metadata above, the reported metrics were computed on the `test` split. No preprocessing or train/evaluation split details were provided.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
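These settings map directly onto the 🤗 Transformers `Seq2SeqTrainingArguments` API. The sketch below is a reconstruction from the list above, not the original training script; `output_dir` and any argument not listed are assumptions.

```python
# Reconstruction of the hyperparameters above; values not listed in the
# card (e.g. output_dir, evaluation strategy) are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small_crows_pairs_finetuned",  # assumed
    learning_rate=3e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
)
```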
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Tp | Tn | Fp | Fn |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:|:------:|:------:|
| 0.6595 | 1.05 | 20 | 0.3672 | 0.5033 | 0.5033 | 0.0 | 0.4967 | 0.0 |
| 0.4048 | 2.11 | 40 | 0.3723 | 0.5033 | 0.5033 | 0.0 | 0.4967 | 0.0 |
| 0.3397 | 3.16 | 60 | 0.3397 | 0.5033 | 0.5033 | 0.0 | 0.4967 | 0.0 |
| 0.3215 | 4.21 | 80 | 0.3227 | 0.5132 | 0.5033 | 0.0099 | 0.4868 | 0.0 |
| 0.3078 | 5.26 | 100 | 0.3381 | 0.6060 | 0.5033 | 0.1026 | 0.3940 | 0.0 |
| 0.2258 | 6.32 | 120 | 0.3012 | 0.5629 | 0.5 | 0.0629 | 0.4338 | 0.0033 |
| 0.2099 | 7.37 | 140 | 0.3018 | 0.5894 | 0.5 | 0.0894 | 0.4073 | 0.0033 |
| 0.1531 | 8.42 | 160 | 0.3379 | 0.5464 | 0.5033 | 0.0430 | 0.4536 | 0.0 |
| 0.129 | 9.47 | 180 | 0.3602 | 0.5993 | 0.5 | 0.0993 | 0.3974 | 0.0033 |
| 0.0956 | 10.53 | 200 | 0.3846 | 0.5762 | 0.5 | 0.0762 | 0.4205 | 0.0033 |
| 0.0736 | 11.58 | 220 | 0.4245 | 0.5695 | 0.5033 | 0.0662 | 0.4305 | 0.0 |
| 0.0474 | 12.63 | 240 | 0.4938 | 0.5695 | 0.5033 | 0.0662 | 0.4305 | 0.0 |
| 0.0369 | 13.68 | 260 | 0.5201 | 0.5960 | 0.5 | 0.0960 | 0.4007 | 0.0033 |
| 0.0323 | 14.74 | 280 | 0.5559 | 0.5993 | 0.4934 | 0.1060 | 0.3907 | 0.0099 |
| 0.0267 | 15.79 | 300 | 0.5965 | 0.5894 | 0.5 | 0.0894 | 0.4073 | 0.0033 |
| 0.026 | 16.84 | 320 | 0.6052 | 0.5960 | 0.4967 | 0.0993 | 0.3974 | 0.0066 |
| 0.0194 | 17.89 | 340 | 0.6144 | 0.6126 | 0.4934 | 0.1192 | 0.3775 | 0.0099 |
| 0.0242 | 18.95 | 360 | 0.6286 | 0.6126 | 0.4934 | 0.1192 | 0.3775 | 0.0099 |
| 0.0274 | 20.0 | 380 | 0.6313 | 0.6325 | 0.4901 | 0.1424 | 0.3543 | 0.0132 |
| 0.0151 | 21.05 | 400 | 0.6685 | 0.6192 | 0.4934 | 0.1258 | 0.3709 | 0.0099 |
| 0.0131 | 22.11 | 420 | 0.6815 | 0.6258 | 0.4934 | 0.1325 | 0.3642 | 0.0099 |
| 0.0095 | 23.16 | 440 | 0.6961 | 0.6192 | 0.4967 | 0.1225 | 0.3742 | 0.0066 |
| 0.0064 | 24.21 | 460 | 0.6980 | 0.6325 | 0.4934 | 0.1391 | 0.3576 | 0.0099 |
| 0.0103 | 25.26 | 480 | 0.7117 | 0.6192 | 0.4934 | 0.1258 | 0.3709 | 0.0099 |
| 0.0083 | 26.32 | 500 | 0.7096 | 0.6258 | 0.4934 | 0.1325 | 0.3642 | 0.0099 |
| 0.0079 | 27.37 | 520 | 0.7198 | 0.6258 | 0.4934 | 0.1325 | 0.3642 | 0.0099 |
| 0.01 | 28.42 | 540 | 0.7210 | 0.6258 | 0.4934 | 0.1325 | 0.3642 | 0.0099 |
| 0.011 | 29.47 | 560 | 0.7111 | 0.6391 | 0.4934 | 0.1457 | 0.3510 | 0.0099 |
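In every row of the table, Tp + Tn + Fp + Fn = 1.0 and Accuracy = Tp + Tn, so the four columns are confusion-matrix entries expressed as fractions of the evaluation set. Below is a small sketch of how such proportions can be computed from binary predictions; the card does not include the actual metric code, so the 0/1 label encoding is an assumption.

```python
# Sketch of confusion-matrix proportions from binary predictions; the
# card's actual metric code is unavailable, so the 0/1 encoding of
# positives and negatives here is an assumption.
import numpy as np

def confusion_proportions(preds: np.ndarray, labels: np.ndarray) -> dict:
    """Return accuracy and tp/tn/fp/fn as fractions of the evaluation set."""
    n = len(labels)
    tp = np.sum((preds == 1) & (labels == 1)) / n
    tn = np.sum((preds == 0) & (labels == 0)) / n
    fp = np.sum((preds == 1) & (labels == 0)) / n
    fn = np.sum((preds == 0) & (labels == 1)) / n
    return {"accuracy": tp + tn, "tp": tp, "tn": tn, "fp": fp, "fn": fn}
```

For the final checkpoint the identity holds: 0.4934 + 0.1457 = 0.6391, the reported accuracy.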
### Framework versions
- Transformers 4.26.1
- PyTorch 1.13.1
- Datasets 2.10.1
- Tokenizers 0.13.2