---
license: mit
tags:
- generated_from_trainer
datasets:
- crows_pairs
metrics:
- accuracy
model-index:
- name: gpt2_crows_pairs_finetuned
  results:
  - task:
      name: Text Classification
      type: text-classification
    dataset:
      name: crows_pairs
      type: crows_pairs
      config: crows_pairs
      split: test
      args: crows_pairs
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.7781456953642384
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# gpt2_crows_pairs_finetuned

This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on the crows_pairs dataset.
It achieves the following results on the evaluation set:
- Loss: 2.0946
- Accuracy: 0.7781
- Tp (true positives, as a fraction of the evaluation set): 0.3444
- Tn (true negatives): 0.4338
- Fp (false positives): 0.1159
- Fn (false negatives): 0.1060
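
The card does not include usage code, so below is a minimal inference sketch. It assumes the checkpoint is published on the Hub under a placeholder repo id and exposes a sequence-classification head on top of GPT-2 (per the `text-classification` task in the metadata); the exact way a CrowS-Pairs pair should be formatted as input is not documented here.

```python
# Minimal inference sketch. The Hub repo id is a placeholder, and the expected
# input format (single sentence vs. concatenated pair) is an assumption.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "your-username/gpt2_crows_pairs_finetuned"  # placeholder, not the actual Hub path

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "Example sentence from a CrowS-Pairs style pair."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(dim=-1).item()
print(predicted_class, model.config.id2label.get(predicted_class, predicted_class))
```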

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
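
The metadata above references the `crows_pairs` dataset with a `test` split. How the sentence pairs were turned into classification examples for fine-tuning is not documented, but the raw dataset can be loaded as sketched below (field names follow the public `crows_pairs` dataset card).

```python
# Sketch of loading the dataset referenced in the model metadata.
# The preprocessing used for fine-tuning is not documented in this card,
# so this only shows raw access to the published split.
from datasets import load_dataset

crows = load_dataset("crows_pairs")  # the Hub version ships a single "test" split
example = crows["test"][0]
print(example["sent_more"], "|", example["sent_less"], "|", example["bias_type"])
```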

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
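
For reference, the hyperparameters above map roughly onto the following `TrainingArguments`. This is a hedged reconstruction, not the original training script: the output directory and the evaluation cadence are assumptions (the results table suggests evaluation every 20 steps).

```python
# Approximate reconstruction of the reported hyperparameters.
# output_dir and the evaluation/eval_steps settings are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="gpt2_crows_pairs_finetuned",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",  # assumed
    eval_steps=20,                # assumed from the training results table
)
```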

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Tp     | Tn     | Fp     | Fn     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:|:------:|:------:|
| 0.7371        | 1.05  | 20   | 0.7345          | 0.4669   | 0.4305 | 0.0364 | 0.5132 | 0.0199 |
| 0.6794        | 2.11  | 40   | 0.6829          | 0.5397   | 0.3013 | 0.2384 | 0.3113 | 0.1490 |
| 0.5972        | 3.16  | 60   | 0.6602          | 0.6291   | 0.3411 | 0.2881 | 0.2616 | 0.1093 |
| 0.4691        | 4.21  | 80   | 0.6568          | 0.6788   | 0.3742 | 0.3046 | 0.2450 | 0.0762 |
| 0.3645        | 5.26  | 100  | 0.5872          | 0.7252   | 0.2815 | 0.4437 | 0.1060 | 0.1689 |
| 0.2645        | 6.32  | 120  | 0.6835          | 0.7185   | 0.2318 | 0.4868 | 0.0629 | 0.2185 |
| 0.1698        | 7.37  | 140  | 0.7757          | 0.7483   | 0.2914 | 0.4570 | 0.0927 | 0.1589 |
| 0.1386        | 8.42  | 160  | 0.7445          | 0.7417   | 0.2881 | 0.4536 | 0.0960 | 0.1623 |
| 0.077         | 9.47  | 180  | 1.0591          | 0.7252   | 0.3642 | 0.3609 | 0.1887 | 0.0861 |
| 0.0836        | 10.53 | 200  | 1.0908          | 0.7185   | 0.2649 | 0.4536 | 0.0960 | 0.1854 |
| 0.0485        | 11.58 | 220  | 1.2155          | 0.7450   | 0.3709 | 0.3742 | 0.1755 | 0.0795 |
| 0.0298        | 12.63 | 240  | 1.1973          | 0.7417   | 0.3245 | 0.4172 | 0.1325 | 0.1258 |
| 0.0444        | 13.68 | 260  | 1.4213          | 0.7384   | 0.3675 | 0.3709 | 0.1788 | 0.0828 |
| 0.0215        | 14.74 | 280  | 1.4907          | 0.7450   | 0.3278 | 0.4172 | 0.1325 | 0.1225 |
| 0.0483        | 15.79 | 300  | 1.5485          | 0.7583   | 0.2781 | 0.4801 | 0.0695 | 0.1722 |
| 0.0129        | 16.84 | 320  | 1.7145          | 0.7550   | 0.2748 | 0.4801 | 0.0695 | 0.1755 |
| 0.0525        | 17.89 | 340  | 1.7827          | 0.7550   | 0.3642 | 0.3907 | 0.1589 | 0.0861 |
| 0.0074        | 18.95 | 360  | 1.6230          | 0.7682   | 0.2980 | 0.4702 | 0.0795 | 0.1523 |
| 0.004         | 20.0  | 380  | 1.8522          | 0.7384   | 0.3444 | 0.3940 | 0.1556 | 0.1060 |
| 0.0067        | 21.05 | 400  | 1.8479          | 0.7616   | 0.3046 | 0.4570 | 0.0927 | 0.1457 |
| 0.001         | 22.11 | 420  | 1.9830          | 0.7682   | 0.2947 | 0.4735 | 0.0762 | 0.1556 |
| 0.01          | 23.16 | 440  | 1.9412          | 0.7715   | 0.3113 | 0.4603 | 0.0894 | 0.1391 |
| 0.0048        | 24.21 | 460  | 2.0075          | 0.7649   | 0.3510 | 0.4139 | 0.1358 | 0.0993 |
| 0.0025        | 25.26 | 480  | 2.0912          | 0.7649   | 0.2980 | 0.4669 | 0.0828 | 0.1523 |
| 0.0013        | 26.32 | 500  | 2.1548          | 0.7715   | 0.3444 | 0.4272 | 0.1225 | 0.1060 |
| 0.0041        | 27.37 | 520  | 2.1337          | 0.7682   | 0.3543 | 0.4139 | 0.1358 | 0.0960 |
| 0.0005        | 28.42 | 540  | 2.1242          | 0.7550   | 0.3576 | 0.3974 | 0.1523 | 0.0927 |
| 0.0124        | 29.47 | 560  | 2.1297          | 0.7583   | 0.3642 | 0.3940 | 0.1556 | 0.0861 |
| 0.0104        | 30.53 | 580  | 2.0057          | 0.7583   | 0.3179 | 0.4404 | 0.1093 | 0.1325 |
| 0.0156        | 31.58 | 600  | 2.0365          | 0.7483   | 0.2881 | 0.4603 | 0.0894 | 0.1623 |
| 0.0003        | 32.63 | 620  | 1.9614          | 0.7649   | 0.3212 | 0.4437 | 0.1060 | 0.1291 |
| 0.0029        | 33.68 | 640  | 1.9658          | 0.7682   | 0.3245 | 0.4437 | 0.1060 | 0.1258 |
| 0.0001        | 34.74 | 660  | 1.9913          | 0.7649   | 0.3013 | 0.4636 | 0.0861 | 0.1490 |
| 0.0001        | 35.79 | 680  | 2.0039          | 0.7649   | 0.3013 | 0.4636 | 0.0861 | 0.1490 |
| 0.0004        | 36.84 | 700  | 1.9657          | 0.7715   | 0.3146 | 0.4570 | 0.0927 | 0.1358 |
| 0.0003        | 37.89 | 720  | 1.9787          | 0.7748   | 0.3245 | 0.4503 | 0.0993 | 0.1258 |
| 0.0007        | 38.95 | 740  | 1.9888          | 0.7781   | 0.3377 | 0.4404 | 0.1093 | 0.1126 |
| 0.0002        | 40.0  | 760  | 2.0293          | 0.7682   | 0.3477 | 0.4205 | 0.1291 | 0.1026 |
| 0.0002        | 41.05 | 780  | 1.9914          | 0.7781   | 0.3245 | 0.4536 | 0.0960 | 0.1258 |
| 0.0003        | 42.11 | 800  | 2.0444          | 0.7583   | 0.2914 | 0.4669 | 0.0828 | 0.1589 |
| 0.0072        | 43.16 | 820  | 2.0247          | 0.7649   | 0.3278 | 0.4371 | 0.1126 | 0.1225 |
| 0.0001        | 44.21 | 840  | 2.0398          | 0.7682   | 0.3278 | 0.4404 | 0.1093 | 0.1225 |
| 0.0001        | 45.26 | 860  | 2.0358          | 0.7682   | 0.3278 | 0.4404 | 0.1093 | 0.1225 |
| 0.0011        | 46.32 | 880  | 2.0432          | 0.7682   | 0.3278 | 0.4404 | 0.1093 | 0.1225 |
| 0.0001        | 47.37 | 900  | 2.0923          | 0.7781   | 0.3444 | 0.4338 | 0.1159 | 0.1060 |
| 0.0           | 48.42 | 920  | 2.0975          | 0.7781   | 0.3444 | 0.4338 | 0.1159 | 0.1060 |
| 0.0002        | 49.47 | 940  | 2.0946          | 0.7781   | 0.3444 | 0.4338 | 0.1159 | 0.1060 |


### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1
- Datasets 2.10.1
- Tokenizers 0.13.2