henryscheible committed
Commit da975b1
1 Parent(s): 3397040

update model card README.md
Files changed (1):
  1. README.md (+55 -2)
README.md CHANGED
@@ -2,9 +2,26 @@
 license: apache-2.0
 tags:
 - generated_from_trainer
+datasets:
+- stereoset
+metrics:
+- accuracy
 model-index:
 - name: t5-small_stereoset_finetuned_HBRPOI
-  results: []
+  results:
+  - task:
+      name: Sequence-to-sequence Language Modeling
+      type: text2text-generation
+    dataset:
+      name: stereoset
+      type: stereoset
+      config: intersentence
+      split: validation
+      args: intersentence
+    metrics:
+    - name: Accuracy
+      type: accuracy
+      value: 0.6028257456828885
 ---

 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -12,7 +29,14 @@ should probably proofread and complete it, then remove this comment. -->

 # t5-small_stereoset_finetuned_HBRPOI

-This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
+This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the stereoset dataset.
+It achieves the following results on the evaluation set:
+- Loss: 0.4383
+- Accuracy: 0.6028
+- Tp: 0.4890
+- Tn: 0.1138
+- Fp: 0.3854
+- Fn: 0.0118

 ## Model description

@@ -39,6 +63,35 @@ The following hyperparameters were used during training:
 - lr_scheduler_type: linear
 - num_epochs: 10

+### Training results
+
+| Training Loss | Epoch | Step | Validation Loss | Accuracy | Tp     | Tn     | Fp     | Fn     |
+|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:|:------:|:------:|
+| 0.4447        | 0.43  | 20   | 0.3978          | 0.5008   | 0.5008 | 0.0    | 0.4992 | 0.0    |
+| 0.3776        | 0.85  | 40   | 0.3448          | 0.6232   | 0.5008 | 0.1224 | 0.3768 | 0.0    |
+| 0.3649        | 1.28  | 60   | 0.3269          | 0.5612   | 0.5    | 0.0612 | 0.4380 | 0.0008 |
+| 0.3275        | 1.7   | 80   | 0.3218          | 0.5330   | 0.4992 | 0.0338 | 0.4655 | 0.0016 |
+| 0.2969        | 2.13  | 100  | 0.3104          | 0.6256   | 0.4961 | 0.1295 | 0.3697 | 0.0047 |
+| 0.3283        | 2.55  | 120  | 0.3111          | 0.5730   | 0.4992 | 0.0738 | 0.4254 | 0.0016 |
+| 0.3046        | 2.98  | 140  | 0.3040          | 0.5416   | 0.4992 | 0.0424 | 0.4568 | 0.0016 |
+| 0.2603        | 3.4   | 160  | 0.3057          | 0.5447   | 0.4992 | 0.0455 | 0.4537 | 0.0016 |
+| 0.2828        | 3.83  | 180  | 0.3186          | 0.5479   | 0.4984 | 0.0495 | 0.4498 | 0.0024 |
+| 0.2326        | 4.26  | 200  | 0.3036          | 0.6193   | 0.4937 | 0.1256 | 0.3736 | 0.0071 |
+| 0.2289        | 4.68  | 220  | 0.3328          | 0.5479   | 0.4976 | 0.0502 | 0.4490 | 0.0031 |
+| 0.2234        | 5.11  | 240  | 0.3140          | 0.5777   | 0.4976 | 0.0801 | 0.4192 | 0.0031 |
+| 0.2225        | 5.53  | 260  | 0.3245          | 0.5691   | 0.4976 | 0.0714 | 0.4278 | 0.0031 |
+| 0.187         | 5.96  | 280  | 0.3300          | 0.5785   | 0.4961 | 0.0824 | 0.4168 | 0.0047 |
+| 0.179         | 6.38  | 300  | 0.3344          | 0.5848   | 0.4961 | 0.0887 | 0.4105 | 0.0047 |
+| 0.1523        | 6.81  | 320  | 0.3528          | 0.5895   | 0.4969 | 0.0926 | 0.4066 | 0.0039 |
+| 0.1499        | 7.23  | 340  | 0.3788          | 0.6232   | 0.4906 | 0.1327 | 0.3666 | 0.0102 |
+| 0.1292        | 7.66  | 360  | 0.3889          | 0.5942   | 0.4914 | 0.1028 | 0.3964 | 0.0094 |
+| 0.13          | 8.09  | 380  | 0.3959          | 0.5903   | 0.4937 | 0.0965 | 0.4027 | 0.0071 |
+| 0.1216        | 8.51  | 400  | 0.4169          | 0.5856   | 0.4922 | 0.0934 | 0.4058 | 0.0086 |
+| 0.1306        | 8.94  | 420  | 0.4227          | 0.6005   | 0.4898 | 0.1107 | 0.3885 | 0.0110 |
+| 0.0968        | 9.36  | 440  | 0.4334          | 0.5965   | 0.4914 | 0.1052 | 0.3940 | 0.0094 |
+| 0.1044        | 9.79  | 460  | 0.4383          | 0.6028   | 0.4890 | 0.1138 | 0.3854 | 0.0118 |
+
+
 ### Framework versions

 - Transformers 4.26.1
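As a quick sanity check on the metrics this commit adds (a sketch, not part of the original card): the Tp, Tn, Fp, and Fn values appear to be fractions of the evaluation set rather than raw counts, in which case accuracy should equal Tp + Tn, the four rates should sum to 1, and the final row of the training-results table should match the summary at the top of the card.

```python
# Sanity-check the evaluation metrics from the updated model card.
# Assumption (not stated in the card): Tp/Tn/Fp/Fn are fractions of the
# validation set, so accuracy = Tp + Tn and the four rates sum to 1.
tp, tn, fp, fn = 0.4890, 0.1138, 0.3854, 0.0118
accuracy = 0.6028  # reported Accuracy, rounded from 0.6028257456828885

assert abs((tp + tn) - accuracy) < 1e-4        # 0.4890 + 0.1138 = 0.6028
assert abs((tp + tn + fp + fn) - 1.0) < 1e-4   # rates cover the whole set

# The last table row (step 460) should match the summary block.
final_row = {"loss": 0.4383, "accuracy": 0.6028, "tp": 0.4890,
             "tn": 0.1138, "fp": 0.3854, "fn": 0.0118}
assert final_row["loss"] == 0.4383 and final_row["accuracy"] == accuracy
print("metrics consistent")
```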