dmjimenezbravo committed on
Commit
2849df7
1 Parent(s): 82fec5e

End of training

README.md CHANGED
@@ -16,9 +16,9 @@ should probably proofread and complete it, then remove this comment. -->
  
  This model is a fine-tuned version of [mrm8488/electricidad-small-discriminator](https://huggingface.co/mrm8488/electricidad-small-discriminator) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 1.1726
- - Accuracy: 0.7483
- - F1: 0.7483
+ - Loss: 2.3327
+ - Accuracy: 0.7642
+ - F1: 0.7642
  
  ## Model description
  
@@ -43,37 +43,77 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 20
+ - num_epochs: 60
  
  ### Training results
  
  | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
  |:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|
- | 0.925 | 1.0 | 1222 | 0.7993 | 0.6779 | 0.6779 |
- | 0.7624 | 2.0 | 2444 | 0.6550 | 0.7409 | 0.7409 |
- | 0.6442 | 3.0 | 3666 | 0.5699 | 0.7813 | 0.7813 |
- | 0.5843 | 4.0 | 4888 | 0.5064 | 0.8099 | 0.8099 |
- | 0.5355 | 5.0 | 6110 | 0.4266 | 0.8470 | 0.8470 |
- | 0.4697 | 6.0 | 7332 | 0.3900 | 0.8619 | 0.8619 |
- | 0.4214 | 7.0 | 8554 | 0.3775 | 0.8657 | 0.8657 |
- | 0.4 | 8.0 | 9776 | 0.3037 | 0.8987 | 0.8987 |
- | 0.3537 | 9.0 | 10998 | 0.2824 | 0.9120 | 0.9120 |
- | 0.3192 | 10.0 | 12220 | 0.2157 | 0.9327 | 0.9327 |
- | 0.2983 | 11.0 | 13442 | 0.1769 | 0.9483 | 0.9483 |
- | 0.2653 | 12.0 | 14664 | 0.1630 | 0.9527 | 0.9527 |
- | 0.2454 | 13.0 | 15886 | 0.1549 | 0.9573 | 0.9573 |
- | 0.2293 | 14.0 | 17108 | 0.1396 | 0.9632 | 0.9632 |
- | 0.211 | 15.0 | 18330 | 0.1378 | 0.9642 | 0.9642 |
- | 0.2096 | 16.0 | 19552 | 0.1285 | 0.9673 | 0.9673 |
- | 0.1934 | 17.0 | 20774 | 0.1209 | 0.9713 | 0.9713 |
- | 0.1679 | 18.0 | 21996 | 0.1103 | 0.9747 | 0.9747 |
- | 0.1782 | 19.0 | 23218 | 0.1054 | 0.9747 | 0.9747 |
- | 0.1647 | 20.0 | 24440 | 0.1048 | 0.9749 | 0.9749 |
+ | 0.88 | 1.0 | 1222 | 0.7491 | 0.6943 | 0.6943 |
+ | 0.7292 | 2.0 | 2444 | 0.6253 | 0.7544 | 0.7544 |
+ | 0.6346 | 3.0 | 3666 | 0.5292 | 0.7971 | 0.7971 |
+ | 0.565 | 4.0 | 4888 | 0.4831 | 0.8168 | 0.8168 |
+ | 0.4898 | 5.0 | 6110 | 0.4086 | 0.8532 | 0.8532 |
+ | 0.4375 | 6.0 | 7332 | 0.3411 | 0.8831 | 0.8831 |
+ | 0.3968 | 7.0 | 8554 | 0.2735 | 0.9100 | 0.9100 |
+ | 0.3321 | 8.0 | 9776 | 0.2343 | 0.9253 | 0.9253 |
+ | 0.3045 | 9.0 | 10998 | 0.1855 | 0.9450 | 0.9450 |
+ | 0.2837 | 10.0 | 12220 | 0.1539 | 0.9591 | 0.9591 |
+ | 0.2411 | 11.0 | 13442 | 0.1309 | 0.9650 | 0.9650 |
+ | 0.2203 | 12.0 | 14664 | 0.1100 | 0.9716 | 0.9716 |
+ | 0.1953 | 13.0 | 15886 | 0.1067 | 0.9760 | 0.9760 |
+ | 0.1836 | 14.0 | 17108 | 0.0755 | 0.9813 | 0.9813 |
+ | 0.1611 | 15.0 | 18330 | 0.0731 | 0.9829 | 0.9829 |
+ | 0.1479 | 16.0 | 19552 | 0.0746 | 0.9839 | 0.9839 |
+ | 0.138 | 17.0 | 20774 | 0.0516 | 0.9895 | 0.9895 |
+ | 0.129 | 18.0 | 21996 | 0.0481 | 0.9903 | 0.9903 |
+ | 0.1182 | 19.0 | 23218 | 0.0401 | 0.9926 | 0.9926 |
+ | 0.1065 | 20.0 | 24440 | 0.0488 | 0.9895 | 0.9895 |
+ | 0.096 | 21.0 | 25662 | 0.0333 | 0.9928 | 0.9928 |
+ | 0.0889 | 22.0 | 26884 | 0.0222 | 0.9951 | 0.9951 |
+ | 0.0743 | 23.0 | 28106 | 0.0236 | 0.9951 | 0.9951 |
+ | 0.0821 | 24.0 | 29328 | 0.0322 | 0.9931 | 0.9931 |
+ | 0.0866 | 25.0 | 30550 | 0.0135 | 0.9974 | 0.9974 |
+ | 0.0616 | 26.0 | 31772 | 0.0100 | 0.9980 | 0.9980 |
+ | 0.0641 | 27.0 | 32994 | 0.0112 | 0.9977 | 0.9977 |
+ | 0.0603 | 28.0 | 34216 | 0.0071 | 0.9987 | 0.9987 |
+ | 0.0491 | 29.0 | 35438 | 0.0088 | 0.9982 | 0.9982 |
+ | 0.0563 | 30.0 | 36660 | 0.0071 | 0.9982 | 0.9982 |
+ | 0.0467 | 31.0 | 37882 | 0.0045 | 0.9990 | 0.9990 |
+ | 0.0545 | 32.0 | 39104 | 0.0057 | 0.9987 | 0.9987 |
+ | 0.0519 | 33.0 | 40326 | 0.0048 | 0.9992 | 0.9992 |
+ | 0.0524 | 34.0 | 41548 | 0.0030 | 0.9995 | 0.9995 |
+ | 0.044 | 35.0 | 42770 | 0.0046 | 0.9990 | 0.9990 |
+ | 0.0442 | 36.0 | 43992 | 0.0029 | 0.9995 | 0.9995 |
+ | 0.0352 | 37.0 | 45214 | 0.0035 | 0.9995 | 0.9995 |
+ | 0.0348 | 38.0 | 46436 | 0.0029 | 0.9995 | 0.9995 |
+ | 0.0295 | 39.0 | 47658 | 0.0023 | 0.9995 | 0.9995 |
+ | 0.0289 | 40.0 | 48880 | 0.0035 | 0.9995 | 0.9995 |
+ | 0.0292 | 41.0 | 50102 | 0.0023 | 0.9995 | 0.9995 |
+ | 0.0259 | 42.0 | 51324 | 0.0027 | 0.9995 | 0.9995 |
+ | 0.0217 | 43.0 | 52546 | 0.0031 | 0.9995 | 0.9995 |
+ | 0.0278 | 44.0 | 53768 | 0.0018 | 0.9995 | 0.9995 |
+ | 0.0254 | 45.0 | 54990 | 0.0023 | 0.9995 | 0.9995 |
+ | 0.0164 | 46.0 | 56212 | 0.0016 | 0.9997 | 0.9997 |
+ | 0.0277 | 47.0 | 57434 | 0.0027 | 0.9997 | 0.9997 |
+ | 0.0158 | 48.0 | 58656 | 0.0029 | 0.9997 | 0.9997 |
+ | 0.0178 | 49.0 | 59878 | 0.0023 | 0.9997 | 0.9997 |
+ | 0.022 | 50.0 | 61100 | 0.0019 | 0.9997 | 0.9997 |
+ | 0.0167 | 51.0 | 62322 | 0.0018 | 0.9997 | 0.9997 |
+ | 0.0159 | 52.0 | 63544 | 0.0017 | 0.9997 | 0.9997 |
+ | 0.0105 | 53.0 | 64766 | 0.0016 | 0.9997 | 0.9997 |
+ | 0.0111 | 54.0 | 65988 | 0.0015 | 0.9997 | 0.9997 |
+ | 0.0139 | 55.0 | 67210 | 0.0021 | 0.9997 | 0.9997 |
+ | 0.0152 | 56.0 | 68432 | 0.0026 | 0.9997 | 0.9997 |
+ | 0.0191 | 57.0 | 69654 | 0.0022 | 0.9997 | 0.9997 |
+ | 0.0075 | 58.0 | 70876 | 0.0017 | 0.9997 | 0.9997 |
+ | 0.0141 | 59.0 | 72098 | 0.0016 | 0.9997 | 0.9997 |
+ | 0.0086 | 60.0 | 73320 | 0.0014 | 0.9997 | 0.9997 |
  
  
  ### Framework versions
  
  - Transformers 4.18.0
- - Pytorch 1.10.0+cu111
+ - Pytorch 1.11.0+cu113
  - Datasets 2.1.0
  - Tokenizers 0.12.1
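For reference, a minimal sketch of how the hyperparameters and metrics above would typically be wired together with the `transformers` Trainer API (Transformers 4.18). The output directory, number of labels, and dataset splits are placeholders, since the card does not name the dataset; the learning rate and batch sizes are configured outside the hunk shown here, so they are omitted. Reporting identical Accuracy and F1 values is consistent with a micro-averaged F1, which is assumed below.

```python
from sklearn.metrics import accuracy_score, f1_score
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

BASE = "mrm8488/electricidad-small-discriminator"
NUM_LABELS = 3  # placeholder: the card does not state the label set

tokenizer = AutoTokenizer.from_pretrained(BASE)
model = AutoModelForSequenceClassification.from_pretrained(BASE, num_labels=NUM_LABELS)

# Placeholders: replace with tokenized `datasets.Dataset` splits for the task.
train_dataset = None
eval_dataset = None

def compute_metrics(eval_pred):
    # On single-label classification, micro-averaged F1 equals accuracy,
    # matching the identical Accuracy/F1 columns in the table (assumption).
    logits, labels = eval_pred
    preds = logits.argmax(axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds, average="micro"),
    }

training_args = TrainingArguments(
    output_dir="electricidad-small-finetuned",  # placeholder name
    num_train_epochs=60,          # raised from 20 in this commit
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # one evaluation row per epoch, as in the table
    logging_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)
# trainer.train() would then produce per-epoch results like those logged above.
```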
runs/Apr26_07-53-45_f33f70b78e26/events.out.tfevents.1650959685.f33f70b78e26.73.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:80da0ee63b6564db03e21de4eb9fce475d9cd8e02327899712b2dc0cdbd3ecd8
- size 49890
+ oid sha256:948d204f8dddfb03ae692667caa1a6eae560207450d3cac9a3a6ceb8b0d8a10f
+ size 50250
runs/Apr26_07-53-45_f33f70b78e26/events.out.tfevents.1650970849.f33f70b78e26.73.2 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d03c25e05e53daf1e2c4fcc5607d78e269d53e8a4d7f180491631d3999c45efc
+ size 416
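The two entries under `runs/` are Git LFS pointer files for TensorBoard event logs: the pointer records only the object's SHA-256 and size, while the log data itself lives in LFS storage. Below is a minimal sketch of reading those logs back, assuming the run directory has been materialized locally (for example via `git lfs pull`); the scalar tag names are assumptions about what the Trainer logged.

```python
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

# Run directory from this commit; the event files inside must already be
# fetched from Git LFS (the pointer files alone cannot be parsed).
run_dir = "runs/Apr26_07-53-45_f33f70b78e26"

events = EventAccumulator(run_dir)
events.Reload()  # parse every event file found under the run directory

print(events.Tags()["scalars"])  # tag names such as "eval/loss" are assumptions
for scalar in events.Scalars("eval/loss"):
    print(scalar.step, scalar.value)
```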