amanneo committed on
Commit
35a420a
1 Parent(s): f0b8acd

Training in progress epoch 0

Files changed (3)
  1. README.md +8 -108
  2. config.json +1 -1
  3. tf_model.h5 +1 -1
README.md CHANGED
@@ -1,5 +1,4 @@
  ---
- license: mit
  tags:
  - generated_from_keras_callback
  model-index:
@@ -12,13 +11,13 @@ probably proofread and complete it, then remove this comment. -->
 
  # amanneo/mail-generator-mini-v2
 
- This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
+ This model was trained from scratch on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Train Loss: 2.6097
- - Train Accuracy: 0.0148
- - Validation Loss: 5.6699
- - Validation Accuracy: 0.0024
- - Epoch: 99
+ - Train Loss: 2.5928
+ - Train Accuracy: 0.0171
+ - Validation Loss: 5.5430
+ - Validation Accuracy: 0.0048
+ - Epoch: 0
 
  ## Model description
 
@@ -37,113 +36,14 @@ More information needed
  ### Training hyperparameters
 
  The following hyperparameters were used during training:
- - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'WarmUp', 'config': {'initial_learning_rate': 5e-05, 'decay_schedule_fn': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 5e-05, 'decay_steps': -994, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, '__passive_serialization__': True}, 'warmup_steps': 1000, 'power': 1.0, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
+ - optimizer: {'inner_optimizer': {'class_name': 'AdamWeightDecay', 'config': {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'WarmUp', 'config': {'initial_learning_rate': 5e-05, 'decay_schedule_fn': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 5e-05, 'decay_steps': -994, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, '__passive_serialization__': True}, 'warmup_steps': 1000, 'power': 1.0, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}}, 'dynamic': True, 'initial_scale': 32768.0, 'dynamic_growth_steps': 2000}
  - training_precision: mixed_float16
 
  ### Training results
 
  | Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
  |:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
- | 10.9264 | 0.0001 | 10.8892 | 0.0 | 0 |
- | 10.8962 | 0.0 | 10.8383 | 0.0 | 1 |
- | 10.8438 | 0.0 | 10.7538 | 0.0 | 2 |
- | 10.7546 | 0.0020 | 10.6423 | 0.0107 | 3 |
- | 10.6424 | 0.0056 | 10.5139 | 0.0298 | 4 |
- | 10.5249 | 0.0235 | 10.3801 | 0.0405 | 5 |
- | 10.3882 | 0.0376 | 10.2493 | 0.0369 | 6 |
- | 10.2541 | 0.0368 | 10.1269 | 0.0345 | 7 |
- | 10.1381 | 0.0340 | 10.0139 | 0.0333 | 8 |
- | 10.0207 | 0.0340 | 9.9117 | 0.0333 | 9 |
- | 9.9080 | 0.0344 | 9.8192 | 0.0345 | 10 |
- | 9.8170 | 0.0341 | 9.7364 | 0.0357 | 11 |
- | 9.7316 | 0.0355 | 9.6610 | 0.0357 | 12 |
- | 9.6433 | 0.0352 | 9.5937 | 0.0357 | 13 |
- | 9.5732 | 0.0344 | 9.5323 | 0.0357 | 14 |
- | 9.5079 | 0.0337 | 9.4767 | 0.0357 | 15 |
- | 9.4539 | 0.0356 | 9.4238 | 0.0393 | 16 |
- | 9.3966 | 0.0360 | 9.3716 | 0.0393 | 17 |
- | 9.3395 | 0.0365 | 9.3216 | 0.0405 | 18 |
- | 9.2843 | 0.0367 | 9.2726 | 0.0405 | 19 |
- | 9.2256 | 0.0374 | 9.2261 | 0.0381 | 20 |
- | 9.1701 | 0.0380 | 9.1775 | 0.0405 | 21 |
- | 9.1118 | 0.0386 | 9.1264 | 0.0381 | 22 |
- | 9.0491 | 0.0395 | 9.0728 | 0.0393 | 23 |
- | 8.9904 | 0.0394 | 9.0188 | 0.0393 | 24 |
- | 8.9246 | 0.0393 | 8.9622 | 0.0393 | 25 |
- | 8.8673 | 0.0395 | 8.9069 | 0.0405 | 26 |
- | 8.7917 | 0.0399 | 8.8490 | 0.0405 | 27 |
- | 8.7272 | 0.0401 | 8.7869 | 0.0405 | 28 |
- | 8.6517 | 0.0405 | 8.7232 | 0.0405 | 29 |
- | 8.5852 | 0.0414 | 8.6613 | 0.0405 | 30 |
- | 8.5070 | 0.0409 | 8.5959 | 0.0417 | 31 |
- | 8.4285 | 0.0416 | 8.5308 | 0.0417 | 32 |
- | 8.3451 | 0.0414 | 8.4608 | 0.0417 | 33 |
- | 8.2669 | 0.0420 | 8.3895 | 0.0429 | 34 |
- | 8.1864 | 0.0425 | 8.3191 | 0.0429 | 35 |
- | 8.1022 | 0.0442 | 8.2480 | 0.0440 | 36 |
- | 8.0111 | 0.0440 | 8.1736 | 0.0440 | 37 |
- | 7.9213 | 0.0440 | 8.0987 | 0.0429 | 38 |
- | 7.8306 | 0.0446 | 8.0241 | 0.0429 | 39 |
- | 7.7430 | 0.0458 | 7.9455 | 0.0440 | 40 |
- | 7.6522 | 0.0458 | 7.8680 | 0.0429 | 41 |
- | 7.5519 | 0.0462 | 7.7897 | 0.0429 | 42 |
- | 7.4592 | 0.0452 | 7.7132 | 0.0369 | 43 |
- | 7.3597 | 0.0459 | 7.6370 | 0.0381 | 44 |
- | 7.2576 | 0.0450 | 7.5578 | 0.0381 | 45 |
- | 7.1593 | 0.0461 | 7.4838 | 0.0357 | 46 |
- | 7.0610 | 0.0429 | 7.4000 | 0.0321 | 47 |
- | 6.9531 | 0.0410 | 7.3260 | 0.0310 | 48 |
- | 6.8525 | 0.0395 | 7.2492 | 0.0250 | 49 |
- | 6.7464 | 0.0357 | 7.1719 | 0.0190 | 50 |
- | 6.6459 | 0.0344 | 7.0909 | 0.0155 | 51 |
- | 6.5266 | 0.0293 | 7.0174 | 0.0083 | 52 |
- | 6.4225 | 0.0300 | 6.9445 | 0.0071 | 53 |
- | 6.3291 | 0.0280 | 6.8702 | 0.0060 | 54 |
- | 6.2112 | 0.0216 | 6.8056 | 0.0060 | 55 |
- | 6.1003 | 0.0239 | 6.7480 | 0.0071 | 56 |
- | 5.9894 | 0.0223 | 6.6831 | 0.0048 | 57 |
- | 5.9002 | 0.0219 | 6.6178 | 0.0048 | 58 |
- | 5.7988 | 0.0205 | 6.5591 | 0.0048 | 59 |
- | 5.7025 | 0.0175 | 6.4957 | 0.0060 | 60 |
- | 5.6062 | 0.0238 | 6.4483 | 0.0083 | 61 |
- | 5.4956 | 0.0204 | 6.4248 | 0.0095 | 62 |
- | 5.4109 | 0.0205 | 6.3885 | 0.0048 | 63 |
- | 5.3245 | 0.0204 | 6.3385 | 0.0048 | 64 |
- | 5.2150 | 0.0217 | 6.3269 | 0.0036 | 65 |
- | 5.1480 | 0.0183 | 6.2668 | 0.0060 | 66 |
- | 5.0382 | 0.0212 | 6.2572 | 0.0060 | 67 |
- | 4.9511 | 0.0212 | 6.2532 | 0.0036 | 68 |
- | 4.8594 | 0.0179 | 6.1685 | 0.0048 | 69 |
- | 4.7745 | 0.0194 | 6.1300 | 0.0036 | 70 |
- | 4.6754 | 0.0181 | 6.1305 | 0.0012 | 71 |
- | 4.6001 | 0.0164 | 6.0840 | 0.0024 | 72 |
- | 4.5066 | 0.0192 | 6.1035 | 0.0 | 73 |
- | 4.4232 | 0.0171 | 6.0793 | 0.0 | 74 |
- | 4.3369 | 0.0159 | 6.0368 | 0.0 | 75 |
- | 4.2634 | 0.0151 | 6.0093 | 0.0012 | 76 |
- | 4.1771 | 0.0156 | 6.0330 | 0.0012 | 77 |
- | 4.1148 | 0.0171 | 6.0128 | 0.0 | 78 |
- | 4.0189 | 0.0158 | 5.9698 | 0.0 | 79 |
- | 3.9439 | 0.0162 | 5.9701 | 0.0024 | 80 |
- | 3.8643 | 0.0183 | 5.9446 | 0.0012 | 81 |
- | 3.7742 | 0.0145 | 5.9148 | 0.0012 | 82 |
- | 3.6931 | 0.0164 | 5.8891 | 0.0 | 83 |
- | 3.6257 | 0.0155 | 5.8629 | 0.0 | 84 |
- | 3.5501 | 0.0135 | 5.8868 | 0.0 | 85 |
- | 3.4736 | 0.0160 | 5.8476 | 0.0036 | 86 |
- | 3.4140 | 0.0152 | 5.8242 | 0.0012 | 87 |
- | 3.3366 | 0.0163 | 5.8312 | 0.0 | 88 |
- | 3.2676 | 0.0166 | 5.8227 | 0.0012 | 89 |
- | 3.1893 | 0.0144 | 5.7875 | 0.0012 | 90 |
- | 3.1314 | 0.0147 | 5.7660 | 0.0012 | 91 |
- | 3.0667 | 0.0143 | 5.7545 | 0.0 | 92 |
- | 2.9981 | 0.0140 | 5.7507 | 0.0024 | 93 |
- | 2.9351 | 0.0135 | 5.7543 | 0.0012 | 94 |
- | 2.8627 | 0.0133 | 5.7375 | 0.0048 | 95 |
- | 2.7816 | 0.0158 | 5.7470 | 0.0036 | 96 |
- | 2.7239 | 0.0154 | 5.7121 | 0.0036 | 97 |
- | 2.6559 | 0.0145 | 5.7199 | 0.0036 | 98 |
- | 2.6097 | 0.0148 | 5.6699 | 0.0024 | 99 |
+ | 2.5928 | 0.0171 | 5.5430 | 0.0048 | 0 |
 
 
  ### Framework versions
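The optimizer entry above serializes an AdamWeightDecay optimizer whose learning rate follows a WarmUp wrapper around a PolynomialDecay schedule: linear warmup to 5e-05 over 1,000 steps, then polynomial decay (here linear, power 1.0) toward 0.0. The schedule math can be sketched in plain Python; this mirrors the schedule classes named in the config, not the exact Keras implementation, and since the stored decay_steps of -994 is negative (apparently num_train_steps minus warmup_steps from a shorter run), a positive placeholder is used for it:

```python
def lr_at_step(step, initial_lr=5e-05, warmup_steps=1000,
               decay_steps=1000, end_lr=0.0, power=1.0):
    """Return the learning rate at a given global training step."""
    if step < warmup_steps:
        # WarmUp phase: scale linearly from 0 up to initial_lr (power=1.0).
        return initial_lr * (step / warmup_steps) ** power
    # PolynomialDecay phase: move from initial_lr to end_lr over decay_steps.
    progress = min((step - warmup_steps) / decay_steps, 1.0)
    return (initial_lr - end_lr) * (1 - progress) ** power + end_lr

print(lr_at_step(0))     # 0.0 -- start of warmup
print(lr_at_step(500))   # 2.5e-05 -- halfway through warmup
print(lr_at_step(1000))  # 5e-05 -- peak, decay begins
```

After the peak at step 1,000 the rate falls linearly and stays at end_lr once decay_steps have elapsed.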
config.json CHANGED
@@ -1,5 +1,5 @@
  {
- "_name_or_path": "gpt2",
+ "_name_or_path": "amanneo/mail-generator-mini-v2",
  "activation_function": "gelu_new",
  "architectures": [
  "GPT2LMHeadModel"
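The only change to config.json is `_name_or_path`: after the first in-progress checkpoint is pushed, the field records the repository the config was last loaded from (amanneo/mail-generator-mini-v2) rather than the original gpt2 identifier. Reading the field needs only the standard library; the JSON below is a hypothetical minimal excerpt, not the full file:

```python
import json

# Hypothetical minimal fragment of the updated config.json shown above.
config_text = '''{
  "_name_or_path": "amanneo/mail-generator-mini-v2",
  "activation_function": "gelu_new",
  "architectures": ["GPT2LMHeadModel"]
}'''

config = json.loads(config_text)
print(config["_name_or_path"])   # amanneo/mail-generator-mini-v2
print(config["architectures"])   # ['GPT2LMHeadModel']
```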
tf_model.h5 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:3e3b4caddce0bdfc953007d892e1405e9b5a9081556526a9018abd760e884684
+ oid sha256:2615973d922e453e083596c559b8a617c20342e8f26faaf2ef4874af4ea874c5
  size 497935464
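tf_model.h5 is stored via Git LFS, so the diff touches only the three-line pointer file: the sha256 oid changes while the size stays at 497935464 bytes, meaning new weights of identical byte length replaced the old ones. The pointer format is plain key-value lines and can be parsed directly (a sketch, not the git-lfs implementation):

```python
# The updated LFS pointer file, exactly as shown in the diff above.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:2615973d922e453e083596c559b8a617c20342e8f26faaf2ef4874af4ea874c5
size 497935464
"""

def parse_lfs_pointer(text):
    """Split each 'key value' line of an LFS pointer into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, value = line.split(" ", 1)
        fields[key] = value
    return fields

info = parse_lfs_pointer(pointer)
print(info["oid"])        # sha256:2615973d...
print(int(info["size"]))  # 497935464
```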