---
tags:
- generated_from_keras_callback
model-index:
- name: amanneo/mail-generator-mini-v2
  results: []
---
# amanneo/mail-generator-mini-v2
This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set:
- Train Loss: 0.5212
- Train Accuracy: 0.0027
- Validation Loss: 5.5781
- Validation Accuracy: 0.0
- Epoch: 99
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'inner_optimizer': {'class_name': 'AdamWeightDecay', 'config': {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'WarmUp', 'config': {'initial_learning_rate': 5e-05, 'decay_schedule_fn': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 5e-05, 'decay_steps': -994, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'passive_serialization': True}, 'warmup_steps': 1000, 'power': 1.0, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}}, 'dynamic': True, 'initial_scale': 32768.0, 'dynamic_growth_steps': 2000}
- training_precision: mixed_float16
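
The serialized optimizer config above describes a linear warmup to a peak learning rate of 5e-05 over 1,000 steps, followed by a polynomial decay (power 1.0, i.e. linear) to 0. Note that `decay_steps` is stored as -994, which suggests the computed total number of training steps was smaller than the warmup length. A plain-Python approximation of the intended schedule shape is sketched below; it uses a positive placeholder `decay_steps` and is not the exact `transformers` `WarmUp`/`PolynomialDecay` implementation:

```python
def lr_at_step(step, initial_lr=5e-05, warmup_steps=1000,
               decay_steps=1000, end_lr=0.0, power=1.0):
    """Approximate warmup + polynomial-decay schedule from the config above.

    `decay_steps=1000` is a placeholder: the serialized config stores -994,
    hinting that the training run was shorter than the warmup phase.
    """
    if step < warmup_steps:
        # Linear warmup from 0 up to the peak learning rate.
        return initial_lr * step / warmup_steps
    # Polynomial decay (power=1.0 makes it linear) down to end_lr.
    progress = min((step - warmup_steps) / decay_steps, 1.0)
    return (initial_lr - end_lr) * (1.0 - progress) ** power + end_lr
```

With these defaults the rate rises to 5e-05 at step 1,000 and reaches 0 at step 2,000.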
### Training results

| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:---:|:---:|:---:|:---:|:---:|
2.5928 | 0.0171 | 5.5430 | 0.0048 | 0 |
2.6003 | 0.0207 | 5.5430 | 0.0048 | 1 |
2.5954 | 0.0171 | 5.5508 | 0.0048 | 2 |
2.5775 | 0.0190 | 5.5508 | 0.0024 | 3 |
2.5758 | 0.0231 | 5.5508 | 0.0024 | 4 |
2.5742 | 0.0207 | 5.5586 | 0.0048 | 5 |
2.5547 | 0.0209 | 5.5586 | 0.0048 | 6 |
2.5566 | 0.0188 | 5.5586 | 0.0048 | 7 |
2.5391 | 0.0193 | 5.5586 | 0.0048 | 8 |
2.5378 | 0.0215 | 5.5508 | 0.0048 | 9 |
2.5238 | 0.0188 | 5.5469 | 0.0048 | 10 |
2.5150 | 0.0160 | 5.5508 | 0.0048 | 11 |
2.4967 | 0.0174 | 5.5508 | 0.0071 | 12 |
2.4691 | 0.0193 | 5.5430 | 0.0071 | 13 |
2.4626 | 0.0163 | 5.5430 | 0.0071 | 14 |
2.4417 | 0.0231 | 5.5352 | 0.0048 | 15 |
2.4323 | 0.0215 | 5.5352 | 0.0048 | 16 |
2.4193 | 0.0226 | 5.5469 | 0.0048 | 17 |
2.4170 | 0.0185 | 5.5469 | 0.0048 | 18 |
2.3743 | 0.0193 | 5.5312 | 0.0048 | 19 |
2.3730 | 0.0207 | 5.5312 | 0.0048 | 20 |
2.3535 | 0.0198 | 5.5312 | 0.0048 | 21 |
2.3372 | 0.0182 | 5.5312 | 0.0071 | 22 |
2.3324 | 0.0177 | 5.5312 | 0.0048 | 23 |
2.3011 | 0.0204 | 5.5195 | 0.0048 | 24 |
2.2650 | 0.0212 | 5.5117 | 0.0048 | 25 |
2.2568 | 0.0198 | 5.5078 | 0.0048 | 26 |
2.2331 | 0.0196 | 5.5156 | 0.0048 | 27 |
2.2021 | 0.0193 | 5.5078 | 0.0048 | 28 |
2.1807 | 0.0204 | 5.5039 | 0.0048 | 29 |
2.1691 | 0.0190 | 5.5 | 0.0 | 30 |
2.1463 | 0.0174 | 5.4766 | 0.0 | 31 |
2.1097 | 0.0196 | 5.4844 | 0.0 | 32 |
2.1014 | 0.0179 | 5.4844 | 0.0024 | 33 |
2.0833 | 0.0177 | 5.4844 | 0.0024 | 34 |
2.0423 | 0.0201 | 5.4844 | 0.0 | 35 |
2.0163 | 0.0198 | 5.4844 | 0.0 | 36 |
1.9909 | 0.0168 | 5.4883 | 0.0 | 37 |
1.9774 | 0.0207 | 5.4805 | 0.0 | 38 |
1.9414 | 0.0207 | 5.4844 | 0.0 | 39 |
1.9206 | 0.0215 | 5.4766 | 0.0 | 40 |
1.8849 | 0.0182 | 5.4805 | 0.0 | 41 |
1.8732 | 0.0193 | 5.4648 | 0.0 | 42 |
1.8460 | 0.0160 | 5.4609 | 0.0 | 43 |
1.8171 | 0.0168 | 5.4648 | 0.0 | 44 |
1.7791 | 0.0201 | 5.4531 | 0.0 | 45 |
1.7583 | 0.0158 | 5.4570 | 0.0 | 46 |
1.7360 | 0.0171 | 5.4570 | 0.0 | 47 |
1.7061 | 0.0120 | 5.4297 | 0.0 | 48 |
1.6802 | 0.0155 | 5.4258 | 0.0 | 49 |
1.6551 | 0.0182 | 5.4141 | 0.0 | 50 |
1.6289 | 0.0130 | 5.4219 | 0.0 | 51 |
1.5981 | 0.0130 | 5.3945 | 0.0 | 52 |
1.5656 | 0.0128 | 5.4297 | 0.0 | 53 |
1.5535 | 0.0168 | 5.4219 | 0.0 | 54 |
1.5184 | 0.0141 | 5.4102 | 0.0 | 55 |
1.4943 | 0.0149 | 5.4023 | 0.0 | 56 |
1.4616 | 0.0122 | 5.4062 | 0.0 | 57 |
1.4344 | 0.0111 | 5.4062 | 0.0 | 58 |
1.3965 | 0.0111 | 5.4141 | 0.0 | 59 |
1.3643 | 0.0122 | 5.4375 | 0.0 | 60 |
1.3309 | 0.0087 | 5.4453 | 0.0 | 61 |
1.3215 | 0.0090 | 5.4648 | 0.0 | 62 |
1.3058 | 0.0084 | 5.4727 | 0.0 | 63 |
1.2700 | 0.0109 | 5.4453 | 0.0 | 64 |
1.2396 | 0.0079 | 5.4609 | 0.0 | 65 |
1.2189 | 0.0092 | 5.4375 | 0.0 | 66 |
1.1855 | 0.0079 | 5.4375 | 0.0 | 67 |
1.1592 | 0.0073 | 5.4375 | 0.0 | 68 |
1.1219 | 0.0071 | 5.4648 | 0.0 | 69 |
1.1071 | 0.0065 | 5.4570 | 0.0 | 70 |
1.0848 | 0.0060 | 5.4375 | 0.0 | 71 |
1.0581 | 0.0076 | 5.4453 | 0.0 | 72 |
1.0316 | 0.0090 | 5.4570 | 0.0 | 73 |
1.0068 | 0.0063 | 5.4219 | 0.0 | 74 |
0.9832 | 0.0060 | 5.4570 | 0.0 | 75 |
0.9534 | 0.0046 | 5.4570 | 0.0 | 76 |
0.9378 | 0.0057 | 5.4648 | 0.0 | 77 |
0.9170 | 0.0033 | 5.4844 | 0.0 | 78 |
0.8941 | 0.0041 | 5.4883 | 0.0 | 79 |
0.8666 | 0.0030 | 5.4922 | 0.0 | 80 |
0.8419 | 0.0054 | 5.4375 | 0.0 | 81 |
0.8200 | 0.0035 | 5.4492 | 0.0 | 82 |
0.8020 | 0.0022 | 5.4648 | 0.0 | 83 |
0.7785 | 0.0057 | 5.4883 | 0.0 | 84 |
0.7607 | 0.0052 | 5.4648 | 0.0 | 85 |
0.7454 | 0.0041 | 5.5078 | 0.0 | 86 |
0.7208 | 0.0024 | 5.5078 | 0.0 | 87 |
0.7040 | 0.0027 | 5.5078 | 0.0 | 88 |
0.6799 | 0.0041 | 5.5156 | 0.0 | 89 |
0.6594 | 0.0030 | 5.5312 | 0.0 | 90 |
0.6397 | 0.0030 | 5.5312 | 0.0 | 91 |
0.6217 | 0.0030 | 5.5195 | 0.0 | 92 |
0.6112 | 0.0033 | 5.5195 | 0.0 | 93 |
0.5937 | 0.0046 | 5.5625 | 0.0 | 94 |
0.5745 | 0.0035 | 5.5625 | 0.0 | 95 |
0.5616 | 0.0027 | 5.5586 | 0.0 | 96 |
0.5468 | 0.0043 | 5.5742 | 0.0 | 97 |
0.5354 | 0.0027 | 5.5781 | 0.0 | 98 |
0.5212 | 0.0027 | 5.5781 | 0.0 | 99 |
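
In the table above, validation loss bottoms out around epoch 52 (5.3945) and drifts upward afterwards while training loss keeps falling, the usual signature of overfitting; a best-checkpoint or early-stopping criterion on validation loss may be worth considering when retraining. A minimal sketch of picking the best epoch from such a history (the values below are illustrative samples taken from the table):

```python
def best_epoch(val_losses):
    # Index (epoch) of the smallest validation loss in the history;
    # ties resolve to the earliest epoch.
    return min(range(len(val_losses)), key=val_losses.__getitem__)

# A few validation-loss samples from the table above: epochs 0, 52, and 99.
history = {0: 5.5430, 52: 5.3945, 99: 5.5781}
epochs, losses = list(history), list(history.values())
print("best epoch:", epochs[best_epoch(losses)])  # → best epoch: 52
```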
### Framework versions
- Transformers 4.23.1
- TensorFlow 2.9.2
- Datasets 2.6.1
- Tokenizers 0.13.1