---
license: apache-2.0
base_model: bedus-creation/t5-small-dataset-i-lim-to-eng
tags:
- generated_from_keras_callback
model-index:
- name: bedus-creation/t5-small-dataset-i-lim-to-eng-003
results: []
---
# bedus-creation/t5-small-dataset-i-lim-to-eng-003
This model is a fine-tuned version of [bedus-creation/t5-small-dataset-i-lim-to-eng](https://huggingface.co/bedus-creation/t5-small-dataset-i-lim-to-eng) on an unspecified dataset (the repository name suggests a Limbu-to-English parallel corpus, but this is not recorded in the card).
It achieves the following results at the final epoch:
- Train Loss: 0.0193
- Validation Loss: 0.2406
- Epoch: 140
## Model description
More information needed
## Intended uses & limitations
More information needed
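In the absence of documented task details, below is a minimal inference sketch. It assumes the task implied by the repository name (Limbu-to-English translation) and TensorFlow weights (implied by the Keras callback); any task prefix used during training is unknown, so none is added here:

```python
# Hedged usage sketch: the task (Limbu -> English translation) is inferred
# from the repository name and is not confirmed by the card.
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "bedus-creation/t5-small-dataset-i-lim-to-eng-003"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

# Replace with actual Limbu source text; no task prefix is added because
# the training setup does not document one.
inputs = tokenizer("<Limbu source sentence>", return_tensors="tf")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```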
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: AdamWeightDecay (learning_rate: 2e-05, decay: 0.0, beta_1: 0.9, beta_2: 0.999, epsilon: 1e-07, amsgrad: False, weight_decay_rate: 0.01)
- training_precision: float32
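The reported optimizer configuration can be reproduced with the `AdamWeightDecay` class that ships with Transformers; a minimal sketch, assuming a standard Keras `compile`/`fit` training loop (the card does not include the actual training script):

```python
# Minimal sketch: recreate the reported optimizer configuration.
# Batch size, steps per epoch, and the data pipeline are not reported
# in the card, so only the optimizer is shown.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=2e-5,     # constant rate, per the reported config (decay=0.0)
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-7,
)

# With a model loaded via TFAutoModelForSeq2SeqLM, training would follow
# the usual Keras pattern, e.g. model.compile(optimizer=optimizer).
```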
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 0.1690 | 0.1414 | 0 |
| 0.1533 | 0.1296 | 1 |
| 0.1523 | 0.1231 | 2 |
| 0.1481 | 0.1228 | 3 |
| 0.1450 | 0.1153 | 4 |
| 0.1393 | 0.1138 | 5 |
| 0.1336 | 0.1114 | 6 |
| 0.1311 | 0.1103 | 7 |
| 0.1267 | 0.1114 | 8 |
| 0.1251 | 0.1084 | 9 |
| 0.1263 | 0.1072 | 10 |
| 0.1202 | 0.1090 | 11 |
| 0.1126 | 0.1100 | 12 |
| 0.1136 | 0.1141 | 13 |
| 0.1056 | 0.1086 | 14 |
| 0.1067 | 0.1053 | 15 |
| 0.1095 | 0.1095 | 16 |
| 0.1031 | 0.1069 | 17 |
| 0.0960 | 0.1139 | 18 |
| 0.0976 | 0.1065 | 19 |
| 0.0964 | 0.1053 | 20 |
| 0.0929 | 0.1055 | 21 |
| 0.0912 | 0.1077 | 22 |
| 0.0933 | 0.1045 | 23 |
| 0.0887 | 0.1102 | 24 |
| 0.0884 | 0.1041 | 25 |
| 0.0868 | 0.1138 | 26 |
| 0.0891 | 0.1101 | 27 |
| 0.0858 | 0.1173 | 28 |
| 0.0824 | 0.1063 | 29 |
| 0.0837 | 0.1041 | 30 |
| 0.0784 | 0.1171 | 31 |
| 0.0785 | 0.1113 | 32 |
| 0.0820 | 0.1099 | 33 |
| 0.0740 | 0.1129 | 34 |
| 0.0735 | 0.1182 | 35 |
| 0.0745 | 0.1158 | 36 |
| 0.0751 | 0.1153 | 37 |
| 0.0729 | 0.1131 | 38 |
| 0.0693 | 0.1154 | 39 |
| 0.0661 | 0.1171 | 40 |
| 0.0633 | 0.1197 | 41 |
| 0.0689 | 0.1171 | 42 |
| 0.0593 | 0.1180 | 43 |
| 0.0657 | 0.1185 | 44 |
| 0.0576 | 0.1169 | 45 |
| 0.0596 | 0.1206 | 46 |
| 0.0599 | 0.1229 | 47 |
| 0.0597 | 0.1180 | 48 |
| 0.0507 | 0.1256 | 49 |
| 0.0577 | 0.1206 | 50 |
| 0.0530 | 0.1281 | 51 |
| 0.0538 | 0.1244 | 52 |
| 0.0498 | 0.1215 | 53 |
| 0.0487 | 0.1385 | 54 |
| 0.0470 | 0.1336 | 55 |
| 0.0478 | 0.1303 | 56 |
| 0.0472 | 0.1336 | 57 |
| 0.0445 | 0.1604 | 58 |
| 0.0488 | 0.1391 | 59 |
| 0.0463 | 0.1478 | 60 |
| 0.0467 | 0.1341 | 61 |
| 0.0403 | 0.1467 | 62 |
| 0.0370 | 0.1514 | 63 |
| 0.0438 | 0.1644 | 64 |
| 0.0502 | 0.1422 | 65 |
| 0.0386 | 0.1503 | 66 |
| 0.0370 | 0.1471 | 67 |
| 0.0400 | 0.1423 | 68 |
| 0.0388 | 0.1444 | 69 |
| 0.0357 | 0.1651 | 70 |
| 0.0307 | 0.1751 | 71 |
| 0.0306 | 0.1713 | 72 |
| 0.0285 | 0.1650 | 73 |
| 0.0317 | 0.1629 | 74 |
| 0.0367 | 0.1772 | 75 |
| 0.0341 | 0.1592 | 76 |
| 0.0330 | 0.1590 | 77 |
| 0.0287 | 0.1638 | 78 |
| 0.0319 | 0.1604 | 79 |
| 0.0256 | 0.1733 | 80 |
| 0.0267 | 0.1736 | 81 |
| 0.0271 | 0.1746 | 82 |
| 0.0264 | 0.1843 | 83 |
| 0.0271 | 0.1800 | 84 |
| 0.0292 | 0.1751 | 85 |
| 0.0283 | 0.1910 | 86 |
| 0.0258 | 0.1864 | 87 |
| 0.0228 | 0.1821 | 88 |
| 0.0253 | 0.1875 | 89 |
| 0.0211 | 0.1846 | 90 |
| 0.0210 | 0.1902 | 91 |
| 0.0288 | 0.1962 | 92 |
| 0.0196 | 0.2071 | 93 |
| 0.0207 | 0.2053 | 94 |
| 0.0184 | 0.2031 | 95 |
| 0.0200 | 0.2099 | 96 |
| 0.0235 | 0.2027 | 97 |
| 0.0183 | 0.2034 | 98 |
| 0.0268 | 0.2116 | 99 |
| 0.0180 | 0.2024 | 100 |
| 0.0205 | 0.2085 | 101 |
| 0.0203 | 0.2072 | 102 |
| 0.0186 | 0.2075 | 103 |
| 0.0189 | 0.2121 | 104 |
| 0.0199 | 0.2118 | 105 |
| 0.0190 | 0.2220 | 106 |
| 0.0182 | 0.2143 | 107 |
| 0.0136 | 0.2213 | 108 |
| 0.0202 | 0.2218 | 109 |
| 0.0151 | 0.2183 | 110 |
| 0.0135 | 0.2267 | 111 |
| 0.0133 | 0.2274 | 112 |
| 0.0183 | 0.2433 | 113 |
| 0.0169 | 0.2462 | 114 |
| 0.0156 | 0.2340 | 115 |
| 0.0160 | 0.2384 | 116 |
| 0.0149 | 0.2497 | 117 |
| 0.0131 | 0.2528 | 118 |
| 0.0207 | 0.2387 | 119 |
| 0.0133 | 0.2451 | 120 |
| 0.0143 | 0.2299 | 121 |
| 0.0176 | 0.2286 | 122 |
| 0.0125 | 0.2377 | 123 |
| 0.0128 | 0.2407 | 124 |
| 0.0157 | 0.2450 | 125 |
| 0.0121 | 0.2536 | 126 |
| 0.0139 | 0.2527 | 127 |
| 0.0141 | 0.2509 | 128 |
| 0.0093 | 0.2509 | 129 |
| 0.0151 | 0.2589 | 130 |
| 0.0114 | 0.2520 | 131 |
| 0.0126 | 0.2599 | 132 |
| 0.0109 | 0.2648 | 133 |
| 0.0106 | 0.2593 | 134 |
| 0.0118 | 0.2744 | 135 |
| 0.0192 | 0.2526 | 136 |
| 0.0129 | 0.2431 | 137 |
| 0.0110 | 0.2484 | 138 |
| 0.0137 | 0.2453 | 139 |
| 0.0193 | 0.2406 | 140 |
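Note the trend in the table: validation loss reaches its minimum of 0.1041 around epochs 25 and 30 and rises steadily thereafter while training loss keeps falling, so the final checkpoint (epoch 140) appears to be well past the point of overfitting; an earlier checkpoint may generalize better.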
### Framework versions
- Transformers 4.33.2
- TensorFlow 2.13.0
- Datasets 2.14.5
- Tokenizers 0.13.3