---
license: apache-2.0
base_model: mBart
tags:
- generated_from_keras_callback
model-index:
- name: bedus-creation/t5-small-dataset-ii-eng-lim
results: []
---
# bedus-creation/t5-small-dataset-ii-eng-lim
This model is a fine-tuned version of [mBart](https://huggingface.co/mBart) on an unknown dataset.
It achieves the following results at the final training epoch:
- Train Loss: 6.0880
- Validation Loss: 6.2594
- Epoch: 99
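
A minimal inference sketch is shown below. It assumes the checkpoint loads with the standard Transformers TF seq2seq classes (`AutoTokenizer` / `TFAutoModelForSeq2SeqLM`); the example input and generation settings are placeholders, not part of the original card.

```python
# Minimal usage sketch (assumes the standard Transformers TF seq2seq API works for this checkpoint).
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "bedus-creation/t5-small-dataset-ii-eng-lim"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

# Placeholder input; replace with text matching the model's training data.
inputs = tokenizer("Hello, how are you?", return_tensors="tf")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```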
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
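
For reference, the optimizer configuration above can be reconstructed roughly as in the sketch below. This uses the `AdamWeightDecay` class shipped with Transformers' TensorFlow utilities and is an assumption about the setup, not the original training script.

```python
# Sketch: rebuild the optimizer from the hyperparameters listed above
# (assumes Transformers' TF AdamWeightDecay; the original training script is not published).
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)
```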
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 8.3860 | 7.8693 | 0 |
| 7.8568 | 7.6558 | 1 |
| 7.6900 | 7.5352 | 2 |
| 7.5904 | 7.4631 | 3 |
| 7.5155 | 7.4041 | 4 |
| 7.4554 | 7.3553 | 5 |
| 7.4005 | 7.3036 | 6 |
| 7.3547 | 7.2561 | 7 |
| 7.3104 | 7.2076 | 8 |
| 7.2651 | 7.1736 | 9 |
| 7.2302 | 7.1315 | 10 |
| 7.1888 | 7.0968 | 11 |
| 7.1616 | 7.0651 | 12 |
| 7.1290 | 7.0307 | 13 |
| 7.1066 | 7.0053 | 14 |
| 7.0729 | 6.9707 | 15 |
| 7.0388 | 6.9448 | 16 |
| 7.0169 | 6.9307 | 17 |
| 6.9924 | 6.9024 | 18 |
| 6.9716 | 6.8793 | 19 |
| 6.9503 | 6.8574 | 20 |
| 6.9252 | 6.8467 | 21 |
| 6.9136 | 6.8283 | 22 |
| 6.8915 | 6.8110 | 23 |
| 6.8697 | 6.7949 | 24 |
| 6.8531 | 6.7795 | 25 |
| 6.8336 | 6.7697 | 26 |
| 6.8255 | 6.7512 | 27 |
| 6.8080 | 6.7408 | 28 |
| 6.7928 | 6.7286 | 29 |
| 6.7752 | 6.7145 | 30 |
| 6.7629 | 6.7035 | 31 |
| 6.7467 | 6.6857 | 32 |
| 6.7329 | 6.6796 | 33 |
| 6.7216 | 6.6668 | 34 |
| 6.7067 | 6.6644 | 35 |
| 6.6935 | 6.6473 | 36 |
| 6.6810 | 6.6427 | 37 |
| 6.6713 | 6.6261 | 38 |
| 6.6551 | 6.6150 | 39 |
| 6.6422 | 6.6055 | 40 |
| 6.6346 | 6.5983 | 41 |
| 6.6254 | 6.5894 | 42 |
| 6.6066 | 6.5755 | 43 |
| 6.6023 | 6.5741 | 44 |
| 6.5900 | 6.5606 | 45 |
| 6.5781 | 6.5552 | 46 |
| 6.5597 | 6.5443 | 47 |
| 6.5578 | 6.5378 | 48 |
| 6.5426 | 6.5306 | 49 |
| 6.5304 | 6.5201 | 50 |
| 6.5179 | 6.5205 | 51 |
| 6.5142 | 6.5051 | 52 |
| 6.5010 | 6.4979 | 53 |
| 6.4840 | 6.5017 | 54 |
| 6.4787 | 6.4823 | 55 |
| 6.4734 | 6.4735 | 56 |
| 6.4619 | 6.4677 | 57 |
| 6.4496 | 6.4637 | 58 |
| 6.4344 | 6.4539 | 59 |
| 6.4290 | 6.4470 | 60 |
| 6.4159 | 6.4421 | 61 |
| 6.4069 | 6.4314 | 62 |
| 6.3964 | 6.4247 | 63 |
| 6.3887 | 6.4217 | 64 |
| 6.3783 | 6.4150 | 65 |
| 6.3670 | 6.4078 | 66 |
| 6.3593 | 6.3974 | 67 |
| 6.3500 | 6.3996 | 68 |
| 6.3359 | 6.3906 | 69 |
| 6.3358 | 6.3818 | 70 |
| 6.3298 | 6.3764 | 71 |
| 6.3158 | 6.3746 | 72 |
| 6.3026 | 6.3638 | 73 |
| 6.2904 | 6.3611 | 74 |
| 6.2861 | 6.3627 | 75 |
| 6.2820 | 6.3596 | 76 |
| 6.2658 | 6.3496 | 77 |
| 6.2554 | 6.3430 | 78 |
| 6.2552 | 6.3374 | 79 |
| 6.2468 | 6.3300 | 80 |
| 6.2316 | 6.3230 | 81 |
| 6.2314 | 6.3171 | 82 |
| 6.2198 | 6.3162 | 83 |
| 6.2084 | 6.3126 | 84 |
| 6.2020 | 6.3108 | 85 |
| 6.1906 | 6.3039 | 86 |
| 6.1851 | 6.2929 | 87 |
| 6.1749 | 6.2924 | 88 |
| 6.1678 | 6.3043 | 89 |
| 6.1576 | 6.2845 | 90 |
| 6.1566 | 6.2820 | 91 |
| 6.1454 | 6.2695 | 92 |
| 6.1351 | 6.2746 | 93 |
| 6.1313 | 6.2629 | 94 |
| 6.1211 | 6.2618 | 95 |
| 6.1107 | 6.2512 | 96 |
| 6.1092 | 6.2542 | 97 |
| 6.0974 | 6.2551 | 98 |
| 6.0880 | 6.2594 | 99 |
### Framework versions
- Transformers 4.33.2
- TensorFlow 2.13.0
- Datasets 2.14.5
- Tokenizers 0.13.3
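
A quick way to confirm a local environment matches the versions listed above (an assumed convenience check, not part of the original card):

```python
# Check installed library versions against those reported in this card.
import transformers, tensorflow as tf, datasets, tokenizers

print(transformers.__version__)  # expected: 4.33.2
print(tf.__version__)            # expected: 2.13.0
print(datasets.__version__)      # expected: 2.14.5
print(tokenizers.__version__)    # expected: 0.13.3
```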