---
license: apache-2.0
base_model: bedus-creation/mBart-small-dataset-i-eng-lim
tags:
- generated_from_keras_callback
model-index:
- name: bedus-creation/mBart-small-dataset-ii-eng-lim-004
results: []
---
# bedus-creation/mBart-small-dataset-ii-eng-lim-004
This model is a fine-tuned version of [bedus-creation/mBart-small-dataset-i-eng-lim](https://huggingface.co/bedus-creation/mBart-small-dataset-i-eng-lim) on an unknown dataset.
It achieves the following results at the final training epoch:
- Train Loss: 0.1507
- Validation Loss: 0.4808
- Epoch: 149
## Model description
More information needed
## Intended uses & limitations
More information needed
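The repository name suggests an English-to-Limbu ("eng-lim") translation model, though the card does not confirm the direction. A minimal inference sketch, assuming the checkpoint loads with the standard seq2seq auto classes (the example sentence and `max_new_tokens` value are illustrative):

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

MODEL_ID = "bedus-creation/mBart-small-dataset-ii-eng-lim-004"

def translate(text: str, model_id: str = MODEL_ID, max_new_tokens: int = 64) -> str:
    """Generate a translation with the fine-tuned checkpoint."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)
    inputs = tokenizer(text, return_tensors="tf")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Downloads the checkpoint from the Hub on first use.
    print(translate("Hello, how are you?"))
```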
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: `AdamWeightDecay` (`learning_rate`: 1e-04, `decay`: 0.0, `beta_1`: 0.9, `beta_2`: 0.999, `epsilon`: 1e-07, `amsgrad`: False, `weight_decay_rate`: 0.01)
- training_precision: float32
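The logged optimizer config can be reconstructed with the `AdamWeightDecay` class that Transformers provides for TensorFlow training. A sketch (the actual training script is not included in this card):

```python
from transformers import AdamWeightDecay

# Rebuild the optimizer from the hyperparameters logged above.
optimizer = AdamWeightDecay(
    learning_rate=1e-4,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-7,
    amsgrad=False,
    weight_decay_rate=0.01,
)
config = optimizer.get_config()
```

Note that `AdamWeightDecay` decouples weight decay from the gradient update (as in AdamW), rather than folding it into the loss as L2 regularization.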
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 0.9940 | 0.4653 | 0 |
| 0.4659 | 0.3647 | 1 |
| 0.4011 | 0.3331 | 2 |
| 0.3798 | 0.3284 | 3 |
| 0.3640 | 0.3210 | 4 |
| 0.3539 | 0.3087 | 5 |
| 0.3456 | 0.3106 | 6 |
| 0.3377 | 0.3049 | 7 |
| 0.3340 | 0.2998 | 8 |
| 0.3285 | 0.2974 | 9 |
| 0.3246 | 0.2980 | 10 |
| 0.3202 | 0.2950 | 11 |
| 0.3174 | 0.2910 | 12 |
| 0.3154 | 0.2932 | 13 |
| 0.3124 | 0.2882 | 14 |
| 0.3094 | 0.2895 | 15 |
| 0.3092 | 0.2880 | 16 |
| 0.3073 | 0.2861 | 17 |
| 0.3043 | 0.2842 | 18 |
| 0.3037 | 0.2856 | 19 |
| 0.3009 | 0.2834 | 20 |
| 0.2999 | 0.2859 | 21 |
| 0.2983 | 0.2836 | 22 |
| 0.2973 | 0.2809 | 23 |
| 0.2952 | 0.2825 | 24 |
| 0.2942 | 0.2809 | 25 |
| 0.2933 | 0.2792 | 26 |
| 0.2914 | 0.2813 | 27 |
| 0.2898 | 0.2817 | 28 |
| 0.2884 | 0.2794 | 29 |
| 0.2866 | 0.2797 | 30 |
| 0.2853 | 0.2797 | 31 |
| 0.2849 | 0.2844 | 32 |
| 0.2835 | 0.2798 | 33 |
| 0.2821 | 0.2803 | 34 |
| 0.2823 | 0.2828 | 35 |
| 0.2798 | 0.2796 | 36 |
| 0.2797 | 0.2788 | 37 |
| 0.2766 | 0.2811 | 38 |
| 0.2765 | 0.2800 | 39 |
| 0.2747 | 0.2852 | 40 |
| 0.2731 | 0.2825 | 41 |
| 0.2720 | 0.2841 | 42 |
| 0.2709 | 0.2855 | 43 |
| 0.2693 | 0.2843 | 44 |
| 0.2678 | 0.2863 | 45 |
| 0.2667 | 0.2912 | 46 |
| 0.2645 | 0.2863 | 47 |
| 0.2633 | 0.2862 | 48 |
| 0.2618 | 0.2881 | 49 |
| 0.2607 | 0.2890 | 50 |
| 0.2585 | 0.2928 | 51 |
| 0.2585 | 0.2903 | 52 |
| 0.2562 | 0.2904 | 53 |
| 0.2545 | 0.2902 | 54 |
| 0.2541 | 0.2937 | 55 |
| 0.2528 | 0.2930 | 56 |
| 0.2512 | 0.3014 | 57 |
| 0.2484 | 0.2979 | 58 |
| 0.2478 | 0.3002 | 59 |
| 0.2460 | 0.3034 | 60 |
| 0.2449 | 0.3000 | 61 |
| 0.2442 | 0.3010 | 62 |
| 0.2418 | 0.3054 | 63 |
| 0.2399 | 0.3046 | 64 |
| 0.2395 | 0.3072 | 65 |
| 0.2374 | 0.3117 | 66 |
| 0.2368 | 0.3081 | 67 |
| 0.2351 | 0.3149 | 68 |
| 0.2334 | 0.3155 | 69 |
| 0.2335 | 0.3123 | 70 |
| 0.2310 | 0.3193 | 71 |
| 0.2296 | 0.3169 | 72 |
| 0.2277 | 0.3220 | 73 |
| 0.2275 | 0.3200 | 74 |
| 0.2248 | 0.3223 | 75 |
| 0.2253 | 0.3235 | 76 |
| 0.2224 | 0.3266 | 77 |
| 0.2225 | 0.3289 | 78 |
| 0.2201 | 0.3288 | 79 |
| 0.2188 | 0.3330 | 80 |
| 0.2158 | 0.3389 | 81 |
| 0.2157 | 0.3379 | 82 |
| 0.2145 | 0.3447 | 83 |
| 0.2135 | 0.3436 | 84 |
| 0.2128 | 0.3525 | 85 |
| 0.2116 | 0.3464 | 86 |
| 0.2104 | 0.3494 | 87 |
| 0.2081 | 0.3540 | 88 |
| 0.2071 | 0.3561 | 89 |
| 0.2059 | 0.3598 | 90 |
| 0.2043 | 0.3608 | 91 |
| 0.2032 | 0.3721 | 92 |
| 0.2027 | 0.3668 | 93 |
| 0.2022 | 0.3608 | 94 |
| 0.2012 | 0.3675 | 95 |
| 0.1997 | 0.3695 | 96 |
| 0.1974 | 0.3703 | 97 |
| 0.1953 | 0.3704 | 98 |
| 0.1961 | 0.3744 | 99 |
| 0.1949 | 0.3669 | 100 |
| 0.1948 | 0.3772 | 101 |
| 0.1922 | 0.3772 | 102 |
| 0.1906 | 0.3775 | 103 |
| 0.1904 | 0.3803 | 104 |
| 0.1901 | 0.3873 | 105 |
| 0.1881 | 0.3880 | 106 |
| 0.1868 | 0.3921 | 107 |
| 0.1867 | 0.3933 | 108 |
| 0.1848 | 0.3928 | 109 |
| 0.1848 | 0.3894 | 110 |
| 0.1835 | 0.3983 | 111 |
| 0.1818 | 0.3985 | 112 |
| 0.1816 | 0.4025 | 113 |
| 0.1814 | 0.4023 | 114 |
| 0.1796 | 0.4089 | 115 |
| 0.1774 | 0.4137 | 116 |
| 0.1770 | 0.4162 | 117 |
| 0.1772 | 0.4145 | 118 |
| 0.1748 | 0.4173 | 119 |
| 0.1750 | 0.4226 | 120 |
| 0.1730 | 0.4262 | 121 |
| 0.1729 | 0.4208 | 122 |
| 0.1727 | 0.4161 | 123 |
| 0.1710 | 0.4221 | 124 |
| 0.1712 | 0.4267 | 125 |
| 0.1688 | 0.4319 | 126 |
| 0.1679 | 0.4339 | 127 |
| 0.1681 | 0.4388 | 128 |
| 0.1660 | 0.4455 | 129 |
| 0.1666 | 0.4419 | 130 |
| 0.1662 | 0.4351 | 131 |
| 0.1642 | 0.4405 | 132 |
| 0.1633 | 0.4486 | 133 |
| 0.1631 | 0.4483 | 134 |
| 0.1617 | 0.4470 | 135 |
| 0.1608 | 0.4542 | 136 |
| 0.1591 | 0.4589 | 137 |
| 0.1597 | 0.4482 | 138 |
| 0.1573 | 0.4584 | 139 |
| 0.1576 | 0.4552 | 140 |
| 0.1578 | 0.4612 | 141 |
| 0.1553 | 0.4602 | 142 |
| 0.1554 | 0.4616 | 143 |
| 0.1539 | 0.4653 | 144 |
| 0.1536 | 0.4658 | 145 |
| 0.1528 | 0.4671 | 146 |
| 0.1531 | 0.4758 | 147 |
| 0.1521 | 0.4708 | 148 |
| 0.1507 | 0.4808 | 149 |
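Validation loss reaches its minimum near epoch 26 (0.2792) and rises steadily afterward while training loss keeps falling, which indicates overfitting over the remaining epochs. If the model were retrained, a Keras early-stopping callback could halt training near that point and keep the best weights; a sketch (the `patience` value is an illustrative choice):

```python
import tensorflow as tf

# Stop when validation loss has not improved for 5 epochs,
# and roll back to the best-performing weights.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss",
    patience=5,
    restore_best_weights=True,
)
# model.fit(train_ds, validation_data=val_ds, epochs=150, callbacks=[early_stop])
```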
### Framework versions
- Transformers 4.33.3
- TensorFlow 2.13.0
- Datasets 2.14.5
- Tokenizers 0.13.3