---
license: apache-2.0
base_model: bedus-creation/mBart-small-dataset-i-eng-lim
tags:
- generated_from_keras_callback
model-index:
- name: bedus-creation/mBart-small-dataset-ii-eng-lim-004
  results: []
---
# bedus-creation/mBart-small-dataset-ii-eng-lim-004
This model is a fine-tuned version of [bedus-creation/mBart-small-dataset-i-eng-lim](https://huggingface.co/bedus-creation/mBart-small-dataset-i-eng-lim) on an unknown dataset. It achieves the following results on the evaluation set:

- Train Loss: 0.2731
- Validation Loss: 0.2825
- Epoch: 41
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-04, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
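The `AdamWeightDecay` optimizer above applies *decoupled* weight decay (AdamW-style): the decay term is added to the parameter update directly rather than folded into the gradient. A plain-Python sketch of a single update step with the listed hyperparameters — illustrative only, not the TensorFlow implementation:

```python
# Hyperparameters from the optimizer config above.
lr, beta1, beta2, eps, wd = 1e-4, 0.9, 0.999, 1e-7, 0.01

def adamw_step(param, grad, m, v, t):
    # Update biased first- and second-moment estimates.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias correction for step t (1-indexed).
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Adam update plus decoupled weight decay applied directly to the parameter.
    param = param - lr * (m_hat / (v_hat ** 0.5 + eps) + wd * param)
    return param, m, v

# One step on a scalar "parameter" with a dummy gradient.
p, m, v = 1.0, 0.0, 0.0
p, m, v = adamw_step(p, 0.5, m, v, t=1)
```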
### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
0.9940 | 0.4653 | 0 |
0.4659 | 0.3647 | 1 |
0.4011 | 0.3331 | 2 |
0.3798 | 0.3284 | 3 |
0.3640 | 0.3210 | 4 |
0.3539 | 0.3087 | 5 |
0.3456 | 0.3106 | 6 |
0.3377 | 0.3049 | 7 |
0.3340 | 0.2998 | 8 |
0.3285 | 0.2974 | 9 |
0.3246 | 0.2980 | 10 |
0.3202 | 0.2950 | 11 |
0.3174 | 0.2910 | 12 |
0.3154 | 0.2932 | 13 |
0.3124 | 0.2882 | 14 |
0.3094 | 0.2895 | 15 |
0.3092 | 0.2880 | 16 |
0.3073 | 0.2861 | 17 |
0.3043 | 0.2842 | 18 |
0.3037 | 0.2856 | 19 |
0.3009 | 0.2834 | 20 |
0.2999 | 0.2859 | 21 |
0.2983 | 0.2836 | 22 |
0.2973 | 0.2809 | 23 |
0.2952 | 0.2825 | 24 |
0.2942 | 0.2809 | 25 |
0.2933 | 0.2792 | 26 |
0.2914 | 0.2813 | 27 |
0.2898 | 0.2817 | 28 |
0.2884 | 0.2794 | 29 |
0.2866 | 0.2797 | 30 |
0.2853 | 0.2797 | 31 |
0.2849 | 0.2844 | 32 |
0.2835 | 0.2798 | 33 |
0.2821 | 0.2803 | 34 |
0.2823 | 0.2828 | 35 |
0.2798 | 0.2796 | 36 |
0.2797 | 0.2788 | 37 |
0.2766 | 0.2811 | 38 |
0.2765 | 0.2800 | 39 |
0.2747 | 0.2852 | 40 |
0.2731 | 0.2825 | 41 |
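Note that the final epoch is not the best one by validation loss: the loss bottoms out a few epochs earlier and drifts up afterwards. A quick check (validation-loss values copied from the table above, epochs 0–41):

```python
# Validation losses per epoch, copied from the training-results table.
val_losses = [0.4653, 0.3647, 0.3331, 0.3284, 0.3210, 0.3087, 0.3106,
              0.3049, 0.2998, 0.2974, 0.2980, 0.2950, 0.2910, 0.2932,
              0.2882, 0.2895, 0.2880, 0.2861, 0.2842, 0.2856, 0.2834,
              0.2859, 0.2836, 0.2809, 0.2825, 0.2809, 0.2792, 0.2813,
              0.2817, 0.2794, 0.2797, 0.2797, 0.2844, 0.2798, 0.2803,
              0.2828, 0.2796, 0.2788, 0.2811, 0.2800, 0.2852, 0.2825]

# Epoch with the lowest validation loss.
best_epoch = min(range(len(val_losses)), key=val_losses.__getitem__)
print(best_epoch, val_losses[best_epoch])  # → epoch 37, val loss 0.2788
```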
### Framework versions

- Transformers 4.33.3
- TensorFlow 2.13.0
- Datasets 2.14.5
- Tokenizers 0.13.3