---
license: apache-2.0
base_model: bedus-creation/mBart-small-dataset-i-eng-lim
tags:
- generated_from_keras_callback
model-index:
- name: bedus-creation/mBart-small-dataset-ii-eng-lim-004
  results: []
---

# bedus-creation/mBart-small-dataset-ii-eng-lim-004

This model is a fine-tuned version of [bedus-creation/mBart-small-dataset-i-eng-lim](https://huggingface.co/bedus-creation/mBart-small-dataset-i-eng-lim) on an unknown dataset.
It achieves the following results after the final training epoch (epoch 41):
- Train Loss: 0.2731
- Validation Loss: 0.2825
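
A minimal usage sketch (not part of the auto-generated card): it assumes the checkpoint loads through the standard `transformers` auto classes with TensorFlow weights, and that the English–"lim" direction implied by the model name is the intended one.

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "bedus-creation/mBart-small-dataset-ii-eng-lim-004"

# Load tokenizer and TF checkpoint from the Hub.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

# Translate a single English sentence (illustrative input, not from the card).
inputs = tokenizer("Hello, how are you?", return_tensors="tf")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```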

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-04, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
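
The optimizer dict above likely corresponds to the `AdamWeightDecay` class that `transformers` ships for TensorFlow; a sketch of rebuilding it with the reported values:

```python
from transformers import AdamWeightDecay

# Mirrors the hyperparameter dict reported above.
optimizer = AdamWeightDecay(
    learning_rate=1e-4,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-7,
)
```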

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 0.9940     | 0.4653          | 0     |
| 0.4659     | 0.3647          | 1     |
| 0.4011     | 0.3331          | 2     |
| 0.3798     | 0.3284          | 3     |
| 0.3640     | 0.3210          | 4     |
| 0.3539     | 0.3087          | 5     |
| 0.3456     | 0.3106          | 6     |
| 0.3377     | 0.3049          | 7     |
| 0.3340     | 0.2998          | 8     |
| 0.3285     | 0.2974          | 9     |
| 0.3246     | 0.2980          | 10    |
| 0.3202     | 0.2950          | 11    |
| 0.3174     | 0.2910          | 12    |
| 0.3154     | 0.2932          | 13    |
| 0.3124     | 0.2882          | 14    |
| 0.3094     | 0.2895          | 15    |
| 0.3092     | 0.2880          | 16    |
| 0.3073     | 0.2861          | 17    |
| 0.3043     | 0.2842          | 18    |
| 0.3037     | 0.2856          | 19    |
| 0.3009     | 0.2834          | 20    |
| 0.2999     | 0.2859          | 21    |
| 0.2983     | 0.2836          | 22    |
| 0.2973     | 0.2809          | 23    |
| 0.2952     | 0.2825          | 24    |
| 0.2942     | 0.2809          | 25    |
| 0.2933     | 0.2792          | 26    |
| 0.2914     | 0.2813          | 27    |
| 0.2898     | 0.2817          | 28    |
| 0.2884     | 0.2794          | 29    |
| 0.2866     | 0.2797          | 30    |
| 0.2853     | 0.2797          | 31    |
| 0.2849     | 0.2844          | 32    |
| 0.2835     | 0.2798          | 33    |
| 0.2821     | 0.2803          | 34    |
| 0.2823     | 0.2828          | 35    |
| 0.2798     | 0.2796          | 36    |
| 0.2797     | 0.2788          | 37    |
| 0.2766     | 0.2811          | 38    |
| 0.2765     | 0.2800          | 39    |
| 0.2747     | 0.2852          | 40    |
| 0.2731     | 0.2825          | 41    |

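The per-epoch figures above are the kind of history a plain Keras `fit` loop produces. A minimal reconstruction sketch, assuming tokenized `tf.data` datasets named `train_set` and `validation_set` (placeholders, not defined in this card):

```python
from transformers import AdamWeightDecay, TFAutoModelForSeq2SeqLM

# Start from the base checkpoint named in the card header.
model = TFAutoModelForSeq2SeqLM.from_pretrained(
    "bedus-creation/mBart-small-dataset-i-eng-lim"
)

# No explicit loss: transformers TF models fall back to their internal loss.
model.compile(optimizer=AdamWeightDecay(learning_rate=1e-4, weight_decay_rate=0.01))

history = model.fit(
    train_set,                        # placeholder: tokenized training pairs
    validation_data=validation_set,   # placeholder: tokenized validation pairs
    epochs=42,                        # epochs 0-41 in the table above
)
```
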

### Framework versions

- Transformers 4.33.3
- TensorFlow 2.13.0
- Datasets 2.14.5
- Tokenizers 0.13.3