---
license: apache-2.0
base_model: bedus-creation/mBart-small-dataset-ii-eng-lim-003
tags:
- generated_from_keras_callback
model-index:
- name: bedus-creation/mBart-small-dataset-ii-eng-lim-003
  results: []
---

# bedus-creation/mBart-small-dataset-ii-eng-lim-003

This model is a fine-tuned version of [bedus-creation/mBart-small-dataset-ii-eng-lim-003](https://huggingface.co/bedus-creation/mBart-small-dataset-ii-eng-lim-003) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.1207
- Validation Loss: 0.3391
- Epoch: 105
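
A minimal usage sketch (not part of the original card): loading the checkpoint with the TensorFlow classes from `transformers`. The translation direction (English to Limbu) is inferred from the `eng-lim` model name, and the input sentence and generation settings are illustrative only.

```python
# Hypothetical usage sketch: load the checkpoint with the TF auto classes
# from transformers and generate output for one sentence.
# Requires transformers and TensorFlow to be installed.
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "bedus-creation/mBart-small-dataset-ii-eng-lim-003"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

# Tokenize an example sentence and generate a translation.
inputs = tokenizer("Hello, how are you?", return_tensors="tf")
output_ids = model.generate(**inputs, max_new_tokens=64)
translation = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(translation)
```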

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: AdamWeightDecay
  - learning_rate: 1e-04
  - beta_1: 0.9
  - beta_2: 0.999
  - epsilon: 1e-07
  - weight_decay_rate: 0.01
  - decay: 0.0
  - amsgrad: False
- training_precision: float32
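
The optimizer configuration above can be reconstructed with the `AdamWeightDecay` implementation shipped in `transformers` (a sketch under the assumption that the same implementation was used in training):

```python
# Sketch: rebuild the training optimizer from the hyperparameters listed
# above, using transformers' TF AdamWeightDecay class.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-4,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-7,
    amsgrad=False,
    weight_decay_rate=0.01,
)
```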

### Training results

Training loss falls monotonically over the 106 epochs, while validation loss bottoms out around epoch 2 (0.2023) and climbs steadily thereafter, a sign of overfitting; an earlier checkpoint may generalize better than the final one.
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 0.2093     | 0.2072          | 0     |
| 0.2068     | 0.2056          | 1     |
| 0.2062     | 0.2023          | 2     |
| 0.2045     | 0.2054          | 3     |
| 0.2027     | 0.2188          | 4     |
| 0.2019     | 0.2067          | 5     |
| 0.1997     | 0.2056          | 6     |
| 0.1991     | 0.2074          | 7     |
| 0.1978     | 0.2024          | 8     |
| 0.1962     | 0.2067          | 9     |
| 0.1955     | 0.2074          | 10    |
| 0.1945     | 0.2089          | 11    |
| 0.1928     | 0.2168          | 12    |
| 0.1907     | 0.2201          | 13    |
| 0.1900     | 0.2102          | 14    |
| 0.1888     | 0.2130          | 15    |
| 0.1882     | 0.2211          | 16    |
| 0.1870     | 0.2117          | 17    |
| 0.1857     | 0.2134          | 18    |
| 0.1838     | 0.2147          | 19    |
| 0.1824     | 0.2187          | 20    |
| 0.1812     | 0.2224          | 21    |
| 0.1813     | 0.2249          | 22    |
| 0.1798     | 0.2200          | 23    |
| 0.1787     | 0.2273          | 24    |
| 0.1772     | 0.2263          | 25    |
| 0.1780     | 0.2273          | 26    |
| 0.1764     | 0.2270          | 27    |
| 0.1754     | 0.2245          | 28    |
| 0.1738     | 0.2260          | 29    |
| 0.1730     | 0.2327          | 30    |
| 0.1720     | 0.2300          | 31    |
| 0.1702     | 0.2347          | 32    |
| 0.1698     | 0.2396          | 33    |
| 0.1689     | 0.2340          | 34    |
| 0.1693     | 0.2345          | 35    |
| 0.1661     | 0.2424          | 36    |
| 0.1663     | 0.2388          | 37    |
| 0.1658     | 0.2436          | 38    |
| 0.1654     | 0.2506          | 39    |
| 0.1639     | 0.2406          | 40    |
| 0.1635     | 0.2524          | 41    |
| 0.1619     | 0.2379          | 42    |
| 0.1609     | 0.2449          | 43    |
| 0.1602     | 0.2466          | 44    |
| 0.1602     | 0.2537          | 45    |
| 0.1586     | 0.2457          | 46    |
| 0.1576     | 0.2589          | 47    |
| 0.1573     | 0.2547          | 48    |
| 0.1566     | 0.2532          | 49    |
| 0.1546     | 0.2565          | 50    |
| 0.1540     | 0.2544          | 51    |
| 0.1545     | 0.2637          | 52    |
| 0.1515     | 0.2580          | 53    |
| 0.1520     | 0.2654          | 54    |
| 0.1524     | 0.2650          | 55    |
| 0.1513     | 0.2701          | 56    |
| 0.1500     | 0.2767          | 57    |
| 0.1492     | 0.2646          | 58    |
| 0.1483     | 0.2696          | 59    |
| 0.1480     | 0.2729          | 60    |
| 0.1475     | 0.2709          | 61    |
| 0.1458     | 0.2757          | 62    |
| 0.1460     | 0.2778          | 63    |
| 0.1446     | 0.2775          | 64    |
| 0.1440     | 0.2727          | 65    |
| 0.1438     | 0.2862          | 66    |
| 0.1444     | 0.2719          | 67    |
| 0.1423     | 0.2827          | 68    |
| 0.1418     | 0.2830          | 69    |
| 0.1402     | 0.2787          | 70    |
| 0.1404     | 0.2799          | 71    |
| 0.1388     | 0.2857          | 72    |
| 0.1392     | 0.2889          | 73    |
| 0.1398     | 0.2868          | 74    |
| 0.1389     | 0.2920          | 75    |
| 0.1359     | 0.3010          | 76    |
| 0.1369     | 0.2873          | 77    |
| 0.1366     | 0.2921          | 78    |
| 0.1358     | 0.2895          | 79    |
| 0.1343     | 0.3071          | 80    |
| 0.1344     | 0.2981          | 81    |
| 0.1341     | 0.3033          | 82    |
| 0.1328     | 0.3008          | 83    |
| 0.1332     | 0.2933          | 84    |
| 0.1317     | 0.3155          | 85    |
| 0.1310     | 0.3091          | 86    |
| 0.1307     | 0.3205          | 87    |
| 0.1295     | 0.3142          | 88    |
| 0.1295     | 0.3141          | 89    |
| 0.1299     | 0.3103          | 90    |
| 0.1282     | 0.3209          | 91    |
| 0.1284     | 0.3167          | 92    |
| 0.1272     | 0.3242          | 93    |
| 0.1270     | 0.3159          | 94    |
| 0.1245     | 0.3275          | 95    |
| 0.1244     | 0.3218          | 96    |
| 0.1248     | 0.3270          | 97    |
| 0.1241     | 0.3354          | 98    |
| 0.1231     | 0.3430          | 99    |
| 0.1233     | 0.3318          | 100   |
| 0.1222     | 0.3387          | 101   |
| 0.1225     | 0.3367          | 102   |
| 0.1221     | 0.3501          | 103   |
| 0.1214     | 0.3370          | 104   |
| 0.1207     | 0.3391          | 105   |


### Framework versions

- Transformers 4.33.3
- TensorFlow 2.13.0
- Datasets 2.14.5
- Tokenizers 0.13.3