---
license: apache-2.0
base_model: bedus-creation/mBart-small-dataset-i-eng-lim
tags:
- generated_from_keras_callback
model-index:
- name: bedus-creation/mBart-small-dataset-ii-eng-lim-004
  results: []
---

# bedus-creation/mBart-small-dataset-ii-eng-lim-004

This model is a fine-tuned version of [bedus-creation/mBart-small-dataset-i-eng-lim](https://huggingface.co/bedus-creation/mBart-small-dataset-i-eng-lim) on an unknown dataset.
It achieves the following results at the final training epoch:
- Train Loss: 0.1507
- Validation Loss: 0.4808
- Epoch: 149

Note that validation loss bottoms out at 0.2792 around epoch 26 and climbs steadily afterwards while training loss keeps falling, so the final checkpoint likely overfits the training data; see the full table under Training results below.

## Model description

More information needed. Judging by the repository name (`eng-lim`) and the base checkpoint, this appears to be an mBart-small sequence-to-sequence model for English–Limbu translation, fine-tuned with Keras/TensorFlow.

## Intended uses & limitations

More information needed
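
Until the authors document intended uses, here is a minimal inference sketch. It assumes the checkpoint exposes the standard Hugging Face TF seq2seq API and that the model translates English input; the example sentence is made up.

```python
# Minimal inference sketch -- assumes the standard Hugging Face TF
# seq2seq API; the English example sentence is made up.
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "bedus-creation/mBart-small-dataset-ii-eng-lim-004"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("How are you?", return_tensors="tf")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```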

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: AdamWeightDecay (learning_rate: 1e-04, beta_1: 0.9, beta_2: 0.999, epsilon: 1e-07, amsgrad: False, weight_decay_rate: 0.01, decay: 0.0)
- training_precision: float32
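
The optimizer above can be re-created with the `AdamWeightDecay` class that ships with the Transformers TensorFlow utilities; the `model.compile` call is a sketch assuming `model` is the TF model loaded as in the inference example.

```python
# Re-create the training optimizer from the hyperparameters above.
# AdamWeightDecay is transformers' TF Adam variant with decoupled
# weight decay; `model` is assumed to be the TF model loaded earlier.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-4,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-7,
    weight_decay_rate=0.01,
)
# Transformers TF models compute their loss internally when compiled
# without an explicit loss argument.
model.compile(optimizer=optimizer)
```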

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 0.9940     | 0.4653          | 0     |
| 0.4659     | 0.3647          | 1     |
| 0.4011     | 0.3331          | 2     |
| 0.3798     | 0.3284          | 3     |
| 0.3640     | 0.3210          | 4     |
| 0.3539     | 0.3087          | 5     |
| 0.3456     | 0.3106          | 6     |
| 0.3377     | 0.3049          | 7     |
| 0.3340     | 0.2998          | 8     |
| 0.3285     | 0.2974          | 9     |
| 0.3246     | 0.2980          | 10    |
| 0.3202     | 0.2950          | 11    |
| 0.3174     | 0.2910          | 12    |
| 0.3154     | 0.2932          | 13    |
| 0.3124     | 0.2882          | 14    |
| 0.3094     | 0.2895          | 15    |
| 0.3092     | 0.2880          | 16    |
| 0.3073     | 0.2861          | 17    |
| 0.3043     | 0.2842          | 18    |
| 0.3037     | 0.2856          | 19    |
| 0.3009     | 0.2834          | 20    |
| 0.2999     | 0.2859          | 21    |
| 0.2983     | 0.2836          | 22    |
| 0.2973     | 0.2809          | 23    |
| 0.2952     | 0.2825          | 24    |
| 0.2942     | 0.2809          | 25    |
| 0.2933     | 0.2792          | 26    |
| 0.2914     | 0.2813          | 27    |
| 0.2898     | 0.2817          | 28    |
| 0.2884     | 0.2794          | 29    |
| 0.2866     | 0.2797          | 30    |
| 0.2853     | 0.2797          | 31    |
| 0.2849     | 0.2844          | 32    |
| 0.2835     | 0.2798          | 33    |
| 0.2821     | 0.2803          | 34    |
| 0.2823     | 0.2828          | 35    |
| 0.2798     | 0.2796          | 36    |
| 0.2797     | 0.2788          | 37    |
| 0.2766     | 0.2811          | 38    |
| 0.2765     | 0.2800          | 39    |
| 0.2747     | 0.2852          | 40    |
| 0.2731     | 0.2825          | 41    |
| 0.2720     | 0.2841          | 42    |
| 0.2709     | 0.2855          | 43    |
| 0.2693     | 0.2843          | 44    |
| 0.2678     | 0.2863          | 45    |
| 0.2667     | 0.2912          | 46    |
| 0.2645     | 0.2863          | 47    |
| 0.2633     | 0.2862          | 48    |
| 0.2618     | 0.2881          | 49    |
| 0.2607     | 0.2890          | 50    |
| 0.2585     | 0.2928          | 51    |
| 0.2585     | 0.2903          | 52    |
| 0.2562     | 0.2904          | 53    |
| 0.2545     | 0.2902          | 54    |
| 0.2541     | 0.2937          | 55    |
| 0.2528     | 0.2930          | 56    |
| 0.2512     | 0.3014          | 57    |
| 0.2484     | 0.2979          | 58    |
| 0.2478     | 0.3002          | 59    |
| 0.2460     | 0.3034          | 60    |
| 0.2449     | 0.3000          | 61    |
| 0.2442     | 0.3010          | 62    |
| 0.2418     | 0.3054          | 63    |
| 0.2399     | 0.3046          | 64    |
| 0.2395     | 0.3072          | 65    |
| 0.2374     | 0.3117          | 66    |
| 0.2368     | 0.3081          | 67    |
| 0.2351     | 0.3149          | 68    |
| 0.2334     | 0.3155          | 69    |
| 0.2335     | 0.3123          | 70    |
| 0.2310     | 0.3193          | 71    |
| 0.2296     | 0.3169          | 72    |
| 0.2277     | 0.3220          | 73    |
| 0.2275     | 0.3200          | 74    |
| 0.2248     | 0.3223          | 75    |
| 0.2253     | 0.3235          | 76    |
| 0.2224     | 0.3266          | 77    |
| 0.2225     | 0.3289          | 78    |
| 0.2201     | 0.3288          | 79    |
| 0.2188     | 0.3330          | 80    |
| 0.2158     | 0.3389          | 81    |
| 0.2157     | 0.3379          | 82    |
| 0.2145     | 0.3447          | 83    |
| 0.2135     | 0.3436          | 84    |
| 0.2128     | 0.3525          | 85    |
| 0.2116     | 0.3464          | 86    |
| 0.2104     | 0.3494          | 87    |
| 0.2081     | 0.3540          | 88    |
| 0.2071     | 0.3561          | 89    |
| 0.2059     | 0.3598          | 90    |
| 0.2043     | 0.3608          | 91    |
| 0.2032     | 0.3721          | 92    |
| 0.2027     | 0.3668          | 93    |
| 0.2022     | 0.3608          | 94    |
| 0.2012     | 0.3675          | 95    |
| 0.1997     | 0.3695          | 96    |
| 0.1974     | 0.3703          | 97    |
| 0.1953     | 0.3704          | 98    |
| 0.1961     | 0.3744          | 99    |
| 0.1949     | 0.3669          | 100   |
| 0.1948     | 0.3772          | 101   |
| 0.1922     | 0.3772          | 102   |
| 0.1906     | 0.3775          | 103   |
| 0.1904     | 0.3803          | 104   |
| 0.1901     | 0.3873          | 105   |
| 0.1881     | 0.3880          | 106   |
| 0.1868     | 0.3921          | 107   |
| 0.1867     | 0.3933          | 108   |
| 0.1848     | 0.3928          | 109   |
| 0.1848     | 0.3894          | 110   |
| 0.1835     | 0.3983          | 111   |
| 0.1818     | 0.3985          | 112   |
| 0.1816     | 0.4025          | 113   |
| 0.1814     | 0.4023          | 114   |
| 0.1796     | 0.4089          | 115   |
| 0.1774     | 0.4137          | 116   |
| 0.1770     | 0.4162          | 117   |
| 0.1772     | 0.4145          | 118   |
| 0.1748     | 0.4173          | 119   |
| 0.1750     | 0.4226          | 120   |
| 0.1730     | 0.4262          | 121   |
| 0.1729     | 0.4208          | 122   |
| 0.1727     | 0.4161          | 123   |
| 0.1710     | 0.4221          | 124   |
| 0.1712     | 0.4267          | 125   |
| 0.1688     | 0.4319          | 126   |
| 0.1679     | 0.4339          | 127   |
| 0.1681     | 0.4388          | 128   |
| 0.1660     | 0.4455          | 129   |
| 0.1666     | 0.4419          | 130   |
| 0.1662     | 0.4351          | 131   |
| 0.1642     | 0.4405          | 132   |
| 0.1633     | 0.4486          | 133   |
| 0.1631     | 0.4483          | 134   |
| 0.1617     | 0.4470          | 135   |
| 0.1608     | 0.4542          | 136   |
| 0.1591     | 0.4589          | 137   |
| 0.1597     | 0.4482          | 138   |
| 0.1573     | 0.4584          | 139   |
| 0.1576     | 0.4552          | 140   |
| 0.1578     | 0.4612          | 141   |
| 0.1553     | 0.4602          | 142   |
| 0.1554     | 0.4616          | 143   |
| 0.1539     | 0.4653          | 144   |
| 0.1536     | 0.4658          | 145   |
| 0.1528     | 0.4671          | 146   |
| 0.1531     | 0.4758          | 147   |
| 0.1521     | 0.4708          | 148   |
| 0.1507     | 0.4808          | 149   |


### Framework versions

- Transformers 4.33.3
- TensorFlow 2.13.0
- Datasets 2.14.5
- Tokenizers 0.13.3