---
license: apache-2.0
base_model: mBart
tags:
- generated_from_keras_callback
model-index:
- name: bedus-creation/t5-small-dataset-ii-eng-lim
  results: []
---

# bedus-creation/t5-small-dataset-ii-eng-lim

This model is a fine-tuned version of [mBart](https://huggingface.co/mBart) on an unknown dataset.
It achieves the following results at the final training epoch:
- Train Loss: 6.0880
- Validation Loss: 6.2594
- Epoch: 99

## Model description

More information needed. The repository name suggests an English-to-Limbu (`eng-lim`) translation model fine-tuned from an mBart checkpoint, but the author has not documented this.

## Intended uses & limitations

More information needed
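
Although intended uses are undocumented, the checkpoint can be loaded for sequence-to-sequence generation. Below is a minimal inference sketch, assuming the weights load as a TensorFlow seq2seq model (the card was generated from a Keras callback); the example input sentence is purely illustrative.

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "bedus-creation/t5-small-dataset-ii-eng-lim"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

# Hypothetical example input; the training data and language pair
# are not documented in this card.
inputs = tokenizer("Good morning.", return_tensors="tf")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Note that the final validation loss (~6.26) is high, so generations from this checkpoint may be of limited quality.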

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
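
For reference, the sketch below shows how the logged optimizer settings map onto `AdamWeightDecay` from `transformers` for TensorFlow. The commented `compile`/`fit` calls are illustrative assumptions; batch size, datasets, and callbacks were not recorded in this card.

```python
from transformers import AdamWeightDecay

# Mirrors the hyperparameters logged above; everything else about the
# training loop is undocumented.
optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)
# model.compile(optimizer=optimizer)  # HF TF models compute their loss internally
# model.fit(train_dataset, validation_data=val_dataset, epochs=100)
```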

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 8.3860     | 7.8693          | 0     |
| 7.8568     | 7.6558          | 1     |
| 7.6900     | 7.5352          | 2     |
| 7.5904     | 7.4631          | 3     |
| 7.5155     | 7.4041          | 4     |
| 7.4554     | 7.3553          | 5     |
| 7.4005     | 7.3036          | 6     |
| 7.3547     | 7.2561          | 7     |
| 7.3104     | 7.2076          | 8     |
| 7.2651     | 7.1736          | 9     |
| 7.2302     | 7.1315          | 10    |
| 7.1888     | 7.0968          | 11    |
| 7.1616     | 7.0651          | 12    |
| 7.1290     | 7.0307          | 13    |
| 7.1066     | 7.0053          | 14    |
| 7.0729     | 6.9707          | 15    |
| 7.0388     | 6.9448          | 16    |
| 7.0169     | 6.9307          | 17    |
| 6.9924     | 6.9024          | 18    |
| 6.9716     | 6.8793          | 19    |
| 6.9503     | 6.8574          | 20    |
| 6.9252     | 6.8467          | 21    |
| 6.9136     | 6.8283          | 22    |
| 6.8915     | 6.8110          | 23    |
| 6.8697     | 6.7949          | 24    |
| 6.8531     | 6.7795          | 25    |
| 6.8336     | 6.7697          | 26    |
| 6.8255     | 6.7512          | 27    |
| 6.8080     | 6.7408          | 28    |
| 6.7928     | 6.7286          | 29    |
| 6.7752     | 6.7145          | 30    |
| 6.7629     | 6.7035          | 31    |
| 6.7467     | 6.6857          | 32    |
| 6.7329     | 6.6796          | 33    |
| 6.7216     | 6.6668          | 34    |
| 6.7067     | 6.6644          | 35    |
| 6.6935     | 6.6473          | 36    |
| 6.6810     | 6.6427          | 37    |
| 6.6713     | 6.6261          | 38    |
| 6.6551     | 6.6150          | 39    |
| 6.6422     | 6.6055          | 40    |
| 6.6346     | 6.5983          | 41    |
| 6.6254     | 6.5894          | 42    |
| 6.6066     | 6.5755          | 43    |
| 6.6023     | 6.5741          | 44    |
| 6.5900     | 6.5606          | 45    |
| 6.5781     | 6.5552          | 46    |
| 6.5597     | 6.5443          | 47    |
| 6.5578     | 6.5378          | 48    |
| 6.5426     | 6.5306          | 49    |
| 6.5304     | 6.5201          | 50    |
| 6.5179     | 6.5205          | 51    |
| 6.5142     | 6.5051          | 52    |
| 6.5010     | 6.4979          | 53    |
| 6.4840     | 6.5017          | 54    |
| 6.4787     | 6.4823          | 55    |
| 6.4734     | 6.4735          | 56    |
| 6.4619     | 6.4677          | 57    |
| 6.4496     | 6.4637          | 58    |
| 6.4344     | 6.4539          | 59    |
| 6.4290     | 6.4470          | 60    |
| 6.4159     | 6.4421          | 61    |
| 6.4069     | 6.4314          | 62    |
| 6.3964     | 6.4247          | 63    |
| 6.3887     | 6.4217          | 64    |
| 6.3783     | 6.4150          | 65    |
| 6.3670     | 6.4078          | 66    |
| 6.3593     | 6.3974          | 67    |
| 6.3500     | 6.3996          | 68    |
| 6.3359     | 6.3906          | 69    |
| 6.3358     | 6.3818          | 70    |
| 6.3298     | 6.3764          | 71    |
| 6.3158     | 6.3746          | 72    |
| 6.3026     | 6.3638          | 73    |
| 6.2904     | 6.3611          | 74    |
| 6.2861     | 6.3627          | 75    |
| 6.2820     | 6.3596          | 76    |
| 6.2658     | 6.3496          | 77    |
| 6.2554     | 6.3430          | 78    |
| 6.2552     | 6.3374          | 79    |
| 6.2468     | 6.3300          | 80    |
| 6.2316     | 6.3230          | 81    |
| 6.2314     | 6.3171          | 82    |
| 6.2198     | 6.3162          | 83    |
| 6.2084     | 6.3126          | 84    |
| 6.2020     | 6.3108          | 85    |
| 6.1906     | 6.3039          | 86    |
| 6.1851     | 6.2929          | 87    |
| 6.1749     | 6.2924          | 88    |
| 6.1678     | 6.3043          | 89    |
| 6.1576     | 6.2845          | 90    |
| 6.1566     | 6.2820          | 91    |
| 6.1454     | 6.2695          | 92    |
| 6.1351     | 6.2746          | 93    |
| 6.1313     | 6.2629          | 94    |
| 6.1211     | 6.2618          | 95    |
| 6.1107     | 6.2512          | 96    |
| 6.1092     | 6.2542          | 97    |
| 6.0974     | 6.2551          | 98    |
| 6.0880     | 6.2594          | 99    |


### Framework versions

- Transformers 4.33.2
- TensorFlow 2.13.0
- Datasets 2.14.5
- Tokenizers 0.13.3