cdactvm committed on
Commit cf23252
1 Parent(s): 3668fc2

End of training

README.md CHANGED
@@ -1,199 +1,248 @@
  ---
- library_name: transformers
- tags: []
  ---

- # Model Card for Model ID
-
- <!-- Provide a quick summary of what the model is/does. -->
-
-
-
- ## Model Details
-
- ### Model Description
-
- <!-- Provide a longer summary of what this model is. -->
-
- This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
-
- - **Developed by:** [More Information Needed]
- - **Funded by [optional]:** [More Information Needed]
- - **Shared by [optional]:** [More Information Needed]
- - **Model type:** [More Information Needed]
- - **Language(s) (NLP):** [More Information Needed]
- - **License:** [More Information Needed]
- - **Finetuned from model [optional]:** [More Information Needed]
-
- ### Model Sources [optional]
-
- <!-- Provide the basic links for the model. -->
-
- - **Repository:** [More Information Needed]
- - **Paper [optional]:** [More Information Needed]
- - **Demo [optional]:** [More Information Needed]
-
- ## Uses
-
- <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
-
- ### Direct Use
-
- <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
-
- [More Information Needed]
-
- ### Downstream Use [optional]
-
- <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
-
- [More Information Needed]
-
- ### Out-of-Scope Use
-
- <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
-
- [More Information Needed]
-
- ## Bias, Risks, and Limitations
-
- <!-- This section is meant to convey both technical and sociotechnical limitations. -->
-
- [More Information Needed]
-
- ### Recommendations
-
- <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
-
- Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
-
- ## How to Get Started with the Model
-
- Use the code below to get started with the model.
-
- [More Information Needed]
-
- ## Training Details
-
- ### Training Data
-
- <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
-
- [More Information Needed]
-
- ### Training Procedure
-
- <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
-
- #### Preprocessing [optional]
-
- [More Information Needed]
-
-
- #### Training Hyperparameters
-
- - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
-
- #### Speeds, Sizes, Times [optional]
-
- <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
-
- [More Information Needed]
-
- ## Evaluation
-
- <!-- This section describes the evaluation protocols and provides the results. -->
-
- ### Testing Data, Factors & Metrics
-
- #### Testing Data
-
- <!-- This should link to a Dataset Card if possible. -->
-
- [More Information Needed]
-
- #### Factors
-
- <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
-
- [More Information Needed]
-
- #### Metrics
-
- <!-- These are the evaluation metrics being used, ideally with a description of why. -->
-
- [More Information Needed]
-
- ### Results
-
- [More Information Needed]
-
- #### Summary
-
-
-
- ## Model Examination [optional]
-
- <!-- Relevant interpretability work for the model goes here -->
-
- [More Information Needed]
-
- ## Environmental Impact
-
- <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
-
- Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
-
- - **Hardware Type:** [More Information Needed]
- - **Hours used:** [More Information Needed]
- - **Cloud Provider:** [More Information Needed]
- - **Compute Region:** [More Information Needed]
- - **Carbon Emitted:** [More Information Needed]
-
- ## Technical Specifications [optional]
-
- ### Model Architecture and Objective
-
- [More Information Needed]
-
- ### Compute Infrastructure
-
- [More Information Needed]
-
- #### Hardware
-
- [More Information Needed]
-
- #### Software
-
- [More Information Needed]
-
- ## Citation [optional]
-
- <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
-
- **BibTeX:**
-
- [More Information Needed]
-
- **APA:**
-
- [More Information Needed]
-
- ## Glossary [optional]
-
- <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
-
- [More Information Needed]
-
- ## More Information [optional]
-
- [More Information Needed]
-
- ## Model Card Authors [optional]
-
- [More Information Needed]
-
- ## Model Card Contact
-
- [More Information Needed]
  ---
+ license: mit
+ base_model: facebook/w2v-bert-2.0
+ tags:
+ - generated_from_trainer
+ metrics:
+ - wer
+ model-index:
+ - name: w2v-bert-2.0-hindi_v1
+   results: []
  ---

+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # w2v-bert-2.0-hindi_v1
+
+ This model is a fine-tuned version of [facebook/w2v-bert-2.0](https://huggingface.co/facebook/w2v-bert-2.0) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.0787
+ - Wer: 0.0505
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 3.5356e-05
+ - train_batch_size: 1
+ - eval_batch_size: 1
+ - seed: 42
+ - gradient_accumulation_steps: 4
+ - total_train_batch_size: 4
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 500
+ - num_epochs: 2
+ - mixed_precision_training: Native AMP
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Wer |
+ |:-------------:|:------:|:-----:|:---------------:|:------:|
+ | 4.508 | 0.0108 | 300 | 3.5169 | 1.0 |
+ | 2.447 | 0.0216 | 600 | 1.1256 | 0.7027 |
+ | 1.2978 | 0.0324 | 900 | 0.7873 | 0.4987 |
+ | 1.034 | 0.0432 | 1200 | 0.6345 | 0.4258 |
+ | 0.9139 | 0.0540 | 1500 | 0.5973 | 0.3962 |
+ | 0.8422 | 0.0648 | 1800 | 0.5562 | 0.3586 |
+ | 0.7939 | 0.0755 | 2100 | 0.4826 | 0.3295 |
+ | 0.7194 | 0.0863 | 2400 | 0.4829 | 0.3266 |
+ | 0.737 | 0.0971 | 2700 | 0.4913 | 0.3557 |
+ | 0.6676 | 0.1079 | 3000 | 0.4541 | 0.3187 |
+ | 0.6265 | 0.1187 | 3300 | 0.4660 | 0.3088 |
+ | 0.6296 | 0.1295 | 3600 | 0.4080 | 0.2976 |
+ | 0.5943 | 0.1403 | 3900 | 0.4042 | 0.2799 |
+ | 0.6052 | 0.1511 | 4200 | 0.4212 | 0.2945 |
+ | 0.554 | 0.1619 | 4500 | 0.3867 | 0.2707 |
+ | 0.5613 | 0.1727 | 4800 | 0.3947 | 0.2881 |
+ | 0.5254 | 0.1835 | 5100 | 0.3586 | 0.2653 |
+ | 0.5288 | 0.1943 | 5400 | 0.3691 | 0.2801 |
+ | 0.5152 | 0.2051 | 5700 | 0.3619 | 0.2555 |
+ | 0.5361 | 0.2158 | 6000 | 0.3288 | 0.2401 |
+ | 0.5086 | 0.2266 | 6300 | 0.3216 | 0.2415 |
+ | 0.4799 | 0.2374 | 6600 | 0.3366 | 0.2467 |
+ | 0.4876 | 0.2482 | 6900 | 0.3282 | 0.2460 |
+ | 0.5001 | 0.2590 | 7200 | 0.3300 | 0.2499 |
+ | 0.4737 | 0.2698 | 7500 | 0.3494 | 0.2385 |
+ | 0.4768 | 0.2806 | 7800 | 0.3058 | 0.2368 |
+ | 0.435 | 0.2914 | 8100 | 0.3623 | 0.2561 |
+ | 0.4366 | 0.3022 | 8400 | 0.3111 | 0.2359 |
+ | 0.4155 | 0.3130 | 8700 | 0.2987 | 0.2348 |
+ | 0.4104 | 0.3238 | 9000 | 0.2932 | 0.2312 |
+ | 0.406 | 0.3346 | 9300 | 0.3100 | 0.2173 |
+ | 0.397 | 0.3454 | 9600 | 0.2972 | 0.2204 |
+ | 0.4224 | 0.3561 | 9900 | 0.3044 | 0.2212 |
+ | 0.3851 | 0.3669 | 10200 | 0.2941 | 0.2165 |
+ | 0.3684 | 0.3777 | 10500 | 0.2742 | 0.2084 |
+ | 0.3884 | 0.3885 | 10800 | 0.2633 | 0.2122 |
+ | 0.3681 | 0.3993 | 11100 | 0.2799 | 0.2089 |
+ | 0.3468 | 0.4101 | 11400 | 0.2873 | 0.2080 |
+ | 0.3753 | 0.4209 | 11700 | 0.2533 | 0.1978 |
+ | 0.3837 | 0.4317 | 12000 | 0.2628 | 0.2054 |
+ | 0.3442 | 0.4425 | 12300 | 0.2609 | 0.1994 |
+ | 0.3338 | 0.4533 | 12600 | 0.2512 | 0.2001 |
+ | 0.3593 | 0.4641 | 12900 | 0.2472 | 0.1954 |
+ | 0.3311 | 0.4749 | 13200 | 0.2705 | 0.1929 |
+ | 0.329 | 0.4857 | 13500 | 0.2545 | 0.1997 |
+ | 0.3122 | 0.4964 | 13800 | 0.2489 | 0.1931 |
+ | 0.3368 | 0.5072 | 14100 | 0.2568 | 0.1924 |
+ | 0.3364 | 0.5180 | 14400 | 0.2447 | 0.1949 |
+ | 0.367 | 0.5288 | 14700 | 0.2325 | 0.1849 |
+ | 0.3253 | 0.5396 | 15000 | 0.2448 | 0.1839 |
+ | 0.3166 | 0.5504 | 15300 | 0.2421 | 0.1902 |
+ | 0.3232 | 0.5612 | 15600 | 0.2319 | 0.1833 |
+ | 0.2959 | 0.5720 | 15900 | 0.2333 | 0.1757 |
+ | 0.315 | 0.5828 | 16200 | 0.2372 | 0.1809 |
+ | 0.2854 | 0.5936 | 16500 | 0.2400 | 0.1810 |
+ | 0.3361 | 0.6044 | 16800 | 0.2573 | 0.1780 |
+ | 0.3027 | 0.6152 | 17100 | 0.2308 | 0.1744 |
+ | 0.3015 | 0.6259 | 17400 | 0.2405 | 0.1736 |
+ | 0.3035 | 0.6367 | 17700 | 0.2322 | 0.1822 |
+ | 0.2882 | 0.6475 | 18000 | 0.2297 | 0.1762 |
+ | 0.267 | 0.6583 | 18300 | 0.2155 | 0.1652 |
+ | 0.2819 | 0.6691 | 18600 | 0.2156 | 0.1612 |
+ | 0.2898 | 0.6799 | 18900 | 0.2116 | 0.1585 |
+ | 0.2857 | 0.6907 | 19200 | 0.1987 | 0.1531 |
+ | 0.2826 | 0.7015 | 19500 | 0.1909 | 0.1556 |
+ | 0.2774 | 0.7123 | 19800 | 0.1858 | 0.1499 |
+ | 0.293 | 0.7231 | 20100 | 0.1940 | 0.1503 |
+ | 0.2771 | 0.7339 | 20400 | 0.1994 | 0.1521 |
+ | 0.2664 | 0.7447 | 20700 | 0.1948 | 0.1519 |
+ | 0.261 | 0.7555 | 21000 | 0.1875 | 0.1442 |
+ | 0.2467 | 0.7662 | 21300 | 0.1887 | 0.1439 |
+ | 0.2435 | 0.7770 | 21600 | 0.2039 | 0.1452 |
+ | 0.2459 | 0.7878 | 21900 | 0.1825 | 0.1398 |
+ | 0.2367 | 0.7986 | 22200 | 0.2007 | 0.1439 |
+ | 0.2383 | 0.8094 | 22500 | 0.1901 | 0.1419 |
+ | 0.2524 | 0.8202 | 22800 | 0.1727 | 0.1409 |
+ | 0.248 | 0.8310 | 23100 | 0.1926 | 0.1405 |
+ | 0.265 | 0.8418 | 23400 | 0.1795 | 0.1353 |
+ | 0.2469 | 0.8526 | 23700 | 0.1712 | 0.1301 |
+ | 0.2212 | 0.8634 | 24000 | 0.1841 | 0.1389 |
+ | 0.2591 | 0.8742 | 24300 | 0.1783 | 0.1281 |
+ | 0.2311 | 0.8850 | 24600 | 0.1843 | 0.1342 |
+ | 0.2297 | 0.8958 | 24900 | 0.1652 | 0.1326 |
+ | 0.2203 | 0.9065 | 25200 | 0.1608 | 0.1263 |
+ | 0.222 | 0.9173 | 25500 | 0.1788 | 0.1267 |
+ | 0.2232 | 0.9281 | 25800 | 0.1614 | 0.1226 |
+ | 0.2165 | 0.9389 | 26100 | 0.1746 | 0.1231 |
+ | 0.2111 | 0.9497 | 26400 | 0.1793 | 0.1274 |
+ | 0.2344 | 0.9605 | 26700 | 0.1645 | 0.1209 |
+ | 0.2075 | 0.9713 | 27000 | 0.1609 | 0.1243 |
+ | 0.212 | 0.9821 | 27300 | 0.1750 | 0.1294 |
+ | 0.1863 | 0.9929 | 27600 | 0.1595 | 0.1179 |
+ | 0.1876 | 1.0037 | 27900 | 0.1535 | 0.1150 |
+ | 0.1708 | 1.0145 | 28200 | 0.1599 | 0.1159 |
+ | 0.1624 | 1.0253 | 28500 | 0.1587 | 0.1172 |
+ | 0.1837 | 1.0361 | 28800 | 0.1561 | 0.1160 |
+ | 0.1894 | 1.0468 | 29100 | 0.1593 | 0.1079 |
+ | 0.1656 | 1.0576 | 29400 | 0.1549 | 0.1115 |
+ | 0.1809 | 1.0684 | 29700 | 0.1333 | 0.1093 |
+ | 0.1814 | 1.0792 | 30000 | 0.1458 | 0.1058 |
+ | 0.159 | 1.0900 | 30300 | 0.1460 | 0.1091 |
+ | 0.1707 | 1.1008 | 30600 | 0.1430 | 0.1077 |
+ | 0.1728 | 1.1116 | 30900 | 0.1564 | 0.1026 |
+ | 0.1583 | 1.1224 | 31200 | 0.1408 | 0.1021 |
+ | 0.1751 | 1.1332 | 31500 | 0.1464 | 0.1048 |
+ | 0.1686 | 1.1440 | 31800 | 0.1371 | 0.0999 |
+ | 0.1495 | 1.1548 | 32100 | 0.1448 | 0.0996 |
+ | 0.1647 | 1.1656 | 32400 | 0.1452 | 0.1004 |
+ | 0.151 | 1.1764 | 32700 | 0.1376 | 0.0993 |
+ | 0.1507 | 1.1871 | 33000 | 0.1308 | 0.0947 |
+ | 0.154 | 1.1979 | 33300 | 0.1315 | 0.0975 |
+ | 0.1452 | 1.2087 | 33600 | 0.1281 | 0.0951 |
+ | 0.1381 | 1.2195 | 33900 | 0.1329 | 0.0936 |
+ | 0.146 | 1.2303 | 34200 | 0.1304 | 0.0905 |
+ | 0.1697 | 1.2411 | 34500 | 0.1265 | 0.0930 |
+ | 0.1479 | 1.2519 | 34800 | 0.1245 | 0.0896 |
+ | 0.1583 | 1.2627 | 35100 | 0.1292 | 0.0888 |
+ | 0.1246 | 1.2735 | 35400 | 0.1330 | 0.0939 |
+ | 0.1537 | 1.2843 | 35700 | 0.1279 | 0.0865 |
+ | 0.142 | 1.2951 | 36000 | 0.1221 | 0.0877 |
+ | 0.1312 | 1.3059 | 36300 | 0.1222 | 0.0876 |
+ | 0.1364 | 1.3167 | 36600 | 0.1235 | 0.0881 |
+ | 0.1527 | 1.3274 | 36900 | 0.1241 | 0.0834 |
+ | 0.1362 | 1.3382 | 37200 | 0.1177 | 0.0810 |
+ | 0.1546 | 1.3490 | 37500 | 0.1212 | 0.0801 |
+ | 0.1341 | 1.3598 | 37800 | 0.1231 | 0.0819 |
+ | 0.1371 | 1.3706 | 38100 | 0.1196 | 0.0865 |
+ | 0.1425 | 1.3814 | 38400 | 0.1126 | 0.0805 |
+ | 0.16 | 1.3922 | 38700 | 0.1185 | 0.0783 |
+ | 0.1316 | 1.4030 | 39000 | 0.1204 | 0.0794 |
+ | 0.1361 | 1.4138 | 39300 | 0.1091 | 0.0777 |
+ | 0.1623 | 1.4246 | 39600 | 0.1090 | 0.0776 |
+ | 0.1246 | 1.4354 | 39900 | 0.1115 | 0.0779 |
+ | 0.1289 | 1.4462 | 40200 | 0.1081 | 0.0748 |
+ | 0.1124 | 1.4570 | 40500 | 0.1083 | 0.0745 |
+ | 0.1224 | 1.4677 | 40800 | 0.1072 | 0.0755 |
+ | 0.1218 | 1.4785 | 41100 | 0.1132 | 0.0739 |
+ | 0.121 | 1.4893 | 41400 | 0.1085 | 0.0733 |
+ | 0.1058 | 1.5001 | 41700 | 0.1098 | 0.0720 |
+ | 0.1304 | 1.5109 | 42000 | 0.1044 | 0.0694 |
+ | 0.1309 | 1.5217 | 42300 | 0.1045 | 0.0694 |
+ | 0.1418 | 1.5325 | 42600 | 0.0997 | 0.0675 |
+ | 0.1213 | 1.5433 | 42900 | 0.1039 | 0.0698 |
+ | 0.1253 | 1.5541 | 43200 | 0.1024 | 0.0695 |
+ | 0.1119 | 1.5649 | 43500 | 0.1043 | 0.0706 |
+ | 0.1132 | 1.5757 | 43800 | 0.1043 | 0.0665 |
+ | 0.1161 | 1.5865 | 44100 | 0.1041 | 0.0644 |
+ | 0.095 | 1.5973 | 44400 | 0.1014 | 0.0656 |
+ | 0.0958 | 1.6080 | 44700 | 0.0972 | 0.0640 |
+ | 0.1035 | 1.6188 | 45000 | 0.1003 | 0.0652 |
+ | 0.1054 | 1.6296 | 45300 | 0.1043 | 0.0666 |
+ | 0.1172 | 1.6404 | 45600 | 0.1002 | 0.0643 |
+ | 0.1078 | 1.6512 | 45900 | 0.0996 | 0.0641 |
+ | 0.102 | 1.6620 | 46200 | 0.0973 | 0.0619 |
+ | 0.108 | 1.6728 | 46500 | 0.0966 | 0.0609 |
+ | 0.1058 | 1.6836 | 46800 | 0.0938 | 0.0613 |
+ | 0.1134 | 1.6944 | 47100 | 0.0905 | 0.0606 |
+ | 0.1102 | 1.7052 | 47400 | 0.0915 | 0.0598 |
+ | 0.1342 | 1.7160 | 47700 | 0.0903 | 0.0587 |
+ | 0.1039 | 1.7268 | 48000 | 0.0905 | 0.0590 |
+ | 0.0993 | 1.7376 | 48300 | 0.0924 | 0.0596 |
+ | 0.0965 | 1.7483 | 48600 | 0.0898 | 0.0580 |
+ | 0.0911 | 1.7591 | 48900 | 0.0899 | 0.0577 |
+ | 0.1023 | 1.7699 | 49200 | 0.0897 | 0.0577 |
+ | 0.094 | 1.7807 | 49500 | 0.0875 | 0.0558 |
+ | 0.0962 | 1.7915 | 49800 | 0.0880 | 0.0558 |
+ | 0.0922 | 1.8023 | 50100 | 0.0858 | 0.0555 |
+ | 0.0945 | 1.8131 | 50400 | 0.0866 | 0.0548 |
+ | 0.0897 | 1.8239 | 50700 | 0.0840 | 0.0542 |
+ | 0.0921 | 1.8347 | 51000 | 0.0876 | 0.0549 |
+ | 0.0917 | 1.8455 | 51300 | 0.0853 | 0.0540 |
+ | 0.1093 | 1.8563 | 51600 | 0.0844 | 0.0540 |
+ | 0.0986 | 1.8671 | 51900 | 0.0831 | 0.0536 |
+ | 0.0904 | 1.8778 | 52200 | 0.0831 | 0.0530 |
+ | 0.096 | 1.8886 | 52500 | 0.0825 | 0.0531 |
+ | 0.0815 | 1.8994 | 52800 | 0.0837 | 0.0533 |
+ | 0.0892 | 1.9102 | 53100 | 0.0840 | 0.0533 |
+ | 0.0789 | 1.9210 | 53400 | 0.0826 | 0.0524 |
+ | 0.0914 | 1.9318 | 53700 | 0.0813 | 0.0520 |
+ | 0.1029 | 1.9426 | 54000 | 0.0803 | 0.0513 |
+ | 0.0856 | 1.9534 | 54300 | 0.0798 | 0.0511 |
+ | 0.0869 | 1.9642 | 54600 | 0.0794 | 0.0507 |
+ | 0.101 | 1.9750 | 54900 | 0.0785 | 0.0508 |
+ | 0.0917 | 1.9858 | 55200 | 0.0787 | 0.0507 |
+ | 0.0875 | 1.9966 | 55500 | 0.0787 | 0.0505 |
+
+
+ ### Framework versions
+
+ - Transformers 4.41.1
+ - Pytorch 2.1.2+cu121
+ - Datasets 2.19.1
+ - Tokenizers 0.19.1
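
For readers who want to try the checkpoint documented in the README above, the sketch below shows one way to run Hindi transcription with the 🤗 Transformers ASR pipeline. It is a minimal illustration and not part of this commit: the repository id `cdactvm/w2v-bert-2.0-hindi_v1` and the audio file name are assumptions, and decoding an audio path additionally requires `ffmpeg` or `soundfile`/`librosa` to be installed.

```python
# Minimal inference sketch; the repo id and audio path below are assumptions.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="cdactvm/w2v-bert-2.0-hindi_v1",  # assumed Hub repo id for this checkpoint
)

# Any common audio format works; the pipeline resamples to the model's 16 kHz input.
result = asr("sample_hindi.wav")  # placeholder file name
print(result["text"])
```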
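The hyperparameters reported under "Training hyperparameters" map closely onto 🤗 `TrainingArguments`. The block below is a hedged reconstruction for orientation only, not the script used for this commit; `output_dir` and the `fp16=True` flag (standing in for "Native AMP") are assumptions.

```python
# Hedged reconstruction of the reported training configuration; illustrative only.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-hindi_v1",  # assumed output directory
    learning_rate=3.5356e-05,
    per_device_train_batch_size=1,       # card: train_batch_size: 1
    per_device_eval_batch_size=1,        # card: eval_batch_size: 1
    gradient_accumulation_steps=4,       # total train batch size: 1 * 4 = 4
    num_train_epochs=2,
    lr_scheduler_type="linear",
    warmup_steps=500,
    seed=42,
    fp16=True,                           # "Native AMP" mixed precision (assumed flag)
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's default optimizer settings.
```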
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:d2648323db90f152af258e4bf0c89addcf755b50a9e8eaddb464fe278ffa1e58
+ oid sha256:c3f54835c223d627eaac8af6f8a353c6b3b7ef4cb98b839c2d633e256b660bb6
  size 2423126260
runs/Jun25_20-52-30_GPU/events.out.tfevents.1719329599.GPU.1184996.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:f5bee4253dc7b32714a0226bf525dd1b5c268a18b9901c4f1a097ebd5b8d51ff
- size 104715
+ oid sha256:ae2fea2997bda143636e0d8907238db792e3e258dbf9c4298308470816ba5326
+ size 105614