gjonesQ02 committed
Commit 773803c · 1 Parent(s): efeb0d9

End of training
README.md CHANGED
@@ -1,8 +1,8 @@
  ---
  license: apache-2.0
+ base_model: distilgpt2
  tags:
  - generated_from_trainer
- base_model: distilgpt2
  model-index:
  - name: StatementOfWork_Generator_Omega_BS_1024_2
    results: []
@@ -15,7 +15,7 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [distilgpt2](https://huggingface.co/distilgpt2) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.7684
+ - Loss: 0.7939
 
  ## Model description
 
@@ -35,7 +35,7 @@ More information needed
 
  The following hyperparameters were used during training:
  - learning_rate: 2e-05
- - train_batch_size: 10
+ - train_batch_size: 20
  - eval_batch_size: 10
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
@@ -46,206 +46,206 @@ The following hyperparameters were used during training:
 
  | Training Loss | Epoch | Step | Validation Loss |
  |:-------------:|:-----:|:----:|:---------------:|
- | No log | 1.0 | 10 | 0.8329 |
- | No log | 2.0 | 20 | 0.8309 |
- | No log | 3.0 | 30 | 0.8323 |
- | No log | 4.0 | 40 | 0.8277 |
- | No log | 5.0 | 50 | 0.8296 |
- | No log | 6.0 | 60 | 0.8229 |
- | No log | 7.0 | 70 | 0.8243 |
- | No log | 8.0 | 80 | 0.8209 |
- | No log | 9.0 | 90 | 0.8132 |
- | No log | 10.0 | 100 | 0.8192 |
- | No log | 11.0 | 110 | 0.8151 |
- | No log | 12.0 | 120 | 0.8160 |
- | No log | 13.0 | 130 | 0.8177 |
- | No log | 14.0 | 140 | 0.8091 |
- | No log | 15.0 | 150 | 0.8085 |
- | No log | 16.0 | 160 | 0.8079 |
- | No log | 17.0 | 170 | 0.8044 |
- | No log | 18.0 | 180 | 0.8081 |
- | No log | 19.0 | 190 | 0.7995 |
- | No log | 20.0 | 200 | 0.8035 |
- | No log | 21.0 | 210 | 0.8020 |
- | No log | 22.0 | 220 | 0.7967 |
- | No log | 23.0 | 230 | 0.8016 |
- | No log | 24.0 | 240 | 0.7959 |
- | No log | 25.0 | 250 | 0.7949 |
- | No log | 26.0 | 260 | 0.7947 |
- | No log | 27.0 | 270 | 0.7943 |
- | No log | 28.0 | 280 | 0.7881 |
- | No log | 29.0 | 290 | 0.7946 |
- | No log | 30.0 | 300 | 0.7920 |
- | No log | 31.0 | 310 | 0.7912 |
- | No log | 32.0 | 320 | 0.7874 |
- | No log | 33.0 | 330 | 0.7889 |
- | No log | 34.0 | 340 | 0.7870 |
- | No log | 35.0 | 350 | 0.7873 |
- | No log | 36.0 | 360 | 0.7832 |
- | No log | 37.0 | 370 | 0.7883 |
- | No log | 38.0 | 380 | 0.7865 |
- | No log | 39.0 | 390 | 0.7816 |
- | No log | 40.0 | 400 | 0.7849 |
- | No log | 41.0 | 410 | 0.7815 |
- | No log | 42.0 | 420 | 0.7813 |
- | No log | 43.0 | 430 | 0.7831 |
- | No log | 44.0 | 440 | 0.7794 |
- | No log | 45.0 | 450 | 0.7797 |
- | No log | 46.0 | 460 | 0.7771 |
- | No log | 47.0 | 470 | 0.7799 |
- | No log | 48.0 | 480 | 0.7749 |
- | No log | 49.0 | 490 | 0.7780 |
- | 0.2277 | 50.0 | 500 | 0.7758 |
- | 0.2277 | 51.0 | 510 | 0.7763 |
- | 0.2277 | 52.0 | 520 | 0.7757 |
- | 0.2277 | 53.0 | 530 | 0.7773 |
- | 0.2277 | 54.0 | 540 | 0.7751 |
- | 0.2277 | 55.0 | 550 | 0.7753 |
- | 0.2277 | 56.0 | 560 | 0.7740 |
- | 0.2277 | 57.0 | 570 | 0.7746 |
- | 0.2277 | 58.0 | 580 | 0.7748 |
- | 0.2277 | 59.0 | 590 | 0.7730 |
- | 0.2277 | 60.0 | 600 | 0.7720 |
- | 0.2277 | 61.0 | 610 | 0.7723 |
- | 0.2277 | 62.0 | 620 | 0.7708 |
- | 0.2277 | 63.0 | 630 | 0.7725 |
- | 0.2277 | 64.0 | 640 | 0.7734 |
- | 0.2277 | 65.0 | 650 | 0.7702 |
- | 0.2277 | 66.0 | 660 | 0.7744 |
- | 0.2277 | 67.0 | 670 | 0.7714 |
- | 0.2277 | 68.0 | 680 | 0.7709 |
- | 0.2277 | 69.0 | 690 | 0.7684 |
- | 0.2277 | 70.0 | 700 | 0.7721 |
- | 0.2277 | 71.0 | 710 | 0.7710 |
- | 0.2277 | 72.0 | 720 | 0.7694 |
- | 0.2277 | 73.0 | 730 | 0.7703 |
- | 0.2277 | 74.0 | 740 | 0.7689 |
- | 0.2277 | 75.0 | 750 | 0.7707 |
- | 0.2277 | 76.0 | 760 | 0.7735 |
- | 0.2277 | 77.0 | 770 | 0.7710 |
- | 0.2277 | 78.0 | 780 | 0.7727 |
- | 0.2277 | 79.0 | 790 | 0.7704 |
- | 0.2277 | 80.0 | 800 | 0.7699 |
- | 0.2277 | 81.0 | 810 | 0.7699 |
- | 0.2277 | 82.0 | 820 | 0.7707 |
- | 0.2277 | 83.0 | 830 | 0.7698 |
- | 0.2277 | 84.0 | 840 | 0.7687 |
- | 0.2277 | 85.0 | 850 | 0.7705 |
- | 0.2277 | 86.0 | 860 | 0.7687 |
- | 0.2277 | 87.0 | 870 | 0.7687 |
- | 0.2277 | 88.0 | 880 | 0.7682 |
- | 0.2277 | 89.0 | 890 | 0.7688 |
- | 0.2277 | 90.0 | 900 | 0.7685 |
- | 0.2277 | 91.0 | 910 | 0.7691 |
- | 0.2277 | 92.0 | 920 | 0.7673 |
- | 0.2277 | 93.0 | 930 | 0.7673 |
- | 0.2277 | 94.0 | 940 | 0.7678 |
- | 0.2277 | 95.0 | 950 | 0.7700 |
- | 0.2277 | 96.0 | 960 | 0.7685 |
- | 0.2277 | 97.0 | 970 | 0.7697 |
- | 0.2277 | 98.0 | 980 | 0.7684 |
- | 0.2277 | 99.0 | 990 | 0.7680 |
- | 0.166 | 100.0 | 1000 | 0.7680 |
- | 0.166 | 101.0 | 1010 | 0.7697 |
- | 0.166 | 102.0 | 1020 | 0.7675 |
- | 0.166 | 103.0 | 1030 | 0.7698 |
- | 0.166 | 104.0 | 1040 | 0.7682 |
- | 0.166 | 105.0 | 1050 | 0.7663 |
- | 0.166 | 106.0 | 1060 | 0.7686 |
- | 0.166 | 107.0 | 1070 | 0.7645 |
- | 0.166 | 108.0 | 1080 | 0.7671 |
- | 0.166 | 109.0 | 1090 | 0.7675 |
- | 0.166 | 110.0 | 1100 | 0.7666 |
- | 0.166 | 111.0 | 1110 | 0.7664 |
- | 0.166 | 112.0 | 1120 | 0.7656 |
- | 0.166 | 113.0 | 1130 | 0.7655 |
- | 0.166 | 114.0 | 1140 | 0.7666 |
- | 0.166 | 115.0 | 1150 | 0.7648 |
- | 0.166 | 116.0 | 1160 | 0.7669 |
- | 0.166 | 117.0 | 1170 | 0.7649 |
- | 0.166 | 118.0 | 1180 | 0.7654 |
- | 0.166 | 119.0 | 1190 | 0.7671 |
- | 0.166 | 120.0 | 1200 | 0.7663 |
- | 0.166 | 121.0 | 1210 | 0.7668 |
- | 0.166 | 122.0 | 1220 | 0.7672 |
- | 0.166 | 123.0 | 1230 | 0.7678 |
- | 0.166 | 124.0 | 1240 | 0.7680 |
- | 0.166 | 125.0 | 1250 | 0.7681 |
- | 0.166 | 126.0 | 1260 | 0.7672 |
- | 0.166 | 127.0 | 1270 | 0.7656 |
- | 0.166 | 128.0 | 1280 | 0.7651 |
- | 0.166 | 129.0 | 1290 | 0.7667 |
- | 0.166 | 130.0 | 1300 | 0.7674 |
- | 0.166 | 131.0 | 1310 | 0.7657 |
- | 0.166 | 132.0 | 1320 | 0.7652 |
- | 0.166 | 133.0 | 1330 | 0.7658 |
- | 0.166 | 134.0 | 1340 | 0.7678 |
- | 0.166 | 135.0 | 1350 | 0.7701 |
- | 0.166 | 136.0 | 1360 | 0.7698 |
- | 0.166 | 137.0 | 1370 | 0.7678 |
- | 0.166 | 138.0 | 1380 | 0.7675 |
- | 0.166 | 139.0 | 1390 | 0.7679 |
- | 0.166 | 140.0 | 1400 | 0.7678 |
- | 0.166 | 141.0 | 1410 | 0.7675 |
- | 0.166 | 142.0 | 1420 | 0.7685 |
- | 0.166 | 143.0 | 1430 | 0.7669 |
- | 0.166 | 144.0 | 1440 | 0.7673 |
- | 0.166 | 145.0 | 1450 | 0.7689 |
- | 0.166 | 146.0 | 1460 | 0.7678 |
- | 0.166 | 147.0 | 1470 | 0.7685 |
- | 0.166 | 148.0 | 1480 | 0.7691 |
- | 0.166 | 149.0 | 1490 | 0.7697 |
- | 0.1346 | 150.0 | 1500 | 0.7689 |
- | 0.1346 | 151.0 | 1510 | 0.7701 |
- | 0.1346 | 152.0 | 1520 | 0.7684 |
- | 0.1346 | 153.0 | 1530 | 0.7676 |
- | 0.1346 | 154.0 | 1540 | 0.7681 |
- | 0.1346 | 155.0 | 1550 | 0.7679 |
- | 0.1346 | 156.0 | 1560 | 0.7669 |
- | 0.1346 | 157.0 | 1570 | 0.7668 |
- | 0.1346 | 158.0 | 1580 | 0.7676 |
- | 0.1346 | 159.0 | 1590 | 0.7689 |
- | 0.1346 | 160.0 | 1600 | 0.7695 |
- | 0.1346 | 161.0 | 1610 | 0.7688 |
- | 0.1346 | 162.0 | 1620 | 0.7678 |
- | 0.1346 | 163.0 | 1630 | 0.7685 |
- | 0.1346 | 164.0 | 1640 | 0.7680 |
- | 0.1346 | 165.0 | 1650 | 0.7689 |
- | 0.1346 | 166.0 | 1660 | 0.7699 |
- | 0.1346 | 167.0 | 1670 | 0.7693 |
- | 0.1346 | 168.0 | 1680 | 0.7684 |
- | 0.1346 | 169.0 | 1690 | 0.7670 |
- | 0.1346 | 170.0 | 1700 | 0.7667 |
- | 0.1346 | 171.0 | 1710 | 0.7684 |
- | 0.1346 | 172.0 | 1720 | 0.7699 |
- | 0.1346 | 173.0 | 1730 | 0.7692 |
- | 0.1346 | 174.0 | 1740 | 0.7673 |
- | 0.1346 | 175.0 | 1750 | 0.7677 |
- | 0.1346 | 176.0 | 1760 | 0.7675 |
- | 0.1346 | 177.0 | 1770 | 0.7680 |
- | 0.1346 | 178.0 | 1780 | 0.7684 |
- | 0.1346 | 179.0 | 1790 | 0.7685 |
- | 0.1346 | 180.0 | 1800 | 0.7683 |
- | 0.1346 | 181.0 | 1810 | 0.7679 |
- | 0.1346 | 182.0 | 1820 | 0.7680 |
- | 0.1346 | 183.0 | 1830 | 0.7678 |
- | 0.1346 | 184.0 | 1840 | 0.7681 |
- | 0.1346 | 185.0 | 1850 | 0.7685 |
- | 0.1346 | 186.0 | 1860 | 0.7682 |
- | 0.1346 | 187.0 | 1870 | 0.7679 |
- | 0.1346 | 188.0 | 1880 | 0.7679 |
- | 0.1346 | 189.0 | 1890 | 0.7682 |
- | 0.1346 | 190.0 | 1900 | 0.7681 |
- | 0.1346 | 191.0 | 1910 | 0.7681 |
- | 0.1346 | 192.0 | 1920 | 0.7680 |
- | 0.1346 | 193.0 | 1930 | 0.7679 |
- | 0.1346 | 194.0 | 1940 | 0.7679 |
- | 0.1346 | 195.0 | 1950 | 0.7681 |
- | 0.1346 | 196.0 | 1960 | 0.7684 |
- | 0.1346 | 197.0 | 1970 | 0.7685 |
- | 0.1346 | 198.0 | 1980 | 0.7685 |
- | 0.1346 | 199.0 | 1990 | 0.7684 |
- | 0.1215 | 200.0 | 2000 | 0.7684 |
+ | No log | 1.0 | 5 | 0.7747 |
+ | No log | 2.0 | 10 | 0.7699 |
+ | No log | 3.0 | 15 | 0.7670 |
+ | No log | 4.0 | 20 | 0.7678 |
+ | No log | 5.0 | 25 | 0.7720 |
+ | No log | 6.0 | 30 | 0.7740 |
+ | No log | 7.0 | 35 | 0.7681 |
+ | No log | 8.0 | 40 | 0.7736 |
+ | No log | 9.0 | 45 | 0.7701 |
+ | No log | 10.0 | 50 | 0.7704 |
+ | No log | 11.0 | 55 | 0.7728 |
+ | No log | 12.0 | 60 | 0.7714 |
+ | No log | 13.0 | 65 | 0.7722 |
+ | No log | 14.0 | 70 | 0.7730 |
+ | No log | 15.0 | 75 | 0.7731 |
+ | No log | 16.0 | 80 | 0.7742 |
+ | No log | 17.0 | 85 | 0.7726 |
+ | No log | 18.0 | 90 | 0.7732 |
+ | No log | 19.0 | 95 | 0.7729 |
+ | No log | 20.0 | 100 | 0.7720 |
+ | No log | 21.0 | 105 | 0.7727 |
+ | No log | 22.0 | 110 | 0.7731 |
+ | No log | 23.0 | 115 | 0.7723 |
+ | No log | 24.0 | 120 | 0.7756 |
+ | No log | 25.0 | 125 | 0.7746 |
+ | No log | 26.0 | 130 | 0.7721 |
+ | No log | 27.0 | 135 | 0.7759 |
+ | No log | 28.0 | 140 | 0.7727 |
+ | No log | 29.0 | 145 | 0.7754 |
+ | No log | 30.0 | 150 | 0.7769 |
+ | No log | 31.0 | 155 | 0.7747 |
+ | No log | 32.0 | 160 | 0.7728 |
+ | No log | 33.0 | 165 | 0.7749 |
+ | No log | 34.0 | 170 | 0.7760 |
+ | No log | 35.0 | 175 | 0.7736 |
+ | No log | 36.0 | 180 | 0.7774 |
+ | No log | 37.0 | 185 | 0.7758 |
+ | No log | 38.0 | 190 | 0.7757 |
+ | No log | 39.0 | 195 | 0.7759 |
+ | No log | 40.0 | 200 | 0.7789 |
+ | No log | 41.0 | 205 | 0.7796 |
+ | No log | 42.0 | 210 | 0.7779 |
+ | No log | 43.0 | 215 | 0.7785 |
+ | No log | 44.0 | 220 | 0.7779 |
+ | No log | 45.0 | 225 | 0.7770 |
+ | No log | 46.0 | 230 | 0.7787 |
+ | No log | 47.0 | 235 | 0.7800 |
+ | No log | 48.0 | 240 | 0.7789 |
+ | No log | 49.0 | 245 | 0.7784 |
+ | No log | 50.0 | 250 | 0.7805 |
+ | No log | 51.0 | 255 | 0.7802 |
+ | No log | 52.0 | 260 | 0.7816 |
+ | No log | 53.0 | 265 | 0.7803 |
+ | No log | 54.0 | 270 | 0.7789 |
+ | No log | 55.0 | 275 | 0.7804 |
+ | No log | 56.0 | 280 | 0.7824 |
+ | No log | 57.0 | 285 | 0.7814 |
+ | No log | 58.0 | 290 | 0.7798 |
+ | No log | 59.0 | 295 | 0.7829 |
+ | No log | 60.0 | 300 | 0.7820 |
+ | No log | 61.0 | 305 | 0.7815 |
+ | No log | 62.0 | 310 | 0.7818 |
+ | No log | 63.0 | 315 | 0.7826 |
+ | No log | 64.0 | 320 | 0.7820 |
+ | No log | 65.0 | 325 | 0.7816 |
+ | No log | 66.0 | 330 | 0.7847 |
+ | No log | 67.0 | 335 | 0.7821 |
+ | No log | 68.0 | 340 | 0.7827 |
+ | No log | 69.0 | 345 | 0.7816 |
+ | No log | 70.0 | 350 | 0.7833 |
+ | No log | 71.0 | 355 | 0.7853 |
+ | No log | 72.0 | 360 | 0.7837 |
+ | No log | 73.0 | 365 | 0.7854 |
+ | No log | 74.0 | 370 | 0.7842 |
+ | No log | 75.0 | 375 | 0.7836 |
+ | No log | 76.0 | 380 | 0.7846 |
+ | No log | 77.0 | 385 | 0.7837 |
+ | No log | 78.0 | 390 | 0.7829 |
+ | No log | 79.0 | 395 | 0.7849 |
+ | No log | 80.0 | 400 | 0.7845 |
+ | No log | 81.0 | 405 | 0.7854 |
+ | No log | 82.0 | 410 | 0.7854 |
+ | No log | 83.0 | 415 | 0.7842 |
+ | No log | 84.0 | 420 | 0.7854 |
+ | No log | 85.0 | 425 | 0.7847 |
+ | No log | 86.0 | 430 | 0.7850 |
+ | No log | 87.0 | 435 | 0.7852 |
+ | No log | 88.0 | 440 | 0.7847 |
+ | No log | 89.0 | 445 | 0.7870 |
+ | No log | 90.0 | 450 | 0.7881 |
+ | No log | 91.0 | 455 | 0.7850 |
+ | No log | 92.0 | 460 | 0.7852 |
+ | No log | 93.0 | 465 | 0.7856 |
+ | No log | 94.0 | 470 | 0.7840 |
+ | No log | 95.0 | 475 | 0.7854 |
+ | No log | 96.0 | 480 | 0.7864 |
+ | No log | 97.0 | 485 | 0.7870 |
+ | No log | 98.0 | 490 | 0.7864 |
+ | No log | 99.0 | 495 | 0.7869 |
+ | 0.0968 | 100.0 | 500 | 0.7872 |
+ | 0.0968 | 101.0 | 505 | 0.7870 |
+ | 0.0968 | 102.0 | 510 | 0.7864 |
+ | 0.0968 | 103.0 | 515 | 0.7863 |
+ | 0.0968 | 104.0 | 520 | 0.7862 |
+ | 0.0968 | 105.0 | 525 | 0.7862 |
+ | 0.0968 | 106.0 | 530 | 0.7872 |
+ | 0.0968 | 107.0 | 535 | 0.7883 |
+ | 0.0968 | 108.0 | 540 | 0.7884 |
+ | 0.0968 | 109.0 | 545 | 0.7868 |
+ | 0.0968 | 110.0 | 550 | 0.7869 |
+ | 0.0968 | 111.0 | 555 | 0.7864 |
+ | 0.0968 | 112.0 | 560 | 0.7864 |
+ | 0.0968 | 113.0 | 565 | 0.7861 |
+ | 0.0968 | 114.0 | 570 | 0.7859 |
+ | 0.0968 | 115.0 | 575 | 0.7876 |
+ | 0.0968 | 116.0 | 580 | 0.7895 |
+ | 0.0968 | 117.0 | 585 | 0.7900 |
+ | 0.0968 | 118.0 | 590 | 0.7903 |
+ | 0.0968 | 119.0 | 595 | 0.7897 |
+ | 0.0968 | 120.0 | 600 | 0.7900 |
+ | 0.0968 | 121.0 | 605 | 0.7904 |
+ | 0.0968 | 122.0 | 610 | 0.7909 |
+ | 0.0968 | 123.0 | 615 | 0.7904 |
+ | 0.0968 | 124.0 | 620 | 0.7904 |
+ | 0.0968 | 125.0 | 625 | 0.7911 |
+ | 0.0968 | 126.0 | 630 | 0.7911 |
+ | 0.0968 | 127.0 | 635 | 0.7893 |
+ | 0.0968 | 128.0 | 640 | 0.7898 |
+ | 0.0968 | 129.0 | 645 | 0.7915 |
+ | 0.0968 | 130.0 | 650 | 0.7921 |
+ | 0.0968 | 131.0 | 655 | 0.7923 |
+ | 0.0968 | 132.0 | 660 | 0.7916 |
+ | 0.0968 | 133.0 | 665 | 0.7910 |
+ | 0.0968 | 134.0 | 670 | 0.7909 |
+ | 0.0968 | 135.0 | 675 | 0.7920 |
+ | 0.0968 | 136.0 | 680 | 0.7928 |
+ | 0.0968 | 137.0 | 685 | 0.7921 |
+ | 0.0968 | 138.0 | 690 | 0.7910 |
+ | 0.0968 | 139.0 | 695 | 0.7908 |
+ | 0.0968 | 140.0 | 700 | 0.7929 |
+ | 0.0968 | 141.0 | 705 | 0.7940 |
+ | 0.0968 | 142.0 | 710 | 0.7930 |
+ | 0.0968 | 143.0 | 715 | 0.7924 |
+ | 0.0968 | 144.0 | 720 | 0.7919 |
+ | 0.0968 | 145.0 | 725 | 0.7923 |
+ | 0.0968 | 146.0 | 730 | 0.7922 |
+ | 0.0968 | 147.0 | 735 | 0.7921 |
+ | 0.0968 | 148.0 | 740 | 0.7929 |
+ | 0.0968 | 149.0 | 745 | 0.7936 |
+ | 0.0968 | 150.0 | 750 | 0.7938 |
+ | 0.0968 | 151.0 | 755 | 0.7938 |
+ | 0.0968 | 152.0 | 760 | 0.7938 |
+ | 0.0968 | 153.0 | 765 | 0.7940 |
+ | 0.0968 | 154.0 | 770 | 0.7934 |
+ | 0.0968 | 155.0 | 775 | 0.7927 |
+ | 0.0968 | 156.0 | 780 | 0.7928 |
+ | 0.0968 | 157.0 | 785 | 0.7931 |
+ | 0.0968 | 158.0 | 790 | 0.7929 |
+ | 0.0968 | 159.0 | 795 | 0.7925 |
+ | 0.0968 | 160.0 | 800 | 0.7920 |
+ | 0.0968 | 161.0 | 805 | 0.7919 |
+ | 0.0968 | 162.0 | 810 | 0.7917 |
+ | 0.0968 | 163.0 | 815 | 0.7925 |
+ | 0.0968 | 164.0 | 820 | 0.7933 |
+ | 0.0968 | 165.0 | 825 | 0.7933 |
+ | 0.0968 | 166.0 | 830 | 0.7929 |
+ | 0.0968 | 167.0 | 835 | 0.7932 |
+ | 0.0968 | 168.0 | 840 | 0.7938 |
+ | 0.0968 | 169.0 | 845 | 0.7939 |
+ | 0.0968 | 170.0 | 850 | 0.7938 |
+ | 0.0968 | 171.0 | 855 | 0.7936 |
+ | 0.0968 | 172.0 | 860 | 0.7937 |
+ | 0.0968 | 173.0 | 865 | 0.7936 |
+ | 0.0968 | 174.0 | 870 | 0.7934 |
+ | 0.0968 | 175.0 | 875 | 0.7933 |
+ | 0.0968 | 176.0 | 880 | 0.7938 |
+ | 0.0968 | 177.0 | 885 | 0.7943 |
+ | 0.0968 | 178.0 | 890 | 0.7942 |
+ | 0.0968 | 179.0 | 895 | 0.7940 |
+ | 0.0968 | 180.0 | 900 | 0.7942 |
+ | 0.0968 | 181.0 | 905 | 0.7946 |
+ | 0.0968 | 182.0 | 910 | 0.7947 |
+ | 0.0968 | 183.0 | 915 | 0.7944 |
+ | 0.0968 | 184.0 | 920 | 0.7940 |
+ | 0.0968 | 185.0 | 925 | 0.7938 |
+ | 0.0968 | 186.0 | 930 | 0.7935 |
+ | 0.0968 | 187.0 | 935 | 0.7934 |
+ | 0.0968 | 188.0 | 940 | 0.7935 |
+ | 0.0968 | 189.0 | 945 | 0.7936 |
+ | 0.0968 | 190.0 | 950 | 0.7937 |
+ | 0.0968 | 191.0 | 955 | 0.7938 |
+ | 0.0968 | 192.0 | 960 | 0.7939 |
+ | 0.0968 | 193.0 | 965 | 0.7940 |
+ | 0.0968 | 194.0 | 970 | 0.7939 |
+ | 0.0968 | 195.0 | 975 | 0.7939 |
+ | 0.0968 | 196.0 | 980 | 0.7939 |
+ | 0.0968 | 197.0 | 985 | 0.7939 |
+ | 0.0968 | 198.0 | 990 | 0.7939 |
+ | 0.0968 | 199.0 | 995 | 0.7939 |
+ | 0.0702 | 200.0 | 1000 | 0.7939 |
 
 
  ### Framework versions
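
The card itself carries no usage snippet; below is a minimal sketch of loading this checkpoint for text generation with the `transformers` pipeline. The repository id `gjonesQ02/StatementOfWork_Generator_Omega_BS_1024_2` is inferred from the committer and card name and may differ, and the prompt is purely illustrative.

```python
# Minimal sketch: load the fine-tuned checkpoint and generate text.
# The repo id below is an assumption inferred from the card name; adjust as needed.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="gjonesQ02/StatementOfWork_Generator_Omega_BS_1024_2",  # assumed repo id
)

prompt = "Scope of Work:"  # illustrative prompt
outputs = generator(prompt, max_new_tokens=128, do_sample=True, top_p=0.95)
print(outputs[0]["generated_text"])
```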
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:7beb7fab556a6df95c4063341bf413af28179e24bfe1bd1a10c933741d28f900
+ oid sha256:b9c3d4fe18dbafae2f86b8e51702eb75db167d67cfa8bef81506170021522c75
  size 327657928
runs/Jul02_16-10-18_viridian/events.out.tfevents.1719936620.viridian.3874171.6 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1fa9101581f04ca4a1f2dfc9be7c155bf071db62bdd9a9fe2799fe7d9ef545e7
+ size 59929
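
The added event file holds the TensorBoard scalars for this run and can be read offline; a rough sketch using TensorBoard's `EventAccumulator` follows. The scalar tag `eval/loss` is the usual `Trainer` naming but is an assumption here.

```python
# Sketch: read the logged eval losses from the added TensorBoard event file.
# The tag name "eval/loss" is assumed (typical for transformers.Trainer runs).
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

acc = EventAccumulator("runs/Jul02_16-10-18_viridian")  # directory containing the event file
acc.Reload()
for event in acc.Scalars("eval/loss"):
    print(event.step, event.value)
```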
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:c8ad5c3f5dbda24f21458ae661a230c50b9cd10ffc77f2521dac8e57a948e5ed
+ oid sha256:d4b73340fce85723dde334bd95136763f7a6e6912c391c65ec51d5303a8e8d8d
  size 5048
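
`training_args.bin` serializes the run's `TrainingArguments`; below is a sketch of arguments consistent with the hyperparameters listed in the card after this change (batch size 20, 200 epochs per the results table). The `output_dir` and evaluation setting are assumptions, not taken from the file.

```python
# Sketch of TrainingArguments matching the card's listed hyperparameters.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="StatementOfWork_Generator_Omega_BS_1024_2",  # assumed output directory
    learning_rate=2e-05,
    per_device_train_batch_size=20,
    per_device_eval_batch_size=10,
    seed=42,
    num_train_epochs=200,          # 200 epochs, per the results table
    evaluation_strategy="epoch",   # eval once per epoch, as the table suggests (assumed)
)
```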