End of training
Browse files
- README.md +203 -203
- model.safetensors +1 -1
- runs/Jul02_16-10-18_viridian/events.out.tfevents.1719936620.viridian.3874171.6 +3 -0
- training_args.bin +1 -1

README.md CHANGED
@@ -1,8 +1,8 @@
 ---
 license: apache-2.0
+base_model: distilgpt2
 tags:
 - generated_from_trainer
-base_model: distilgpt2
 model-index:
 - name: StatementOfWork_Generator_Omega_BS_1024_2
   results: []
@@ -15,7 +15,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [distilgpt2](https://huggingface.co/distilgpt2) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
+- Loss: 0.7939
 
 ## Model description
 
@@ -35,7 +35,7 @@
 
 The following hyperparameters were used during training:
 - learning_rate: 2e-05
-- train_batch_size:
+- train_batch_size: 20
 - eval_batch_size: 10
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
@@ -46,206 +46,206 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:-----:|:----:|:---------------:|
-| No log | 1.0 |
-| No log | 2.0 |
-| No log | 3.0 |
-| No log | 4.0 |
-| No log | 5.0 |
-| No log | 6.0 |
-| No log | 7.0 |
-| No log | 8.0 |
-| No log | 9.0 |
-| No log | 10.0 |
-| No log | 11.0 |
-| No log | 12.0 |
-| No log | 13.0 |
-| No log | 14.0 |
-| No log | 15.0 |
-| No log | 16.0 |
-| No log | 17.0 |
-| No log | 18.0 |
-| No log | 19.0 |
-| No log | 20.0 |
-| No log | 21.0 |
-| No log | 22.0 |
-| No log | 23.0 |
-| No log | 24.0 |
-| No log | 25.0 |
-| No log | 26.0 |
-| No log | 27.0 |
-| No log | 28.0 |
-| No log | 29.0 |
-| No log | 30.0 |
-| No log | 31.0 |
-| No log | 32.0 |
-| No log | 33.0 |
-| No log | 34.0 |
-| No log | 35.0 |
-| No log | 36.0 |
-| No log | 37.0 |
-| No log | 38.0 |
-| No log | 39.0 |
-| No log | 40.0 |
-| No log | 41.0 |
-| No log | 42.0 |
-| No log | 43.0 |
-| No log | 44.0 |
-| No log | 45.0 |
-| No log | 46.0 |
-| No log | 47.0 |
-| No log | 48.0 |
-| No log | 49.0 |
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
+| No log | 1.0 | 5 | 0.7747 |
+| No log | 2.0 | 10 | 0.7699 |
+| No log | 3.0 | 15 | 0.7670 |
+| No log | 4.0 | 20 | 0.7678 |
+| No log | 5.0 | 25 | 0.7720 |
+| No log | 6.0 | 30 | 0.7740 |
+| No log | 7.0 | 35 | 0.7681 |
+| No log | 8.0 | 40 | 0.7736 |
+| No log | 9.0 | 45 | 0.7701 |
+| No log | 10.0 | 50 | 0.7704 |
+| No log | 11.0 | 55 | 0.7728 |
+| No log | 12.0 | 60 | 0.7714 |
+| No log | 13.0 | 65 | 0.7722 |
+| No log | 14.0 | 70 | 0.7730 |
+| No log | 15.0 | 75 | 0.7731 |
+| No log | 16.0 | 80 | 0.7742 |
+| No log | 17.0 | 85 | 0.7726 |
+| No log | 18.0 | 90 | 0.7732 |
+| No log | 19.0 | 95 | 0.7729 |
+| No log | 20.0 | 100 | 0.7720 |
+| No log | 21.0 | 105 | 0.7727 |
+| No log | 22.0 | 110 | 0.7731 |
+| No log | 23.0 | 115 | 0.7723 |
+| No log | 24.0 | 120 | 0.7756 |
+| No log | 25.0 | 125 | 0.7746 |
+| No log | 26.0 | 130 | 0.7721 |
+| No log | 27.0 | 135 | 0.7759 |
+| No log | 28.0 | 140 | 0.7727 |
+| No log | 29.0 | 145 | 0.7754 |
+| No log | 30.0 | 150 | 0.7769 |
+| No log | 31.0 | 155 | 0.7747 |
+| No log | 32.0 | 160 | 0.7728 |
+| No log | 33.0 | 165 | 0.7749 |
+| No log | 34.0 | 170 | 0.7760 |
+| No log | 35.0 | 175 | 0.7736 |
+| No log | 36.0 | 180 | 0.7774 |
+| No log | 37.0 | 185 | 0.7758 |
+| No log | 38.0 | 190 | 0.7757 |
+| No log | 39.0 | 195 | 0.7759 |
+| No log | 40.0 | 200 | 0.7789 |
+| No log | 41.0 | 205 | 0.7796 |
+| No log | 42.0 | 210 | 0.7779 |
+| No log | 43.0 | 215 | 0.7785 |
+| No log | 44.0 | 220 | 0.7779 |
+| No log | 45.0 | 225 | 0.7770 |
+| No log | 46.0 | 230 | 0.7787 |
+| No log | 47.0 | 235 | 0.7800 |
+| No log | 48.0 | 240 | 0.7789 |
+| No log | 49.0 | 245 | 0.7784 |
+| No log | 50.0 | 250 | 0.7805 |
+| No log | 51.0 | 255 | 0.7802 |
+| No log | 52.0 | 260 | 0.7816 |
+| No log | 53.0 | 265 | 0.7803 |
+| No log | 54.0 | 270 | 0.7789 |
+| No log | 55.0 | 275 | 0.7804 |
+| No log | 56.0 | 280 | 0.7824 |
+| No log | 57.0 | 285 | 0.7814 |
+| No log | 58.0 | 290 | 0.7798 |
+| No log | 59.0 | 295 | 0.7829 |
+| No log | 60.0 | 300 | 0.7820 |
+| No log | 61.0 | 305 | 0.7815 |
+| No log | 62.0 | 310 | 0.7818 |
+| No log | 63.0 | 315 | 0.7826 |
+| No log | 64.0 | 320 | 0.7820 |
+| No log | 65.0 | 325 | 0.7816 |
+| No log | 66.0 | 330 | 0.7847 |
+| No log | 67.0 | 335 | 0.7821 |
+| No log | 68.0 | 340 | 0.7827 |
+| No log | 69.0 | 345 | 0.7816 |
+| No log | 70.0 | 350 | 0.7833 |
+| No log | 71.0 | 355 | 0.7853 |
+| No log | 72.0 | 360 | 0.7837 |
+| No log | 73.0 | 365 | 0.7854 |
+| No log | 74.0 | 370 | 0.7842 |
+| No log | 75.0 | 375 | 0.7836 |
+| No log | 76.0 | 380 | 0.7846 |
+| No log | 77.0 | 385 | 0.7837 |
+| No log | 78.0 | 390 | 0.7829 |
+| No log | 79.0 | 395 | 0.7849 |
+| No log | 80.0 | 400 | 0.7845 |
+| No log | 81.0 | 405 | 0.7854 |
+| No log | 82.0 | 410 | 0.7854 |
+| No log | 83.0 | 415 | 0.7842 |
+| No log | 84.0 | 420 | 0.7854 |
+| No log | 85.0 | 425 | 0.7847 |
+| No log | 86.0 | 430 | 0.7850 |
+| No log | 87.0 | 435 | 0.7852 |
+| No log | 88.0 | 440 | 0.7847 |
+| No log | 89.0 | 445 | 0.7870 |
+| No log | 90.0 | 450 | 0.7881 |
+| No log | 91.0 | 455 | 0.7850 |
+| No log | 92.0 | 460 | 0.7852 |
+| No log | 93.0 | 465 | 0.7856 |
+| No log | 94.0 | 470 | 0.7840 |
+| No log | 95.0 | 475 | 0.7854 |
+| No log | 96.0 | 480 | 0.7864 |
+| No log | 97.0 | 485 | 0.7870 |
+| No log | 98.0 | 490 | 0.7864 |
+| No log | 99.0 | 495 | 0.7869 |
+| 0.0968 | 100.0 | 500 | 0.7872 |
+| 0.0968 | 101.0 | 505 | 0.7870 |
+| 0.0968 | 102.0 | 510 | 0.7864 |
+| 0.0968 | 103.0 | 515 | 0.7863 |
+| 0.0968 | 104.0 | 520 | 0.7862 |
+| 0.0968 | 105.0 | 525 | 0.7862 |
+| 0.0968 | 106.0 | 530 | 0.7872 |
+| 0.0968 | 107.0 | 535 | 0.7883 |
+| 0.0968 | 108.0 | 540 | 0.7884 |
+| 0.0968 | 109.0 | 545 | 0.7868 |
+| 0.0968 | 110.0 | 550 | 0.7869 |
+| 0.0968 | 111.0 | 555 | 0.7864 |
+| 0.0968 | 112.0 | 560 | 0.7864 |
+| 0.0968 | 113.0 | 565 | 0.7861 |
+| 0.0968 | 114.0 | 570 | 0.7859 |
+| 0.0968 | 115.0 | 575 | 0.7876 |
+| 0.0968 | 116.0 | 580 | 0.7895 |
+| 0.0968 | 117.0 | 585 | 0.7900 |
+| 0.0968 | 118.0 | 590 | 0.7903 |
+| 0.0968 | 119.0 | 595 | 0.7897 |
+| 0.0968 | 120.0 | 600 | 0.7900 |
+| 0.0968 | 121.0 | 605 | 0.7904 |
+| 0.0968 | 122.0 | 610 | 0.7909 |
+| 0.0968 | 123.0 | 615 | 0.7904 |
+| 0.0968 | 124.0 | 620 | 0.7904 |
+| 0.0968 | 125.0 | 625 | 0.7911 |
+| 0.0968 | 126.0 | 630 | 0.7911 |
+| 0.0968 | 127.0 | 635 | 0.7893 |
+| 0.0968 | 128.0 | 640 | 0.7898 |
+| 0.0968 | 129.0 | 645 | 0.7915 |
+| 0.0968 | 130.0 | 650 | 0.7921 |
+| 0.0968 | 131.0 | 655 | 0.7923 |
+| 0.0968 | 132.0 | 660 | 0.7916 |
+| 0.0968 | 133.0 | 665 | 0.7910 |
+| 0.0968 | 134.0 | 670 | 0.7909 |
+| 0.0968 | 135.0 | 675 | 0.7920 |
+| 0.0968 | 136.0 | 680 | 0.7928 |
+| 0.0968 | 137.0 | 685 | 0.7921 |
+| 0.0968 | 138.0 | 690 | 0.7910 |
+| 0.0968 | 139.0 | 695 | 0.7908 |
+| 0.0968 | 140.0 | 700 | 0.7929 |
+| 0.0968 | 141.0 | 705 | 0.7940 |
+| 0.0968 | 142.0 | 710 | 0.7930 |
+| 0.0968 | 143.0 | 715 | 0.7924 |
+| 0.0968 | 144.0 | 720 | 0.7919 |
+| 0.0968 | 145.0 | 725 | 0.7923 |
+| 0.0968 | 146.0 | 730 | 0.7922 |
+| 0.0968 | 147.0 | 735 | 0.7921 |
+| 0.0968 | 148.0 | 740 | 0.7929 |
+| 0.0968 | 149.0 | 745 | 0.7936 |
+| 0.0968 | 150.0 | 750 | 0.7938 |
+| 0.0968 | 151.0 | 755 | 0.7938 |
+| 0.0968 | 152.0 | 760 | 0.7938 |
+| 0.0968 | 153.0 | 765 | 0.7940 |
+| 0.0968 | 154.0 | 770 | 0.7934 |
+| 0.0968 | 155.0 | 775 | 0.7927 |
+| 0.0968 | 156.0 | 780 | 0.7928 |
+| 0.0968 | 157.0 | 785 | 0.7931 |
+| 0.0968 | 158.0 | 790 | 0.7929 |
+| 0.0968 | 159.0 | 795 | 0.7925 |
+| 0.0968 | 160.0 | 800 | 0.7920 |
+| 0.0968 | 161.0 | 805 | 0.7919 |
+| 0.0968 | 162.0 | 810 | 0.7917 |
+| 0.0968 | 163.0 | 815 | 0.7925 |
+| 0.0968 | 164.0 | 820 | 0.7933 |
+| 0.0968 | 165.0 | 825 | 0.7933 |
+| 0.0968 | 166.0 | 830 | 0.7929 |
+| 0.0968 | 167.0 | 835 | 0.7932 |
+| 0.0968 | 168.0 | 840 | 0.7938 |
+| 0.0968 | 169.0 | 845 | 0.7939 |
+| 0.0968 | 170.0 | 850 | 0.7938 |
+| 0.0968 | 171.0 | 855 | 0.7936 |
+| 0.0968 | 172.0 | 860 | 0.7937 |
+| 0.0968 | 173.0 | 865 | 0.7936 |
+| 0.0968 | 174.0 | 870 | 0.7934 |
+| 0.0968 | 175.0 | 875 | 0.7933 |
+| 0.0968 | 176.0 | 880 | 0.7938 |
+| 0.0968 | 177.0 | 885 | 0.7943 |
+| 0.0968 | 178.0 | 890 | 0.7942 |
+| 0.0968 | 179.0 | 895 | 0.7940 |
+| 0.0968 | 180.0 | 900 | 0.7942 |
+| 0.0968 | 181.0 | 905 | 0.7946 |
+| 0.0968 | 182.0 | 910 | 0.7947 |
+| 0.0968 | 183.0 | 915 | 0.7944 |
+| 0.0968 | 184.0 | 920 | 0.7940 |
+| 0.0968 | 185.0 | 925 | 0.7938 |
+| 0.0968 | 186.0 | 930 | 0.7935 |
+| 0.0968 | 187.0 | 935 | 0.7934 |
+| 0.0968 | 188.0 | 940 | 0.7935 |
+| 0.0968 | 189.0 | 945 | 0.7936 |
+| 0.0968 | 190.0 | 950 | 0.7937 |
+| 0.0968 | 191.0 | 955 | 0.7938 |
+| 0.0968 | 192.0 | 960 | 0.7939 |
+| 0.0968 | 193.0 | 965 | 0.7940 |
+| 0.0968 | 194.0 | 970 | 0.7939 |
+| 0.0968 | 195.0 | 975 | 0.7939 |
+| 0.0968 | 196.0 | 980 | 0.7939 |
+| 0.0968 | 197.0 | 985 | 0.7939 |
+| 0.0968 | 198.0 | 990 | 0.7939 |
+| 0.0968 | 199.0 | 995 | 0.7939 |
+| 0.0702 | 200.0 | 1000 | 0.7939 |
 
 
 ### Framework versions
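The updated card lists the optimizer settings but not the training script itself. The following is a minimal sketch, not the author's actual code, of a `transformers` `Trainer` run matching the stated hyperparameters; the dataset is described as unknown, so the placeholder corpus, the 1024-token cutoff, and the 200-epoch count (read off the results table) are assumptions.

```python
# Minimal sketch of a Trainer setup matching the hyperparameters in the card.
# Assumptions: placeholder corpus (the card's dataset is unknown), 200 epochs
# inferred from the results table, 1024-token cutoff from distilgpt2's context.
from datasets import Dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 models define no pad token
model = AutoModelForCausalLM.from_pretrained("distilgpt2")

# Placeholder corpus standing in for the unknown training data.
texts = ["Statement of Work: scope, deliverables, and timeline ..."] * 100
ds = Dataset.from_dict({"text": texts}).map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=1024),
    batched=True,
    remove_columns=["text"],
)

args = TrainingArguments(
    output_dir="StatementOfWork_Generator_Omega_BS_1024_2",
    learning_rate=2e-5,               # from the card
    per_device_train_batch_size=20,   # train_batch_size: 20
    per_device_eval_batch_size=10,    # eval_batch_size: 10
    seed=42,
    num_train_epochs=200,             # inferred from the results table
    eval_strategy="epoch",            # one validation loss per epoch, as logged
                                      # (named evaluation_strategy in older releases)
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the TrainingArguments defaults.
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=ds,
    eval_dataset=ds,                  # placeholder; the card's eval split is unknown
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```
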
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:b9c3d4fe18dbafae2f86b8e51702eb75db167d67cfa8bef81506170021522c75
 size 327657928
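The updated `model.safetensors` holds the fine-tuned weights. A minimal loading-and-generation sketch follows; the repository namespace is not shown in this commit, so the local checkout path and the example prompt are assumptions, and the full `<user>/StatementOfWork_Generator_Omega_BS_1024_2` Hub id would be substituted to load remotely.

```python
# Sketch: load the weights committed above and generate a draft.
# Assumption: run from a local checkout of this model repository.
from transformers import AutoModelForCausalLM, AutoTokenizer

path = "."  # root of this model repository checkout (assumption)
tokenizer = AutoTokenizer.from_pretrained(path)
model = AutoModelForCausalLM.from_pretrained(path)  # reads model.safetensors

prompt = "Statement of Work:"  # example prompt only
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=120,
    do_sample=True,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,  # silence the missing-pad-token warning
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
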
runs/Jul02_16-10-18_viridian/events.out.tfevents.1719936620.viridian.3874171.6 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1fa9101581f04ca4a1f2dfc9be7c155bf071db62bdd9a9fe2799fe7d9ef545e7
+size 59929
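The added `events.out.tfevents` file is the TensorBoard log for this run. Below is a sketch of reading the validation-loss curve back out of it with TensorBoard's `EventAccumulator`; the `eval/loss` tag name follows the usual Hugging Face `Trainer` convention and is an assumption here.

```python
# Sketch: extract the logged validation losses from the run added above.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

acc = EventAccumulator("runs/Jul02_16-10-18_viridian")
acc.Reload()                        # parse the events.out.tfevents.* file
print(acc.Tags()["scalars"])        # list the scalar tags that were logged
for event in acc.Scalars("eval/loss"):   # tag name is an assumption
    print(event.step, event.value)  # should end at step 1000 with loss ~0.7939
```
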
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:d4b73340fce85723dde334bd95136763f7a6e6912c391c65ec51d5303a8e8d8d
 size 5048
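`training_args.bin` is the `TrainingArguments` object pickled by the `Trainer`. A sketch of inspecting it to cross-check the hyperparameters in the card; a pickle should only be loaded from a trusted source, and newer PyTorch releases need `weights_only=False` for non-tensor payloads.

```python
# Sketch: inspect the pickled TrainingArguments committed above.
import torch

args = torch.load("training_args.bin", weights_only=False)
print(args.learning_rate)                                                  # expected: 2e-05
print(args.per_device_train_batch_size, args.per_device_eval_batch_size)  # expected: 20 10
print(args.seed)                                                           # expected: 42
```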