Commit 22f8323 (1 parent: f6cca39)
qf2iwkac: saving weights and logs of step 0k
Files changed:
- .gitattributes +1 -0
- events.out.tfevents.1653825915.t1v-n-4eb331dd-w-0.1720934.0.v2 +3 -0
- flax_model.msgpack +1 -1
- wandb/debug-internal.log +1 -0
- wandb/debug.log +1 -0
- wandb/latest-run +1 -0
- wandb/run-20220529_120458-qf2iwkac/files/config.yaml +27 -0
- wandb/run-20220529_120458-qf2iwkac/files/media/table/eval/step_0k_5_92ae240f2472ae79ad31.table.json +1 -0
- wandb/run-20220529_120458-qf2iwkac/files/media/table/eval/step_0k_incorrect_5_b30e6671cbaaf5c82b6d.table.json +1 -0
- wandb/run-20220529_120458-qf2iwkac/files/output.log +690 -0
- wandb/run-20220529_120458-qf2iwkac/files/requirements.txt +187 -0
- wandb/run-20220529_120458-qf2iwkac/files/wandb-metadata.json +58 -0
- wandb/run-20220529_120458-qf2iwkac/files/wandb-summary.json +1 -0
- wandb/run-20220529_120458-qf2iwkac/logs/debug-internal.log +108 -0
- wandb/run-20220529_120458-qf2iwkac/logs/debug.log +25 -0
- wandb/run-20220529_120458-qf2iwkac/run-qf2iwkac.wandb +3 -0
.gitattributes CHANGED
@@ -25,3 +25,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zstandard filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+*.wandb filter=lfs diff=lfs merge=lfs -text
events.out.tfevents.1653825915.t1v-n-4eb331dd-w-0.1720934.0.v2 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:611808cc427646dd912828e114124e8bb080a2ba008da8736f91e2dc33af0e4b
+size 40
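The three added lines are a standard Git LFS pointer file (spec v1): the LFS filter checks in this small stub instead of the large binary. As a minimal sketch, such a pointer can be split into its fields; `parse_lfs_pointer` is a hypothetical helper written for illustration, not part of the git-lfs tooling:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        # Each pointer line is "<key> <value>", e.g. "size 40".
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:611808cc427646dd912828e114124e8bb080a2ba008da8736f91e2dc33af0e4b
size 40
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 40
```

The `size` field is the byte length of the real object, so a 40-byte tfevents file (an almost-empty event log from step 0) is stored behind this pointer.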
flax_model.msgpack CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:48ae4ecf95a1c43c77baa282af96d82c8f123e860ab7354c6a1e9114fdd3ae58
 size 218688
wandb/debug-internal.log ADDED
@@ -0,0 +1 @@
+run-20220529_120458-qf2iwkac/logs/debug-internal.log

wandb/debug.log ADDED
@@ -0,0 +1 @@
+run-20220529_120458-qf2iwkac/logs/debug.log

wandb/latest-run ADDED
@@ -0,0 +1 @@
+run-20220529_120458-qf2iwkac
wandb/run-20220529_120458-qf2iwkac/files/config.yaml ADDED
@@ -0,0 +1,27 @@
+wandb_version: 1
+
+_wandb:
+  desc: null
+  value:
+    cli_version: 0.12.11
+    framework: huggingface
+    huggingface_version: 4.18.0.dev0
+    is_jupyter_run: false
+    is_kaggle_kernel: false
+    python_version: 3.8.10
+    start_time: 1653825898
+    t:
+      1:
+      - 1
+      - 2
+      - 3
+      - 11
+      - 12
+      - 45
+      - 49
+      - 51
+      4: 3.8.10
+      5: 0.12.11
+      6: 4.18.0.dev0
+      8:
+      - 5
wandb/run-20220529_120458-qf2iwkac/files/media/table/eval/step_0k_5_92ae240f2472ae79ad31.table.json ADDED
@@ -0,0 +1 @@
{"columns": ["id", "label_str", "beam_1"], "data": [["1272-128104-0000", "mister quilter is the apostle of the middle classes and we are glad to welcome his gospel", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0001", "nor is mister quilter's manner less interesting than his matter", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0002", "he tells us that at this festive season of the year with christmas and roast beef looming before us similes drawn from eating and its results occur most readily to the mind", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0003", "he has grave doubts whether sir frederick leighton's work is really greek after all and can discover in it but little of rocky ithaca", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0005", "it is obviously unnecessary for us to point out how luminous these criticisms are how delicate in expression", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0006", "on the general principles of art mister quilter writes with equal lucidity", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0007", "painting he tells us is of a different quality to mathematics and finish in art is adding more fact", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0008", "as for etchings they are of two kinds british 
and foreign", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0010", "near the fire and the ornaments fred brought home from india on the mantel board", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0012", "only unfortunately his own work never does get good", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0013", "mister quilter has missed his chance for he has failed even to make himself the tupper of painting", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0014", "by harry quilter m a", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0000", "because you were sleeping instead of conquering the lovely rose princess has become a fiddle without a bow while poor shaggy sits there a cooing dove", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0001", "he has gone and gone for good answered polychrome who had managed to squeeze into the room beside the dragon and had witnessed the occurrences with much interest", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0002", "i have remained a prisoner only because i wished to be one and with this he stepped forward and burst the stout chains as easily as if they had been threads", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen 
pl pl"], ["1272-135031-0003", "the little girl had been asleep but she heard the raps and opened the door", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0004", "the king has fled in disgrace and your friends are asking for you", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0005", "i begged ruggedo long ago to send him away but he would not do so", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0006", "i also offered to help your brother to escape but he would not go", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0007", "he eats and sleeps very steadily replied the new king", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0008", "i hope he doesn't work too hard said shaggy", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0009", "he doesn't work at all", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0010", "in fact there is nothing he can do in these dominions as well as our nomes whose numbers are so great that it worries us to keep them all busy", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0011", "not exactly returned kaliko", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist 
histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0012", "where is my brother now", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0013", "inquired shaggy in the metal forest", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0014", "where is that", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0015", "the metal forest is in the great domed cavern the largest in all our dominions replied kaliko", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0016", "kaliko hesitated", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0017", "however if we look sharp we may be able to discover one of these secret ways", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0018", "oh no i'm quite sure he didn't", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0019", "that's funny remarked betsy thoughtfully", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0020", "i don't believe ann knew any magic or she'd have worked it before", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0021", "i do not know confessed shaggy", " Mar MarSSSersersersersersersersersersi 
hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0022", "true agreed kaliko", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0023", "kaliko went to the big gong and pounded on it just as ruggedo used to do but no one answered the summons", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0000", "a man said to the universe sir i exist", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0001", "sweat covered brion's body trickling into the tight loincloth that was the only garment he wore", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0003", "his instant of panic was followed by a small sharp blow high on his chest", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0004", "one minute a voice said and the time buzzer sounded", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0005", "a minute is not a very large measure of time and his body needed every fraction of it", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0006", "the buzzer's whirr triggered his muscles into complete relaxation", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0007", "only his heart and lungs worked on at a strong 
measured rate", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0008", "he was in reverie sliding along the borders of consciousness", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0009", "the contestants in the twenties needed undisturbed rest therefore nights in the dormitories were as quiet as death", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0010", "particularly so on this last night when only two of the little cubicles were occupied the thousands of others standing with dark empty doors", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0011", "the other voice snapped with a harsh urgency clearly used to command", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0012", "i'm here because the matter is of utmost importance and brandd is the one i must see now stand aside", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0013", "the twenties", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0014", "he must have drawn his gun because the intruder said quickly put that away you're being a fool out", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"]]}
wandb/run-20220529_120458-qf2iwkac/files/media/table/eval/step_0k_incorrect_5_b30e6671cbaaf5c82b6d.table.json ADDED
@@ -0,0 +1 @@
{"columns": ["id", "label_str", "beam_1"], "data": [["1272-128104-0000", "mister quilter is the apostle of the middle classes and we are glad to welcome his gospel", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0001", "nor is mister quilter's manner less interesting than his matter", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0002", "he tells us that at this festive season of the year with christmas and roast beef looming before us similes drawn from eating and its results occur most readily to the mind", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0003", "he has grave doubts whether sir frederick leighton's work is really greek after all and can discover in it but little of rocky ithaca", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0005", "it is obviously unnecessary for us to point out how luminous these criticisms are how delicate in expression", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0006", "on the general principles of art mister quilter writes with equal lucidity", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0007", "painting he tells us is of a different quality to mathematics and finish in art is adding more fact", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0008", "as for etchings they are of two kinds british 
and foreign", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0010", "near the fire and the ornaments fred brought home from india on the mantel board", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0012", "only unfortunately his own work never does get good", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0013", "mister quilter has missed his chance for he has failed even to make himself the tupper of painting", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-128104-0014", "by harry quilter m a", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0000", "because you were sleeping instead of conquering the lovely rose princess has become a fiddle without a bow while poor shaggy sits there a cooing dove", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0001", "he has gone and gone for good answered polychrome who had managed to squeeze into the room beside the dragon and had witnessed the occurrences with much interest", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0002", "i have remained a prisoner only because i wished to be one and with this he stepped forward and burst the stout chains as easily as if they had been threads", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen 
pl pl"], ["1272-135031-0003", "the little girl had been asleep but she heard the raps and opened the door", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0004", "the king has fled in disgrace and your friends are asking for you", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0005", "i begged ruggedo long ago to send him away but he would not do so", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0006", "i also offered to help your brother to escape but he would not go", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0007", "he eats and sleeps very steadily replied the new king", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0008", "i hope he doesn't work too hard said shaggy", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0009", "he doesn't work at all", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0010", "in fact there is nothing he can do in these dominions as well as our nomes whose numbers are so great that it worries us to keep them all busy", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0011", "not exactly returned kaliko", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist 
histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0012", "where is my brother now", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0013", "inquired shaggy in the metal forest", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0014", "where is that", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0015", "the metal forest is in the great domed cavern the largest in all our dominions replied kaliko", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0016", "kaliko hesitated", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0017", "however if we look sharp we may be able to discover one of these secret ways", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0018", "oh no i'm quite sure he didn't", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0019", "that's funny remarked betsy thoughtfully", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0020", "i don't believe ann knew any magic or she'd have worked it before", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0021", "i do not know confessed shaggy", " Mar MarSSSersersersersersersersersersi 
hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0022", "true agreed kaliko", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-135031-0023", "kaliko went to the big gong and pounded on it just as ruggedo used to do but no one answered the summons", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0000", "a man said to the universe sir i exist", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0001", "sweat covered brion's body trickling into the tight loincloth that was the only garment he wore", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0003", "his instant of panic was followed by a small sharp blow high on his chest", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0004", "one minute a voice said and the time buzzer sounded", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0005", "a minute is not a very large measure of time and his body needed every fraction of it", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0006", "the buzzer's whirr triggered his muscles into complete relaxation", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0007", "only his heart and lungs worked on at a strong 
measured rate", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0008", "he was in reverie sliding along the borders of consciousness", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0009", "the contestants in the twenties needed undisturbed rest therefore nights in the dormitories were as quiet as death", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0010", "particularly so on this last night when only two of the little cubicles were occupied the thousands of others standing with dark empty doors", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0011", "the other voice snapped with a harsh urgency clearly used to command", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0012", "i'm here because the matter is of utmost importance and brandd is the one i must see now stand aside", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0013", "the twenties", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0014", "he must have drawn his gun because the intruder said quickly put that away you're being a fool out", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0015", "there was silence then and still wondering brion was once more asleep", " Mar 
MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0016", "ten seconds", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0017", "he asked the handler who was kneading his aching muscles", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0018", "a red haired mountain of a man with an apparently inexhaustible store of energy", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0019", "there could be little art in this last and final round of fencing", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0020", "just thrust and parry and victory to the stronger", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0021", "every man who entered the twenties had his own training tricks", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0022", "there appeared to be an immediate association with the death trauma as if the two were inextricably linked into one", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0023", "the strength that enables someone in a trance to hold his body stiff and unsupported except at two points the head and heels", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], 
["1272-141231-0024", "this is physically impossible when conscious", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0025", "others had died before during the twenties and death during the last round was in some ways easier than defeat", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0026", "breathing deeply brion softly spoke the auto hypnotic phrases that triggered the process", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0027", "when the buzzer sounded he pulled his foil from his second's startled grasp and ran forward", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"], ["1272-141231-0028", "irolg looked amazed at the sudden fury of the attack then smiled", " Mar MarSSSersersersersersersersersersi hist hist hist hist hist hist hist hist hist hist hist histepepepepep\ufffdhenhenhenhen pl pl"]]}
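Both eval tables show the same pattern: at step 0 the untrained model emits the same repetitive token noise as `beam_1` for every utterance, sharing no words with `label_str`. A rough sketch of checking that overlap; `word_overlap` is a hypothetical helper for illustration, not a real WER metric:

```python
def word_overlap(ref: str, hyp: str) -> float:
    """Fraction of reference words that also appear in the hypothesis."""
    ref_words, hyp_words = set(ref.split()), set(hyp.split())
    if not ref_words:
        return 0.0
    return len(ref_words & hyp_words) / len(ref_words)

# One row from the table: reference transcript vs. step-0 beam output.
ref = "mister quilter is the apostle of the middle classes and we are glad to welcome his gospel"
hyp = "Mar MarSSSersersersersersersersersersi hist hist histepepepepep"
print(word_overlap(ref, hyp))  # 0.0
```

An overlap of 0.0 on every row is expected here: the checkpoint was saved before any training steps, so the decoder has learned nothing yet.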
wandb/run-20220529_120458-qf2iwkac/files/output.log ADDED
@@ -0,0 +1,690 @@
05/29/2022 12:05:01 - INFO - __main__ - Training/evaluation parameters FlaxSeq2SeqTrainingArguments(
_n_gpu=-1,
adafactor=False,
adam_beta1=0.9,
adam_beta2=0.999,
adam_epsilon=1e-08,
bf16=False,
bf16_full_eval=False,
data_seed=None,
dataloader_drop_last=False,
dataloader_num_workers=0,
dataloader_pin_memory=True,
ddp_bucket_cap_mb=None,
ddp_find_unused_parameters=None,
debug=,
deepspeed=None,
disable_tqdm=None,
do_eval=True,
do_predict=True,
do_train=True,
eval_accumulation_steps=None,
eval_delay=0,
eval_steps=5,
evaluation_strategy=no,
final_generation_max_length=40,
final_generation_num_beams=1,
fp16=False,
fp16_backend=auto,
fp16_full_eval=False,
fp16_opt_level=O1,
generation_length_penalty=1,
generation_max_length=40,
generation_num_beams=1,
gradient_accumulation_steps=1,
gradient_checkpointing=False,
greater_is_better=None,
group_by_length=False,
half_precision_backend=auto,
hub_model_id=None,
hub_strategy=every_save,
hub_token=<HUB_TOKEN>,
ignore_data_skip=False,
label_names=None,
label_smoothing_factor=0.0,
learning_rate=0.0003,
length_column_name=input_length,
load_best_model_at_end=False,
local_rank=-1,
log_level=passive,
log_level_replica=passive,
log_on_each_node=True,
logging_dir=None,
logging_first_step=False,
logging_nan_inf_filter=True,
logging_steps=1,
logging_strategy=steps,
lr_scheduler_type=linear,
matmul_precision=default,
max_grad_norm=1.0,
max_steps=10,
metric_for_best_model=None,
mp_parameters=,
no_cuda=False,
num_train_epochs=3.0,
optim=adamw_hf,
output_dir=./,
overwrite_output_dir=True,
past_index=-1,
per_device_eval_batch_size=2,
per_device_train_batch_size=2,
precision=full,
predict_with_generate=True,
prediction_loss_only=False,
push_to_hub=True,
push_to_hub_model_id=None,
push_to_hub_organization=None,
push_to_hub_token=<PUSH_TO_HUB_TOKEN>,
remove_unused_columns=True,
report_to=None,
resume_from_checkpoint=None,
run_name=None,
save_on_each_node=False,
save_steps=5,
save_strategy=steps,
save_total_limit=1,
seed=42,
sharded_ddp=,
skip_memory_metrics=True,
sortish_sampler=False,
tf32=None,
tpu_metrics_debug=False,
tpu_num_cores=None,
use_legacy_prediction_loop=False,
warmup_ratio=0.0,
warmup_steps=500,
weight_decay=0.0,
xpu_backend=None,
)
05/29/2022 12:05:01 - INFO - __main__ - JAX devices: 8, matmul precision: default
05/29/2022 12:05:02 - WARNING - datasets.builder - Reusing dataset librispeech_asr (/home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b)
05/29/2022 12:05:02 - WARNING - datasets.builder - Reusing dataset librispeech_asr (/home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b)
loading configuration file ./config.json
You passed along `num_labels=3` with an incompatible id to label map: {'0': 'LABEL_0', '1': 'LABEL_1'}. The number of labels wil be overwritten to 2.
Model config SpeechEncoderDecoderConfig {
  "_name_or_path": "./",
  "architectures": [
    "SpeechEncoderDecoderModel"
  ],
  "decoder": {
    "_name_or_path": "",
    "activation_dropout": 0.0,
    "activation_function": "gelu",
    "add_cross_attention": true,
    "architectures": null,
    "attention_dropout": 0.1,
    "bad_words_ids": null,
    "bos_token_id": 0,
    "chunk_size_feed_forward": 0,
    "classifier_dropout": 0.0,
    "cross_attention_hidden_size": null,
    "d_model": 16,
    "decoder_attention_heads": 4,
    "decoder_ffn_dim": 4,
    "decoder_layerdrop": 0.0,
    "decoder_layers": 2,
    "decoder_start_token_id": 2,
    "diversity_penalty": 0.0,
    "do_sample": false,
    "dropout": 0.1,
    "early_stopping": false,
    "encoder_attention_heads": 4,
    "encoder_ffn_dim": 4,
    "encoder_layerdrop": 0.0,
    "encoder_layers": 2,
    "encoder_no_repeat_ngram_size": 0,
    "eos_token_id": 2,
    "exponential_decay_length_penalty": null,
    "finetuning_task": null,
    "forced_bos_token_id": null,
    "forced_eos_token_id": 2,
    "fuse_matmuls": false,
    "gradient_checkpointing": false,
    "id2label": {
      "0": "LABEL_0",
      "1": "LABEL_1"
    },
    "init_std": 0.02,
    "is_decoder": true,
    "is_encoder_decoder": false,
    "label2id": {
      "LABEL_0": 0,
      "LABEL_1": 1
    },
    "length_penalty": 1.0,
    "max_length": 20,
    "max_position_embeddings": 100,
    "min_length": 0,
    "model_type": "bart",
    "no_repeat_ngram_size": 0,
    "num_beam_groups": 1,
    "num_beams": 1,
    "num_hidden_layers": 2,
    "num_return_sequences": 1,
    "output_attentions": false,
    "output_hidden_states": false,
    "output_scores": false,
    "pad_token_id": 1,
    "prefix": null,
    "problem_type": null,
    "pruned_heads": {},
    "remove_invalid_values": false,
    "repetition_penalty": 1.0,
    "return_dict": true,
    "return_dict_in_generate": false,
    "scale_embedding": false,
    "sep_token_id": null,
    "task_specific_params": null,
    "temperature": 1.0,
    "tie_encoder_decoder": false,
    "tie_word_embeddings": true,
    "tokenizer_class": null,
    "top_k": 50,
    "top_p": 1.0,
    "torch_dtype": null,
    "torchscript": false,
    "transformers_version": "4.18.0.dev0",
    "typical_p": 1.0,
    "use_bfloat16": false,
    "use_cache": true,
    "use_scan": true,
    "vocab_size": 1000
  },
  "decoder_start_token_id": 0,
  "encoder": {
    "_name_or_path": "",
    "activation_dropout": 0.1,
    "adapter_kernel_size": 3,
    "adapter_stride": 2,
    "add_adapter": true,
    "add_cross_attention": false,
    "apply_spec_augment": true,
    "architectures": null,
    "attention_dropout": 0.1,
    "bad_words_ids": null,
    "bos_token_id": 1,
    "chunk_size_feed_forward": 0,
    "classifier_proj_size": 256,
    "codevector_dim": 256,
    "contrastive_logits_temperature": 0.1,
    "conv_bias": false,
    "conv_dim": [
      32,
      32,
      32
    ],
    "conv_kernel": [
      8,
      8,
      8
    ],
    "conv_stride": [
      4,
      4,
      4
    ],
    "cross_attention_hidden_size": null,
    "ctc_loss_reduction": "sum",
    "ctc_zero_infinity": false,
    "decoder_start_token_id": null,
    "diversity_loss_weight": 0.1,
    "diversity_penalty": 0.0,
    "do_sample": false,
    "do_stable_layer_norm": true,
    "early_stopping": false,
    "encoder_no_repeat_ngram_size": 0,
    "eos_token_id": 2,
    "exponential_decay_length_penalty": null,
    "feat_extract_activation": "gelu",
    "feat_extract_dropout": 0.0,
    "feat_extract_norm": "layer",
    "feat_proj_dropout": 0.0,
    "feat_quantizer_dropout": 0.0,
    "final_dropout": 0.0,
    "finetuning_task": null,
    "forced_bos_token_id": null,
    "forced_eos_token_id": null,
    "fuse_matmuls": false,
    "gradient_checkpointing": false,
    "hidden_act": "gelu",
    "hidden_dropout": 0.1,
    "hidden_dropout_prob": 0.1,
    "hidden_size": 16,
    "id2label": {
      "0": "LABEL_0",
      "1": "LABEL_1"
    },
    "initializer_range": 0.02,
    "intermediate_size": 20,
    "is_decoder": false,
    "is_encoder_decoder": false,
    "label2id": {
      "LABEL_0": 0,
      "LABEL_1": 1
    },
    "layer_norm_eps": 1e-05,
    "layerdrop": 0.0,
    "length_penalty": 1.0,
    "mask_feature_length": 10,
    "mask_feature_min_masks": 0,
    "mask_feature_prob": 0.0,
    "mask_time_length": 10,
    "mask_time_min_masks": 2,
    "mask_time_prob": 0.1,
    "max_length": 20,
    "min_length": 0,
    "model_type": "wav2vec2",
    "no_repeat_ngram_size": 0,
    "num_adapter_layers": 3,
    "num_attention_heads": 2,
    "num_beam_groups": 1,
    "num_beams": 1,
    "num_codevector_groups": 2,
    "num_codevectors_per_group": 320,
    "num_conv_pos_embedding_groups": 2,
    "num_conv_pos_embeddings": 16,
    "num_feat_extract_layers": 3,
    "num_hidden_layers": 4,
    "num_negatives": 10,
    "num_return_sequences": 1,
    "output_attentions": false,
    "output_hidden_size": 16,
    "output_hidden_states": false,
    "output_scores": false,
    "pad_token_id": 0,
    "prefix": null,
    "problem_type": null,
    "proj_codevector_dim": 256,
    "pruned_heads": {},
    "remove_invalid_values": false,
    "repetition_penalty": 1.0,
    "return_dict": true,
    "return_dict_in_generate": false,
    "sep_token_id": null,
    "task_specific_params": null,
    "tdnn_dilation": [
      1,
      2,
      3,
      1,
      1
    ],
    "tdnn_dim": [
      512,
      512,
      512,
      512,
      1500
    ],
    "tdnn_kernel": [
      5,
      3,
      3,
      1,
      1
    ],
    "temperature": 1.0,
    "tie_encoder_decoder": false,
    "tie_word_embeddings": true,
    "tokenizer_class": null,
    "top_k": 50,
    "top_p": 1.0,
    "torch_dtype": null,
    "torchscript": false,
    "transformers_version": "4.18.0.dev0",
    "typical_p": 1.0,
    "use_bfloat16": false,
    "use_scan": true,
    "use_weighted_layer_sum": false,
    "vocab_size": 32,
    "xvector_output_dim": 512
  },
  "eos_token_id": 2,
  "is_encoder_decoder": true,
  "max_length": 40,
  "model_type": "speech-encoder-decoder",
  "pad_token_id": 1,
  "processor_class": "Wav2Vec2Processor",
  "tie_word_embeddings": false,
  "transformers_version": null,
  "use_cache": false
}
loading feature extractor configuration file ./preprocessor_config.json
Feature extractor Wav2Vec2FeatureExtractor {
  "do_normalize": true,
  "feature_extractor_type": "Wav2Vec2FeatureExtractor",
  "feature_size": 1,
  "padding_side": "right",
  "padding_value": 0.0,
  "return_attention_mask": false,
  "sampling_rate": 16000
}
Didn't find file ./added_tokens.json. We won't load it.
loading file ./vocab.json
loading file ./merges.txt
loading file ./tokenizer.json
loading file None
loading file ./special_tokens_map.json
loading file ./tokenizer_config.json
loading weights file ./flax_model.msgpack
05/29/2022 12:05:03 - WARNING - datasets.builder - Reusing dataset librispeech_asr (/home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b)
05/29/2022 12:05:04 - WARNING - datasets.builder - Reusing dataset librispeech_asr (/home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b)
All model checkpoint weights were used when initializing FlaxSpeechEncoderDecoderModel.
All the weights of FlaxSpeechEncoderDecoderModel were initialized from the model checkpoint at ./.
If your task is similar to the task the model of the checkpoint was trained on, you can already use FlaxSpeechEncoderDecoderModel for predictions without further training.
05/29/2022 12:05:14 - WARNING - datasets.arrow_dataset - Loading cached processed dataset at /home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b/cache-c95b537ee89f995b.arrow
05/29/2022 12:05:14 - WARNING - datasets.arrow_dataset - Loading cached processed dataset at /home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b/cache-c95b537ee89f995b.arrow
05/29/2022 12:05:14 - WARNING - datasets.arrow_dataset - Loading cached processed dataset at /home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b/cache-19130a8d7ad2d887.arrow
05/29/2022 12:05:14 - WARNING - datasets.arrow_dataset - Loading cached processed dataset at /home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b/cache-e60ab7f8cad7e7f8.arrow
05/29/2022 12:05:14 - WARNING - datasets.arrow_dataset - Loading cached processed dataset at /home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b/cache-3eb89567438f0a88.arrow
05/29/2022 12:05:14 - WARNING - datasets.arrow_dataset - Loading cached processed dataset at /home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b/cache-3eb89567438f0a88.arrow
05/29/2022 12:05:14 - WARNING - datasets.arrow_dataset - Loading cached processed dataset at /home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b/cache-59ebfc33c650e7fb.arrow
05/29/2022 12:05:14 - WARNING - datasets.arrow_dataset - Loading cached processed dataset at /home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b/cache-ba50e9f3ac5dbfae.arrow
05/29/2022 12:05:14 - WARNING - datasets.arrow_dataset - Loading cached processed dataset at /home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b/cache-8426f0fa10476a9c.arrow
05/29/2022 12:05:14 - WARNING - datasets.arrow_dataset - Loading cached processed dataset at /home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b/cache-8426f0fa10476a9c.arrow
05/29/2022 12:05:14 - WARNING - datasets.arrow_dataset - Loading cached processed dataset at /home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b/cache-8c7436c2e47ec832.arrow
05/29/2022 12:05:14 - WARNING - datasets.arrow_dataset - Loading cached processed dataset at /home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b/cache-85fa3b40218279d5.arrow
05/29/2022 12:05:14 - WARNING - datasets.arrow_dataset - Loading cached processed dataset at /home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b/cache-a8160ccb9fa21e31.arrow
05/29/2022 12:05:14 - WARNING - datasets.arrow_dataset - Loading cached processed dataset at /home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b/cache-a8160ccb9fa21e31.arrow
05/29/2022 12:05:14 - WARNING - datasets.arrow_dataset - Loading cached processed dataset at /home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b/cache-895a5734fe2db613.arrow
05/29/2022 12:05:14 - WARNING - datasets.arrow_dataset - Loading cached processed dataset at /home/sanchitgandhi/cache/huggingface/datasets/hf-internal-testing___librispeech_asr/clean/2.1.0/d3bc4c2bc2078fcde3ad0f0f635862e4c0fef78ba94c4a34c4c250a097af240b/cache-f3e16e20d9ec29fb.arrow
Feature extractor saved in ./preprocessor_config.json
tokenizer config file saved in ./tokenizer_config.json
Special tokens file saved in ./special_tokens_map.json
Configuration saved in ./config.json
loading feature extractor configuration file ./preprocessor_config.json
loading configuration file ./config.json
You passed along `num_labels=3` with an incompatible id to label map: {'0': 'LABEL_0', '1': 'LABEL_1'}. The number of labels wil be overwritten to 2.
Model config SpeechEncoderDecoderConfig {
  "_name_or_path": "./",
  "architectures": [
    "SpeechEncoderDecoderModel"
  ],
  "decoder": {
    "_name_or_path": "",
    "activation_dropout": 0.0,
    "activation_function": "gelu",
    "add_cross_attention": true,
    "architectures": null,
    "attention_dropout": 0.1,
    "bad_words_ids": null,
    "bos_token_id": 0,
    "chunk_size_feed_forward": 0,
    "classifier_dropout": 0.0,
    "cross_attention_hidden_size": null,
    "d_model": 16,
    "decoder_attention_heads": 4,
    "decoder_ffn_dim": 4,
    "decoder_layerdrop": 0.0,
    "decoder_layers": 2,
    "decoder_start_token_id": 2,
    "diversity_penalty": 0.0,
    "do_sample": false,
    "dropout": 0.1,
    "early_stopping": false,
    "encoder_attention_heads": 4,
    "encoder_ffn_dim": 4,
    "encoder_layerdrop": 0.0,
    "encoder_layers": 2,
    "encoder_no_repeat_ngram_size": 0,
    "eos_token_id": 2,
    "exponential_decay_length_penalty": null,
    "finetuning_task": null,
    "forced_bos_token_id": null,
    "forced_eos_token_id": 2,
    "fuse_matmuls": false,
    "gradient_checkpointing": false,
    "id2label": {
      "0": "LABEL_0",
      "1": "LABEL_1"
    },
    "init_std": 0.02,
    "is_decoder": true,
    "is_encoder_decoder": false,
    "label2id": {
      "LABEL_0": 0,
      "LABEL_1": 1
    },
    "length_penalty": 1.0,
    "max_length": 20,
    "max_position_embeddings": 100,
    "min_length": 0,
    "model_type": "bart",
    "no_repeat_ngram_size": 0,
    "num_beam_groups": 1,
    "num_beams": 1,
    "num_hidden_layers": 2,
    "num_return_sequences": 1,
    "output_attentions": false,
    "output_hidden_states": false,
    "output_scores": false,
    "pad_token_id": 1,
    "prefix": null,
    "problem_type": null,
    "pruned_heads": {},
    "remove_invalid_values": false,
    "repetition_penalty": 1.0,
    "return_dict": true,
    "return_dict_in_generate": false,
    "scale_embedding": false,
    "sep_token_id": null,
    "task_specific_params": null,
    "temperature": 1.0,
    "tie_encoder_decoder": false,
    "tie_word_embeddings": true,
    "tokenizer_class": null,
    "top_k": 50,
    "top_p": 1.0,
    "torch_dtype": null,
    "torchscript": false,
    "transformers_version": "4.18.0.dev0",
    "typical_p": 1.0,
    "use_bfloat16": false,
    "use_cache": true,
    "use_scan": true,
    "vocab_size": 1000
  },
  "decoder_start_token_id": 0,
  "encoder": {
    "_name_or_path": "",
    "activation_dropout": 0.1,
    "adapter_kernel_size": 3,
    "adapter_stride": 2,
    "add_adapter": true,
    "add_cross_attention": false,
    "apply_spec_augment": true,
    "architectures": null,
    "attention_dropout": 0.1,
    "bad_words_ids": null,
    "bos_token_id": 1,
    "chunk_size_feed_forward": 0,
    "classifier_proj_size": 256,
    "codevector_dim": 256,
    "contrastive_logits_temperature": 0.1,
    "conv_bias": false,
    "conv_dim": [
      32,
      32,
      32
    ],
    "conv_kernel": [
      8,
      8,
      8
    ],
    "conv_stride": [
      4,
      4,
      4
    ],
    "cross_attention_hidden_size": null,
    "ctc_loss_reduction": "sum",
    "ctc_zero_infinity": false,
    "decoder_start_token_id": null,
    "diversity_loss_weight": 0.1,
    "diversity_penalty": 0.0,
    "do_sample": false,
    "do_stable_layer_norm": true,
    "early_stopping": false,
    "encoder_no_repeat_ngram_size": 0,
    "eos_token_id": 2,
    "exponential_decay_length_penalty": null,
    "feat_extract_activation": "gelu",
    "feat_extract_dropout": 0.0,
    "feat_extract_norm": "layer",
    "feat_proj_dropout": 0.0,
    "feat_quantizer_dropout": 0.0,
    "final_dropout": 0.0,
    "finetuning_task": null,
    "forced_bos_token_id": null,
    "forced_eos_token_id": null,
    "fuse_matmuls": false,
    "gradient_checkpointing": false,
    "hidden_act": "gelu",
    "hidden_dropout": 0.1,
    "hidden_dropout_prob": 0.1,
    "hidden_size": 16,
    "id2label": {
      "0": "LABEL_0",
      "1": "LABEL_1"
    },
    "initializer_range": 0.02,
    "intermediate_size": 20,
    "is_decoder": false,
    "is_encoder_decoder": false,
    "label2id": {
      "LABEL_0": 0,
      "LABEL_1": 1
    },
    "layer_norm_eps": 1e-05,
    "layerdrop": 0.0,
    "length_penalty": 1.0,
    "mask_feature_length": 10,
    "mask_feature_min_masks": 0,
    "mask_feature_prob": 0.0,
    "mask_time_length": 10,
    "mask_time_min_masks": 2,
    "mask_time_prob": 0.1,
    "max_length": 20,
    "min_length": 0,
    "model_type": "wav2vec2",
    "no_repeat_ngram_size": 0,
    "num_adapter_layers": 3,
    "num_attention_heads": 2,
    "num_beam_groups": 1,
    "num_beams": 1,
    "num_codevector_groups": 2,
    "num_codevectors_per_group": 320,
    "num_conv_pos_embedding_groups": 2,
    "num_conv_pos_embeddings": 16,
    "num_feat_extract_layers": 3,
    "num_hidden_layers": 4,
    "num_negatives": 10,
    "num_return_sequences": 1,
    "output_attentions": false,
    "output_hidden_size": 16,
    "output_hidden_states": false,
    "output_scores": false,
    "pad_token_id": 0,
    "prefix": null,
    "problem_type": null,
    "proj_codevector_dim": 256,
    "pruned_heads": {},
    "remove_invalid_values": false,
    "repetition_penalty": 1.0,
    "return_dict": true,
    "return_dict_in_generate": false,
    "sep_token_id": null,
    "task_specific_params": null,
    "tdnn_dilation": [
      1,
      2,
      3,
      1,
      1
    ],
    "tdnn_dim": [
      512,
      512,
      512,
      512,
      1500
    ],
    "tdnn_kernel": [
      5,
      3,
      3,
      1,
      1
    ],
    "temperature": 1.0,
    "tie_encoder_decoder": false,
    "tie_word_embeddings": true,
    "tokenizer_class": null,
    "top_k": 50,
    "top_p": 1.0,
    "torch_dtype": null,
    "torchscript": false,
    "transformers_version": "4.18.0.dev0",
    "typical_p": 1.0,
    "use_bfloat16": false,
    "use_scan": true,
    "use_weighted_layer_sum": false,
    "vocab_size": 32,
    "xvector_output_dim": 512
  },
  "eos_token_id": 2,
  "is_encoder_decoder": true,
  "max_length": 40,
  "model_type": "speech-encoder-decoder",
  "pad_token_id": 1,
  "processor_class": "Wav2Vec2Processor",
  "tie_word_embeddings": false,
  "transformers_version": null,
  "use_cache": false
}
loading feature extractor configuration file ./preprocessor_config.json
Feature extractor Wav2Vec2FeatureExtractor {
  "do_normalize": true,
  "feature_extractor_type": "Wav2Vec2FeatureExtractor",
  "feature_size": 1,
  "padding_side": "right",
  "padding_value": 0.0,
  "return_attention_mask": false,
  "sampling_rate": 16000
}
Didn't find file ./added_tokens.json. We won't load it.
loading file ./vocab.json
loading file ./merges.txt
loading file ./tokenizer.json
loading file None
loading file ./special_tokens_map.json
loading file ./tokenizer_config.json
2022-05-29 12:05:15.311076: W tensorflow/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libcuda.so.1'; dlerror: libcuda.so.1: cannot open shared object file: No such file or directory
2022-05-29 12:05:15.311124: W tensorflow/stream_executor/cuda/cuda_driver.cc:269] failed call to cuInit: UNKNOWN ERROR (303)
/home/sanchitgandhi/flax-dummy/./ is already a clone of https://huggingface.co/sanchit-gandhi/flax-dummy. Make sure you pull the latest changes with `repo.git_pull()`.
05/29/2022 12:05:16 - WARNING - huggingface_hub.repository - /home/sanchitgandhi/flax-dummy/./ is already a clone of https://huggingface.co/sanchit-gandhi/flax-dummy. Make sure you pull the latest changes with `repo.git_pull()`.
Epoch ... (1/3): 0%| | 0/3 [00:00<?, ?it/s]
Training...: 0%| | 0/4 [00:00<?, ?it/s]
05/29/2022 12:05:17 - INFO - __main__ - ***** Running training *****
05/29/2022 12:05:17 - INFO - __main__ - Num examples = 68
05/29/2022 12:05:17 - INFO - __main__ - Num Epochs = 3
05/29/2022 12:05:17 - INFO - __main__ - Instantaneous batch size per device = 2
05/29/2022 12:05:17 - INFO - __main__ - Num gradient accumulation steps = 1
05/29/2022 12:05:17 - INFO - __main__ - Total train batch size (w. parallel & distributed) = 16
05/29/2022 12:05:17 - INFO - __main__ - Total optimization steps = 10
05/29/2022 12:05:17 - INFO - __main__ - Gradient checkpointing: False
05/29/2022 12:05:17 - INFO - __main__ - Use scan: True

Training...: 25%|██████████████████ | 1/4 [00:24<01:14, 24.71s/it]

Training...: 50%|████████████████████████████████████ | 2/4 [00:46<00:45, 22.72s/it]
Step... (2 | Loss: 6.890774726867676, Learning Rate: 6.00004568696022e-07, Gradient Norm: 0.3177940249443054)
Step... (3 | Loss: 6.900632858276367, Learning Rate: 1.200009137392044e-06, Gradient Norm: 0.26289182901382446)
Step... (4 | Loss: 6.901595592498779, Learning Rate: 1.7999846022576094e-06, Gradient Norm: 0.28948840498924255)
Epoch ... (1/3): 33%|██████████████████████▋ | 1/3 [01:10<02:18, 69.35s/it]
Training...: 0%| | 0/4 [00:00<?, ?it/s]


Evaluating ...: 50%|██████████████████████████████████▌ | 2/4 [00:44<00:43, 21.55s/it]
wandb/run-20220529_120458-qf2iwkac/files/requirements.txt
ADDED
@@ -0,0 +1,187 @@
absl-py==1.0.0
aiohttp==3.8.1
aiosignal==1.2.0
anyio==3.5.0
appdirs==1.4.4
argon2-cffi-bindings==21.2.0
argon2-cffi==21.3.0
asttokens==2.0.5
astunparse==1.6.3
async-timeout==4.0.2
attrs==21.4.0
audioread==2.1.9
babel==2.9.1
backcall==0.2.0
beautifulsoup4==4.10.0
bleach==4.1.0
cachetools==5.0.0
certifi==2021.10.8
cffi==1.15.0
charset-normalizer==2.0.12
chex==0.1.1
click==8.1.0
cycler==0.11.0
datasets==2.1.1.dev0
debugpy==1.6.0
decorator==5.1.1
defusedxml==0.7.1
dill==0.3.4
dm-tree==0.1.6
docker-pycreds==0.4.0
entrypoints==0.4
executing==0.8.3
filelock==3.6.0
flatbuffers==2.0
flax==0.4.1
fonttools==4.31.2
frozenlist==1.3.0
fsspec==2022.2.0
gast==0.5.3
gitdb==4.0.9
gitpython==3.1.27
google-auth-oauthlib==0.4.6
google-auth==2.6.2
google-pasta==0.2.0
grpcio==1.44.0
h5py==3.6.0
huggingface-hub==0.4.0
hypothesis==6.46.7
idna==3.3
importlib-metadata==4.11.3
importlib-resources==5.6.0
iniconfig==1.1.1
ipdb==0.13.9
ipykernel==6.10.0
ipython-genutils==0.2.0
ipython==8.2.0
jax==0.3.4
jaxlib==0.3.2
jedi==0.18.1
jinja2==3.1.1
jiwer==2.3.0
joblib==1.1.0
json5==0.9.6
jsonschema==4.4.0
jupyter-client==7.2.1
jupyter-core==4.9.2
jupyter-server==1.16.0
jupyterlab-pygments==0.1.2
jupyterlab-server==2.12.0
jupyterlab==3.3.2
kenlm==0.0.0
keras-preprocessing==1.1.2
keras==2.8.0
kiwisolver==1.4.2
libclang==13.0.0
librosa==0.9.1
libtpu-nightly==0.1.dev20220315
llvmlite==0.38.0
markdown==3.3.6
markupsafe==2.1.1
matplotlib-inline==0.1.3
matplotlib==3.5.1
mistune==0.8.4
msgpack==1.0.3
multidict==6.0.2
multiprocess==0.70.12.2
nbclassic==0.3.7
nbclient==0.5.13
nbconvert==6.4.5
nbformat==5.2.0
nest-asyncio==1.5.4
notebook-shim==0.1.0
notebook==6.4.10
numba==0.55.1
numpy==1.21.0
oauthlib==3.2.0
opt-einsum==3.3.0
optax==0.1.1
packaging==21.3
pandas==1.4.1
pandocfilters==1.5.0
parso==0.8.3
pathtools==0.1.2
pexpect==4.8.0
pickleshare==0.7.5
pillow==9.0.1
pip==20.0.2
pkg-resources==0.0.0
pluggy==1.0.0
pooch==1.6.0
prometheus-client==0.13.1
promise==2.3
prompt-toolkit==3.0.28
protobuf==3.19.4
psutil==5.9.0
ptyprocess==0.7.0
pure-eval==0.2.2
py==1.11.0
pyarrow==7.0.0
pyasn1-modules==0.2.8
pyasn1==0.4.8
pycparser==2.21
pyctcdecode==0.3.0
pygments==2.11.2
pygtrie==2.4.2
pyparsing==3.0.7
pyrsistent==0.18.1
pytest==7.1.2
python-dateutil==2.8.2
python-levenshtein==0.12.2
pytz==2022.1
pyyaml==6.0
pyzmq==22.3.0
regex==2022.3.15
requests-oauthlib==1.3.1
requests==2.27.1
resampy==0.2.2
responses==0.18.0
rsa==4.8
sacremoses==0.0.49
scikit-learn==1.0.2
scipy==1.8.0
send2trash==1.8.0
sentry-sdk==1.5.8
setproctitle==1.2.2
setuptools==44.0.0
shortuuid==1.0.8
six==1.16.0
smmap==5.0.0
sniffio==1.2.0
sortedcontainers==2.4.0
soundfile==0.10.3.post1
soupsieve==2.3.1
stack-data==0.2.0
tensorboard-data-server==0.6.1
tensorboard-plugin-wit==1.8.1
tensorboard==2.8.0
tensorflow-io-gcs-filesystem==0.24.0
tensorflow==2.8.0
termcolor==1.1.0
terminado==0.13.3
testpath==0.6.0
tf-estimator-nightly==2.8.0.dev2021122109
threadpoolctl==3.1.0
tokenizers==0.11.6
toml==0.10.2
tomli==2.0.1
toolz==0.11.2
torch==1.11.0+cpu
torchaudio==0.11.0+cpu
tornado==6.1
tqdm==4.63.1
traitlets==5.1.1
transformers==4.18.0.dev0
typing-extensions==4.1.1
urllib3==1.26.9
wandb==0.12.11
wcwidth==0.2.5
webencodings==0.5.1
websocket-client==1.3.2
werkzeug==2.1.0
wheel==0.37.1
wrapt==1.14.0
xxhash==3.0.0
yarl==1.7.2
yaspin==2.1.0
zipp==3.7.0
wandb/run-20220529_120458-qf2iwkac/files/wandb-metadata.json
ADDED
@@ -0,0 +1,58 @@
{
    "os": "Linux-5.4.0-1043-gcp-x86_64-with-glibc2.29",
    "python": "3.8.10",
    "heartbeatAt": "2022-05-29T12:05:01.566960",
    "startedAt": "2022-05-29T12:04:58.499121",
    "docker": null,
    "cpu_count": 96,
    "cuda": null,
    "args": [
        "--dataset_name=hf-internal-testing/librispeech_asr_dummy",
        "--model_name_or_path=./",
        "--dataset_config_name=clean",
        "--train_split_name=validation",
        "--eval_split_name=validation",
        "--test_split_name=validation[:90%]+validation[:95%]",
        "--output_dir=./",
        "--dataset_cache_dir=/home/sanchitgandhi/cache/huggingface/datasets",
        "--preprocessing_num_workers=1",
        "--length_column_name=input_length",
        "--overwrite_output_dir",
        "--max_steps=10",
        "--eval_steps=5",
        "--save_steps=5",
        "--per_device_train_batch_size=2",
        "--per_device_eval_batch_size=2",
        "--logging_steps=1",
        "--max_duration_in_seconds=15",
        "--max_target_length=64",
        "--generation_max_length=40",
        "--generation_num_beams=1",
        "--final_generation_max_length=40",
        "--final_generation_num_beams=1",
        "--learning_rate=3e-4",
        "--warmup_steps=500",
        "--text_column_name=text",
        "--save_total_limit=1",
        "--wandb_project=flax-wav2vec2-2-bart-dummy",
        "--freeze_feature_encoder",
        "--predict_with_generate",
        "--do_lower_case",
        "--do_train",
        "--do_eval",
        "--do_predict",
        "--push_to_hub"
    ],
    "state": "running",
    "program": "run_flax_speech_recognition_seq2seq.py",
    "codePath": "run_flax_speech_recognition_seq2seq.py",
    "git": {
        "remote": "https://huggingface.co/sanchit-gandhi/flax-dummy",
        "commit": "f6cca39fc53952ef9f6e2aaf1b660ddf4161b05d"
    },
    "email": "sanchit@huggingface.co",
    "root": "/home/sanchitgandhi/flax-dummy",
    "host": "t1v-n-4eb331dd-w-0",
    "username": "sanchitgandhi",
    "executable": "/home/sanchitgandhi/venv/bin/python"
}
wandb/run-20220529_120458-qf2iwkac/files/wandb-summary.json
ADDED
@@ -0,0 +1 @@
{"train/decoder_grad_norm": 0.28948837518692017, "train/decoder_param_norm": 10.9873046875, "train/encoder_grad_norm": 6.325928552541882e-05, "train/encoder_param_norm": 21.972986221313477, "train/grad_norm": 0.28948840498924255, "layer_grad_norm/": {"decoder": {"model": {"decoder": {"embed_positions": {"embedding": 0.1468065232038498}, "embed_tokens": {"embedding": 0.24802552163600922}, "layernorm_embedding": {"bias": 0.006764871999621391, "scale": 0.0036919426638633013}, "layers": {"FlaxBartDecoderLayers": {"encoder_attn": {"k_proj": {"bias": 3.328560029599442e-13, "kernel": 9.95582847933274e-13}, "out_proj": {"bias": 0.009623706340789795, "kernel": 4.8367015551775694e-05}, "q_proj": {"bias": 7.838640760787774e-13, "kernel": 1.0103118428667068e-12}, "v_proj": {"bias": 0.00114067189861089, "kernel": 4.325820555095561e-06}}, "encoder_attn_layer_norm": {"bias": 0.00958589930087328, "scale": 0.005416943226009607}, "fc1": {"bias": 0.00019959310884587467, "kernel": 0.0004716127004940063}, "fc2": {"bias": 0.00993743073195219, "kernel": 0.0003652443701867014}, "final_layer_norm": {"bias": 0.010846015997231007, "scale": 0.005791571922600269}, "self_attn": {"k_proj": {"bias": 1.3708134627421487e-09, "kernel": 4.577025265461998e-06}, "out_proj": {"bias": 0.009274601936340332, "kernel": 0.000999519252218306}, "q_proj": {"bias": 1.7297755903200596e-06, "kernel": 4.314425950724399e-06}, "v_proj": {"bias": 0.0008124791202135384, "kernel": 0.0009584471699781716}}, "self_attn_layer_norm": {"bias": 0.009586097672581673, "scale": 0.005415966734290123}}}}}}, "encoder": {"adapter": {"layers": {"0": {"conv": {"bias": 2.977989197461284e-07, "kernel": 1.3835579011356458e-06}}, "1": {"conv": {"bias": 3.6216874832462054e-06, "kernel": 1.5065093066368718e-06}}, "2": {"conv": {"bias": 6.303114059846848e-05, "kernel": 2.8466049570852192e-06}}}}, "encoder": {"layer_norm": {"bias": 4.9153292991377384e-08, "scale": 4.927479935190604e-08}, "layers": {"FlaxWav2Vec2EncoderLayers": {"attention": 
{"k_proj": {"bias": 1.459917082888187e-15, "kernel": 1.90472235206407e-10}, "out_proj": {"bias": 1.0287843679179787e-06, "kernel": 1.619392975271694e-07}, "q_proj": {"bias": 8.26251983498949e-11, "kernel": 2.237984936259707e-10}, "v_proj": {"bias": 7.108157973334528e-08, "kernel": 1.6080137754670432e-07}}, "feed_forward": {"intermediate_dense": {"bias": 4.810428322343796e-08, "kernel": 1.3616384819670202e-07}, "output_dense": {"bias": 9.441144470656582e-07, "kernel": 1.3445610136386676e-07}}, "final_layer_norm": {"bias": 3.294326367253575e-09, "scale": 2.245429175928848e-09}, "layer_norm": {"bias": 5.97741944829977e-09, "scale": 3.64200114510993e-09}}}, "pos_conv_embed": {"conv": {"bias": 3.8169284266587056e-07, "weight_g": 7.756315589801943e-09, "weight_v": 8.743172230651908e-08}}}, "feature_extractor": {"conv_layers": {"0": {"conv": {"kernel": 0.0}, "layer_norm": {"bias": 0.0, "scale": 0.0}}, "1": {"conv": {"kernel": 0.0}, "layer_norm": {"bias": 0.0, "scale": 0.0}}, "2": {"conv": {"kernel": 0.0}, "layer_norm": {"bias": 0.0, "scale": 0.0}}}}, "feature_projection": {"layer_norm": {"bias": 8.516661864632624e-08, "scale": 9.493159147666574e-09}, "projection": {"bias": 8.456348723484552e-07, "kernel": 6.272103405535745e-07}}, "masked_spec_embed": 0.0}}, "layer_param_norm/": {"decoder": {"model": {"decoder": {"embed_positions": {"embedding": 0.7995925545692444}, "embed_tokens": {"embedding": 2.5083723068237305}, "layernorm_embedding": {"bias": 0.004505096934735775, "scale": 4.000091552734375}, "layers": {"FlaxBartDecoderLayers": {"encoder_attn": {"k_proj": {"bias": 2.66487765188117e-09, "kernel": 0.43237069249153137}, "out_proj": {"bias": 0.006343331653624773, "kernel": 0.46723419427871704}, "q_proj": {"bias": 3.573997986450195e-08, "kernel": 0.4763868451118469}, "v_proj": {"bias": 0.0068108439445495605, "kernel": 0.4630829691886902}}, "encoder_attn_layer_norm": {"bias": 0.006466111168265343, "scale": 5.656971454620361}, "fc1": {"bias": 0.0032308774534612894, "kernel": 
0.2302577644586563}, "fc2": {"bias": 0.00635903887450695, "kernel": 0.228498175740242}, "final_layer_norm": {"bias": 0.00622177729383111, "scale": 5.6571831703186035}, "self_attn": {"k_proj": {"bias": 1.3586953173216898e-05, "kernel": 0.4489402174949646}, "out_proj": {"bias": 0.006346424575895071, "kernel": 0.4596422016620636}, "q_proj": {"bias": 0.005505683831870556, "kernel": 0.43902429938316345}, "v_proj": {"bias": 0.0061093042604625225, "kernel": 0.47398924827575684}}, "self_attn_layer_norm": {"bias": 0.006466289050877094, "scale": 5.6569719314575195}}}}}}, "encoder": {"adapter": {"layers": {"0": {"conv": {"bias": 0.003881996963173151, "kernel": 0.7980067133903503}}, "1": {"conv": {"bias": 0.004839896224439144, "kernel": 0.7824534177780151}}, "2": {"conv": {"bias": 0.0047243074513971806, "kernel": 0.793502151966095}}}}, "encoder": {"layer_norm": {"bias": 0.0014691534452140331, "scale": 4.000521659851074}, "layers": {"FlaxWav2Vec2EncoderLayers": {"attention": {"k_proj": {"bias": 1.1641312219756728e-10, "kernel": 0.6511669158935547}, "out_proj": {"bias": 0.004517045803368092, "kernel": 0.6515875458717346}, "q_proj": {"bias": 9.10918606678024e-06, "kernel": 0.636457085609436}, "v_proj": {"bias": 0.00288462289609015, "kernel": 0.6421586871147156}}, "feed_forward": {"intermediate_dense": {"bias": 0.0023183180019259453, "kernel": 0.7063446640968323}, "output_dense": {"bias": 0.004766375757753849, "kernel": 0.7153233885765076}}, "final_layer_norm": {"bias": 0.00032824758091010153, "scale": 7.999977111816406}, "layer_norm": {"bias": 0.0006391080096364021, "scale": 8.000051498413086}}}, "pos_conv_embed": {"conv": {"bias": 0.0019932836294174194, "weight_g": 2.2777116298675537, "weight_v": 2.2775819301605225}}}, "feature_extractor": {"conv_layers": {"0": {"conv": {"kernel": 7.867799758911133}, "layer_norm": {"bias": 0.0, "scale": 5.656854152679443}}, "1": {"conv": {"kernel": 8.025212287902832}, "layer_norm": {"bias": 0.0, "scale": 5.656854152679443}}, "2": {"conv": 
{"kernel": 7.975250720977783}, "layer_norm": {"bias": 0.0, "scale": 5.656854152679443}}}}, "feature_projection": {"layer_norm": {"bias": 0.0016454723663628101, "scale": 5.656803131103516}, "projection": {"bias": 0.0019700622651726007, "kernel": 0.4300574064254761}}, "masked_spec_embed": 2.404470205307007}}, "train/learning_rate": 1.7999846022576094e-06, "train/loss": 6.901595592498779, "train/param_norm": 24.56690788269043, "_timestamp": 1653825986, "_runtime": 88, "_step": 4}
wandb/run-20220529_120458-qf2iwkac/logs/debug-internal.log
ADDED
@@ -0,0 +1,108 @@
2022-05-29 12:04:59,324 INFO MainThread:1721989 [internal.py:wandb_internal():92] W&B internal server running at pid: 1721989, started at: 2022-05-29 12:04:59.324307
2022-05-29 12:04:59,326 INFO WriterThread:1721989 [datastore.py:open_for_write():77] open: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/run-qf2iwkac.wandb
2022-05-29 12:04:59,327 DEBUG SenderThread:1721989 [sender.py:send():235] send: header
2022-05-29 12:04:59,327 DEBUG SenderThread:1721989 [sender.py:send():235] send: run
2022-05-29 12:04:59,384 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: check_version
2022-05-29 12:04:59,385 INFO SenderThread:1721989 [dir_watcher.py:__init__():169] watching files in: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files
2022-05-29 12:04:59,385 INFO SenderThread:1721989 [sender.py:_start_run_threads():812] run started: qf2iwkac with start time 1653825898
2022-05-29 12:04:59,385 DEBUG SenderThread:1721989 [sender.py:send():235] send: summary
2022-05-29 12:04:59,385 INFO SenderThread:1721989 [sender.py:_save_file():947] saving file wandb-summary.json with policy end
2022-05-29 12:04:59,385 DEBUG SenderThread:1721989 [sender.py:send_request():249] send_request: check_version
2022-05-29 12:04:59,447 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: run_start
2022-05-29 12:05:00,436 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/wandb-summary.json
2022-05-29 12:05:01,566 DEBUG HandlerThread:1721989 [meta.py:__init__():37] meta init
2022-05-29 12:05:01,566 DEBUG HandlerThread:1721989 [meta.py:__init__():51] meta init done
2022-05-29 12:05:01,566 DEBUG HandlerThread:1721989 [meta.py:probe():211] probe
2022-05-29 12:05:01,568 DEBUG HandlerThread:1721989 [meta.py:_setup_git():201] setup git
2022-05-29 12:05:01,597 DEBUG HandlerThread:1721989 [meta.py:_setup_git():208] setup git done
2022-05-29 12:05:01,597 DEBUG HandlerThread:1721989 [meta.py:_save_pip():55] save pip
2022-05-29 12:05:01,598 DEBUG HandlerThread:1721989 [meta.py:_save_pip():69] save pip done
2022-05-29 12:05:01,598 DEBUG HandlerThread:1721989 [meta.py:probe():249] probe done
2022-05-29 12:05:01,601 DEBUG SenderThread:1721989 [sender.py:send():235] send: files
2022-05-29 12:05:01,601 INFO SenderThread:1721989 [sender.py:_save_file():947] saving file wandb-metadata.json with policy now
2022-05-29 12:05:01,608 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: stop_status
2022-05-29 12:05:01,608 DEBUG SenderThread:1721989 [sender.py:send_request():249] send_request: stop_status
2022-05-29 12:05:01,639 DEBUG SenderThread:1721989 [sender.py:send():235] send: telemetry
2022-05-29 12:05:01,849 INFO Thread-11 :1721989 [upload_job.py:push():137] Uploaded file /tmp/tmpl7u6xxw8wandb/12iwbaqf-wandb-metadata.json
2022-05-29 12:05:02,475 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/wandb-metadata.json
2022-05-29 12:05:02,475 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/requirements.txt
2022-05-29 12:05:02,475 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/output.log
2022-05-29 12:05:04,476 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/output.log
2022-05-29 12:05:06,476 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/output.log
2022-05-29 12:05:16,479 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/output.log
2022-05-29 12:05:16,641 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: stop_status
2022-05-29 12:05:16,641 DEBUG SenderThread:1721989 [sender.py:send_request():249] send_request: stop_status
2022-05-29 12:05:18,480 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/output.log
2022-05-29 12:05:20,481 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/output.log
2022-05-29 12:05:29,648 DEBUG SenderThread:1721989 [sender.py:send():235] send: stats
2022-05-29 12:05:30,485 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/config.yaml
2022-05-29 12:05:31,669 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: stop_status
2022-05-29 12:05:31,669 DEBUG SenderThread:1721989 [sender.py:send_request():249] send_request: stop_status
2022-05-29 12:05:42,516 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: partial_history
2022-05-29 12:05:42,573 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: partial_history
2022-05-29 12:05:44,489 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/output.log
2022-05-29 12:05:46,714 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: stop_status
2022-05-29 12:05:46,715 DEBUG SenderThread:1721989 [sender.py:send_request():249] send_request: stop_status
2022-05-29 12:05:59,724 DEBUG SenderThread:1721989 [sender.py:send():235] send: stats
2022-05-29 12:06:01,767 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: stop_status
2022-05-29 12:06:01,767 DEBUG SenderThread:1721989 [sender.py:send_request():249] send_request: stop_status
2022-05-29 12:06:03,840 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: partial_history
2022-05-29 12:06:03,845 DEBUG SenderThread:1721989 [sender.py:send():235] send: history
2022-05-29 12:06:03,846 DEBUG SenderThread:1721989 [sender.py:send():235] send: summary
2022-05-29 12:06:03,847 INFO SenderThread:1721989 [sender.py:_save_file():947] saving file wandb-summary.json with policy end
2022-05-29 12:06:03,899 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: partial_history
2022-05-29 12:06:04,497 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/wandb-summary.json
2022-05-29 12:06:06,497 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/output.log
2022-05-29 12:06:16,816 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: stop_status
2022-05-29 12:06:16,816 DEBUG SenderThread:1721989 [sender.py:send_request():249] send_request: stop_status
2022-05-29 12:06:26,186 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: partial_history
2022-05-29 12:06:26,188 DEBUG SenderThread:1721989 [sender.py:send():235] send: history
2022-05-29 12:06:26,189 DEBUG SenderThread:1721989 [sender.py:send():235] send: summary
2022-05-29 12:06:26,191 INFO SenderThread:1721989 [sender.py:_save_file():947] saving file wandb-summary.json with policy end
2022-05-29 12:06:26,256 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: partial_history
2022-05-29 12:06:26,505 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/wandb-summary.json
2022-05-29 12:06:26,834 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: partial_history
2022-05-29 12:06:26,836 DEBUG SenderThread:1721989 [sender.py:send():235] send: history
2022-05-29 12:06:26,837 DEBUG SenderThread:1721989 [sender.py:send():235] send: summary
2022-05-29 12:06:26,838 INFO SenderThread:1721989 [sender.py:_save_file():947] saving file wandb-summary.json with policy end
2022-05-29 12:06:26,884 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: partial_history
2022-05-29 12:06:27,505 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/wandb-summary.json
2022-05-29 12:06:27,513 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: partial_history
2022-05-29 12:06:27,515 DEBUG SenderThread:1721989 [sender.py:send():235] send: history
2022-05-29 12:06:27,515 DEBUG SenderThread:1721989 [sender.py:send():235] send: summary
2022-05-29 12:06:27,516 INFO SenderThread:1721989 [sender.py:_save_file():947] saving file wandb-summary.json with policy end
2022-05-29 12:06:27,570 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: partial_history
2022-05-29 12:06:28,506 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/wandb-summary.json
2022-05-29 12:06:28,506 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/output.log
2022-05-29 12:06:29,794 DEBUG SenderThread:1721989 [sender.py:send():235] send: stats
2022-05-29 12:06:31,871 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: stop_status
2022-05-29 12:06:31,871 DEBUG SenderThread:1721989 [sender.py:send_request():249] send_request: stop_status
2022-05-29 12:06:46,902 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: stop_status
2022-05-29 12:06:46,903 DEBUG SenderThread:1721989 [sender.py:send_request():249] send_request: stop_status
2022-05-29 12:06:54,515 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/output.log
2022-05-29 12:06:59,870 DEBUG SenderThread:1721989 [sender.py:send():235] send: stats
2022-05-29 12:07:01,952 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: stop_status
2022-05-29 12:07:01,952 DEBUG SenderThread:1721989 [sender.py:send_request():249] send_request: stop_status
2022-05-29 12:07:12,521 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/output.log
2022-05-29 12:07:17,016 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: stop_status
2022-05-29 12:07:17,016 DEBUG SenderThread:1721989 [sender.py:send_request():249] send_request: stop_status
2022-05-29 12:07:29,946 DEBUG SenderThread:1721989 [sender.py:send():235] send: stats
2022-05-29 12:07:32,044 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: stop_status
2022-05-29 12:07:32,044 DEBUG SenderThread:1721989 [sender.py:send_request():249] send_request: stop_status
2022-05-29 12:07:35,982 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: partial_history
2022-05-29 12:07:36,081 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: log_artifact
2022-05-29 12:07:36,081 DEBUG SenderThread:1721989 [sender.py:send_request():249] send_request: log_artifact
2022-05-29 12:07:36,280 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: partial_history
2022-05-29 12:07:36,354 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: log_artifact
2022-05-29 12:07:36,453 DEBUG HandlerThread:1721989 [handler.py:handle_request():141] handle_request: partial_history
2022-05-29 12:07:36,530 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/output.log
2022-05-29 12:07:36,530 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/media/table/eval/step_0k_5_92ae240f2472ae79ad31.table.json
2022-05-29 12:07:36,530 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/media/table/eval/step_0k_incorrect_5_b30e6671cbaaf5c82b6d.table.json
2022-05-29 12:07:36,530 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/media
2022-05-29 12:07:36,530 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/media/table
2022-05-29 12:07:36,530 INFO Thread-7 :1721989 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/files/media/table/eval
2022-05-29 12:07:36,589 INFO Thread-13 :1721989 [upload_job.py:push():95] Uploaded file /home/sanchitgandhi/.cache/wandb/artifacts/obj/md5/00/66783abbd98b29e02682346c82293a
2022-05-29 12:07:36,995 INFO SenderThread:1721989 [sender.py:send_request_log_artifact():976] logged artifact run-qf2iwkac-evalstep_0k - {'id': 'QXJ0aWZhY3Q6MTM1NjQyNTgy', 'digest': '1fc9411295ac28c0df16aa1f68611033', 'state': 'PENDING', 'aliases': [], 'artifactSequence': {'id': 'QXJ0aWZhY3RDb2xsZWN0aW9uOjIxNTAwMTY3', 'latestArtifact': None}, 'version': 'latest'}
2022-05-29 12:07:36,995 DEBUG SenderThread:1721989 [sender.py:send():235] send: files
2022-05-29 12:07:36,995 INFO SenderThread:1721989 [sender.py:_save_file():947] saving file media/table/eval/step_0k_5_92ae240f2472ae79ad31.table.json with policy now
2022-05-29 12:07:36,996 DEBUG SenderThread:1721989 [sender.py:send_request():249] send_request: log_artifact
wandb/run-20220529_120458-qf2iwkac/logs/debug.log
ADDED
@@ -0,0 +1,25 @@
2022-05-29 12:04:58,500 INFO MainThread:1720934 [wandb_setup.py:_flush():75] Loading settings from /home/sanchitgandhi/.config/wandb/settings
2022-05-29 12:04:58,500 INFO MainThread:1720934 [wandb_setup.py:_flush():75] Loading settings from /home/sanchitgandhi/flax-dummy/wandb/settings
2022-05-29 12:04:58,500 INFO MainThread:1720934 [wandb_setup.py:_flush():75] Loading settings from environment variables: {}
2022-05-29 12:04:58,500 INFO MainThread:1720934 [wandb_setup.py:_flush():75] Inferring run settings from compute environment: {'program_relpath': 'run_flax_speech_recognition_seq2seq.py', 'program': 'run_flax_speech_recognition_seq2seq.py'}
2022-05-29 12:04:58,500 INFO MainThread:1720934 [wandb_init.py:_log_setup():405] Logging user logs to /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/logs/debug.log
2022-05-29 12:04:58,500 INFO MainThread:1720934 [wandb_init.py:_log_setup():406] Logging internal logs to /home/sanchitgandhi/flax-dummy/wandb/run-20220529_120458-qf2iwkac/logs/debug-internal.log
2022-05-29 12:04:58,501 INFO MainThread:1720934 [wandb_init.py:init():439] calling init triggers
2022-05-29 12:04:58,501 INFO MainThread:1720934 [wandb_init.py:init():442] wandb.init called with sweep_config: {}
config: {}
2022-05-29 12:04:58,501 INFO MainThread:1720934 [wandb_init.py:init():492] starting backend
2022-05-29 12:04:58,501 INFO MainThread:1720934 [backend.py:_multiprocessing_setup():99] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
2022-05-29 12:04:58,525 INFO MainThread:1720934 [backend.py:ensure_launched():219] starting backend process...
2022-05-29 12:04:58,550 INFO MainThread:1720934 [backend.py:ensure_launched():224] started backend process with pid: 1721989
2022-05-29 12:04:58,552 INFO MainThread:1720934 [wandb_init.py:init():501] backend started and connected
2022-05-29 12:04:58,565 INFO MainThread:1720934 [wandb_init.py:init():565] updated telemetry
2022-05-29 12:04:58,625 INFO MainThread:1720934 [wandb_init.py:init():596] communicating run to backend with 30 second timeout
2022-05-29 12:04:59,384 INFO MainThread:1720934 [wandb_run.py:_on_init():1759] communicating current version
2022-05-29 12:04:59,446 INFO MainThread:1720934 [wandb_run.py:_on_init():1763] got version response upgrade_message: "wandb version 0.12.17 is available! To upgrade, please run:\n $ pip install wandb --upgrade"

2022-05-29 12:04:59,446 INFO MainThread:1720934 [wandb_init.py:init():625] starting run threads in backend
2022-05-29 12:05:01,606 INFO MainThread:1720934 [wandb_run.py:_console_start():1733] atexit reg
2022-05-29 12:05:01,606 INFO MainThread:1720934 [wandb_run.py:_redirect():1606] redirect: SettingsConsole.REDIRECT
2022-05-29 12:05:01,607 INFO MainThread:1720934 [wandb_run.py:_redirect():1611] Redirecting console.
2022-05-29 12:05:01,609 INFO MainThread:1720934 [wandb_run.py:_redirect():1667] Redirects installed.
2022-05-29 12:05:01,609 INFO MainThread:1720934 [wandb_init.py:init():664] run started, returning control to user process
wandb/run-20220529_120458-qf2iwkac/run-qf2iwkac.wandb
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:41e927d4ed171d350275f4ae49605fcc773813f7805542baaaf99fa5fcebe420
size 52033