Commit 2c4c266
Parent(s): 0323715
Training in progress, step 7000

Files changed:
- pytorch_model.bin (+1 -1)
- runs/May04_13-30-49_sanchit--v100/events.out.tfevents.1651674089.sanchit--v100.50430.0 (+2 -2)
- wandb/debug-cli.log (+4 -0)
- wandb/run-20220504_142129-w4rlzz90/files/output.log (+0 -0)
- wandb/run-20220504_142129-w4rlzz90/files/wandb-summary.json (+0 -0)
- wandb/run-20220504_142129-w4rlzz90/logs/debug-internal.log (+2 -2)
- wandb/run-20220504_142129-w4rlzz90/run-w4rlzz90.wandb (+2 -2)
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:35e5bd64525504f987bff198c66c6674ce4fae90478689b6996b5c0ca9edaa62
 size 2353867057
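Each of the binary files in this commit is tracked with Git LFS, so the diffs above show pointer files rather than the binaries themselves: a three-line text stub recording the spec version, the object's SHA-256 digest, and its size in bytes. A minimal sketch of reading such a pointer (the helper name `parse_lfs_pointer` is ours for illustration, not part of any library):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        # Each pointer line is "<key> <value>", e.g. "size 2353867057".
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:35e5bd64525504f987bff198c66c6674ce4fae90478689b6996b5c0ca9edaa62
size 2353867057
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # → 2353867057
```

Note that values are kept as strings; the `oid` value retains its `sha256:` prefix, matching the pointer format verbatim.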
runs/May04_13-30-49_sanchit--v100/events.out.tfevents.1651674089.sanchit--v100.50430.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:8b06624f1c35dcce61d3d0ea9ae9565b8a340cefb3e60f9c36c63fd2f617b6c9
+size 1112938
wandb/debug-cli.log CHANGED
@@ -24,3 +24,7 @@
 warmup_steps: 500
 2022-05-04 13:30:45 INFO About to run command: python3 run_xtreme_s.py --overwrite_output_dir --freeze_feature_encoder --gradient_checkpointing --predict_with_generate --fp16 --group_by_length --do_train --do_eval --load_best_model_at_end --push_to_hub --use_auth_token --eval_split_name=test --eval_steps=500 --evaluation_strategy=steps --generation_max_length=40 --generation_num_beams=1 --gradient_accumulation_steps=8 --greater_is_better=True --hidden_dropout=0.17305159310134854 --language=fr.en --learning_rate=0.00012335092351490598 --logging_steps=1 --max_duration_in_seconds=20 --metric_for_best_model=bleu --model_name_or_path=./ --num_train_epochs=3 --output_dir=./ --per_device_eval_batch_size=8 --per_device_train_batch_size=8 --save_steps=500 --task=covost2 --warmup_steps=500
 2022-05-04 13:30:50 INFO Running runs: ['w4rlzz90']
+2022-05-05 09:24:36 ERROR 500 response executing GraphQL.
+2022-05-05 09:24:36 ERROR {"errors":[{"message":"context deadline exceeded"}]}
+2022-05-05 09:25:32 ERROR 500 response executing GraphQL.
+2022-05-05 09:25:32 ERROR {"errors":[{"message":"context deadline exceeded"}]}
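The launch command logged above packs every training hyperparameter into a single line. A small sketch of recovering those flags as a dict for inspection (the helper `parse_cli_flags` is hypothetical, not a wandb or Transformers API; values are kept as strings, and bare flags map to `True`):

```python
import shlex

def parse_cli_flags(command: str) -> dict:
    """Collect --flag=value and bare --flag tokens from a logged command line."""
    flags = {}
    for token in shlex.split(command):
        if not token.startswith("--"):
            continue  # skip the interpreter and script name
        key, sep, value = token[2:].partition("=")
        flags[key] = value if sep else True
    return flags

cmd = ("python3 run_xtreme_s.py --overwrite_output_dir --fp16 "
       "--learning_rate=0.00012335092351490598 --warmup_steps=500")
print(parse_cli_flags(cmd)["warmup_steps"])  # → 500
```

Applied to the full command from the log, this makes it easy to confirm settings such as `save_steps=500` and `eval_steps=500`, which is why checkpoints like this "step 7000" commit land every 500 steps.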
wandb/run-20220504_142129-w4rlzz90/files/output.log CHANGED
(The diff for this file is too large to render.)
wandb/run-20220504_142129-w4rlzz90/files/wandb-summary.json CHANGED
(The diff for this file is too large to render.)
wandb/run-20220504_142129-w4rlzz90/logs/debug-internal.log CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:3ea0a01617a7f2eb83ebb75d6ae3d8f74fde57ef3d04708db9b2a7fed6f09209
+size 16339117
wandb/run-20220504_142129-w4rlzz90/run-w4rlzz90.wandb CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:2fa087fc845449e6ec3b1f4a2594beb537798985e3c3711320ca4a897a9dfb19
+size 741708519