Saving weights and logs of step 1000
- events.out.tfevents.1643652616.t1v-n-00e295a4-w-0.190699.0.v2 +3 -0
- flax_model.msgpack +1 -1
- train.512.recover78k.sh +26 -0
- wandb/debug-internal.log +1 -1
- wandb/debug.log +1 -1
- wandb/latest-run +1 -1
- wandb/run-20220127_170102-3ai81gzh/files/output.log +335 -0
- wandb/run-20220127_170102-3ai81gzh/logs/debug-internal.log +2 -2
- wandb/run-20220127_170102-3ai81gzh/run-3ai81gzh.wandb +2 -2
- wandb/run-20220131_181009-hhypc7yn/files/code/run_mlm_flax.py +815 -0
- wandb/run-20220131_181009-hhypc7yn/files/config.yaml +147 -0
- wandb/run-20220131_181009-hhypc7yn/files/diff.patch +0 -0
- wandb/run-20220131_181009-hhypc7yn/files/events.out.tfevents.1643652616.t1v-n-00e295a4-w-0.190699.0.v2 +1 -0
- wandb/run-20220131_181009-hhypc7yn/files/output.log +1350 -0
- wandb/run-20220131_181009-hhypc7yn/files/requirements.txt +134 -0
- wandb/run-20220131_181009-hhypc7yn/files/wandb-metadata.json +48 -0
- wandb/run-20220131_181009-hhypc7yn/files/wandb-summary.json +1 -0
- wandb/run-20220131_181009-hhypc7yn/logs/debug-internal.log +0 -0
- wandb/run-20220131_181009-hhypc7yn/logs/debug.log +26 -0
- wandb/run-20220131_181009-hhypc7yn/run-hhypc7yn.wandb +0 -0
events.out.tfevents.1643652616.t1v-n-00e295a4-w-0.190699.0.v2
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1c12eb4ecb414e275b744dabc40a8c5db948797cfb7ee2177206194e8b88c698
+size 147136
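The event file above is stored via Git LFS, so the repository only holds the three-line pointer shown in the diff. As a minimal sketch, the pointer format can be read with a few lines of Python (`parse_lfs_pointer` is a hypothetical helper for illustration, not part of any git-lfs API):

```python
# Sketch: parse the three-line Git LFS pointer format shown above.
# parse_lfs_pointer is a hypothetical helper, not a git-lfs API.
def parse_lfs_pointer(text):
    fields = {}
    for line in text.strip().splitlines():
        # Each pointer line is "key value", split on the first space.
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:1c12eb4ecb414e275b744dabc40a8c5db948797cfb7ee2177206194e8b88c698\n"
    "size 147136\n"
)

info = parse_lfs_pointer(pointer)
print(info["size"])  # prints 147136 -- the byte size of the object LFS stores
```

The `size` field matches the `+3 -0` pointer added in this commit; the real payload lives in LFS storage, keyed by the `oid` hash.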
flax_model.msgpack
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:40964a5cce135986e63c56e58527461822ac8597fbf633b214fb7afda4c3d55b
 size 498796983
train.512.recover78k.sh
ADDED
@@ -0,0 +1,26 @@
+python run_mlm_flax.py \
+    --output_dir="./" \
+    --model_type="roberta" \
+    --model_name_or_path="./" \
+    --config_name="./" \
+    --tokenizer_name="./" \
+    --dataset_name="NbAiLab/NCC" \
+    --max_seq_length="512" \
+    --weight_decay="0.01" \
+    --per_device_train_batch_size="46" \
+    --per_device_eval_batch_size="46" \
+    --pad_to_max_length \
+    --learning_rate="1.2866e-4" \
+    --warmup_steps="0" \
+    --overwrite_output_dir \
+    --num_train_epochs="9" \
+    --adam_beta1="0.9" \
+    --adam_beta2="0.98" \
+    --adam_epsilon="1e-6" \
+    --logging_steps="1000" \
+    --save_steps="1000" \
+    --eval_steps="1000" \
+    --do_train \
+    --do_eval \
+    --dtype="bfloat16" \
+    --push_to_hub
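For orientation, the flags above pin down the per-step arithmetic once a device count is fixed. The sketch below assumes 8 local devices (the `t1v-n-...` host name in the event file suggests a TPU VM, but the device count itself is an assumption, not stated in this commit):

```python
# Back-of-envelope numbers from the flags in train.512.recover78k.sh.
# The device count is an assumption (e.g. a TPU v3-8 has 8 local devices).
per_device_batch = 46   # --per_device_train_batch_size
seq_len = 512           # --max_seq_length
devices = 8             # assumed, not from the script

global_batch = per_device_batch * devices   # sequences per optimizer step
tokens_per_step = global_batch * seq_len    # tokens seen per optimizer step
print(global_batch)     # 368
print(tokens_per_step)  # 188416
```

Under that assumption each optimizer step processes 368 padded 512-token sequences, which is what the `--save_steps="1000"` checkpoint cadence in this commit is counted against.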
wandb/debug-internal.log
CHANGED
@@ -1 +1 @@
-run-
+run-20220131_181009-hhypc7yn/logs/debug-internal.log

wandb/debug.log
CHANGED
@@ -1 +1 @@
-run-
+run-20220131_181009-hhypc7yn/logs/debug.log

wandb/latest-run
CHANGED
@@ -1 +1 @@
-run-
+run-20220131_181009-hhypc7yn
wandb/run-20220127_170102-3ai81gzh/files/output.log
CHANGED
@@ -70927,3 +70927,338 @@ To disable this warning, you can either:
 huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
 To disable this warning, you can either:
 	- Avoid using `tokenizers` before the fork if possible
+	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+To disable this warning, you can either:
+	- Avoid using `tokenizers` before the fork if possible
+	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+To disable this warning, you can either:
+	- Avoid using `tokenizers` before the fork if possible
+	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+To disable this warning, you can either:
+	- Avoid using `tokenizers` before the fork if possible
+	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+To disable this warning, you can either:
+	- Avoid using `tokenizers` before the fork if possible
+
+Training...:  44%|███████████████████████████████████████████████████████████████████████████████████████████████▏ | 23784/54217 [11:20:48<1805:44:45, 213.61s/it]
+huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+To disable this warning, you can either:
+	- Avoid using `tokenizers` before the fork if possible
+	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+To disable this warning, you can either:
+	- Avoid using `tokenizers` before the fork if possible
[added lines 70955-71264 are blank]
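The warning repeated throughout the log states its own remedy. One way to apply it is to export the variable before launching the training script (a minimal sketch, assuming a POSIX shell; whether parallelism should be `false` here is a judgment call, not something this commit specifies):

```shell
# Silence the repeated tokenizers fork warning by setting the
# environment variable the log itself recommends.
export TOKENIZERS_PARALLELISM=false
echo "TOKENIZERS_PARALLELISM=$TOKENIZERS_PARALLELISM"
```

Placed before the `python run_mlm_flax.py` invocation in `train.512.recover78k.sh`, this would keep the warning out of future `output.log` diffs.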