Latest commit: Upload tokenizer (df1834a)

Files in the repository (most file names were not captured; size and last commit message are shown):

runs                -           Training in progress, step 750
(unnamed file)      1.48 kB     initial commit
(unnamed file)      13 Bytes    Training in progress, step 250
(unnamed file)      2.34 kB     Training in progress, step 250
(unnamed file)      215 Bytes   Training in progress, step 250
(unnamed file)      378 MB      Training in progress, step 750
(unnamed file)      96 Bytes    Upload tokenizer
(unnamed file)      354 Bytes   Upload tokenizer
training_args.bin   3.9 kB      Training in progress, step 250
(unnamed file)      331 Bytes   Upload tokenizer

Detected Pickle imports (8) in training_args.bin:
- "transformers.trainer_utils.IntervalStrategy",
- "transformers.trainer_utils.HubStrategy",
- "accelerate.state.PartialState",
- "accelerate.utils.dataclasses.DistributedType",
- "transformers.training_args.TrainingArguments",
- "transformers.trainer_utils.SchedulerType",
- "transformers.training_args.OptimizerNames",
- "torch.device"
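Because training_args.bin holds a pickled TrainingArguments object rather than model weights, it can be inspected locally with torch.load. Below is a minimal sketch, assuming the file was written by transformers.Trainer (which saves its arguments with torch.save) and that torch, transformers and accelerate are installed, since the unpickler must be able to import every class listed above; the file path and the printed attributes are illustrative.

```python
# Minimal sketch: inspect a training_args.bin produced by transformers.Trainer.
import torch

# On PyTorch >= 2.6, torch.load defaults to weights_only=True, which rejects
# arbitrary pickled classes; disable it only for files you trust.
args = torch.load("training_args.bin", weights_only=False)

print(type(args))             # transformers.training_args.TrainingArguments
print(args.learning_rate)     # a few of the recorded hyperparameters
print(args.num_train_epochs)
print(args.save_strategy)     # an IntervalStrategy value such as "steps"
```

Since TrainingArguments is a dataclass, printing the object itself dumps every recorded field at once.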