Latest commit: Upload tokenizer (e2a72ff, verified)

File                  Size        Last commit
runs                  -           Fine-tuned GPT-2 on Wikitext-2
                      1.52 kB     initial commit
                      1.11 kB     Upload model
                      918 Bytes   Upload model
                      119 Bytes   Upload model
                      456 kB      Upload tokenizer
                      498 MB      Upload model
                      996 MB      Upload 8 files
rng_state.pth         14.6 kB     Upload 8 files
    Detected pickle imports (7): torch._utils._rebuild_tensor_v2, numpy.dtype,
    collections.OrderedDict, numpy.ndarray, torch.ByteStorage, _codecs.encode,
    numpy.core.multiarray._reconstruct
    How to fix it? (see the loading sketch after this listing)
                      627 Bytes   Upload 8 files
                      470 Bytes   Upload tokenizer
                      525 Bytes   Upload tokenizer
                      2.69 kB     Upload 8 files
training_args.bin     4.6 kB      Upload 8 files
    Detected pickle imports (9): transformers.trainer_utils.HubStrategy,
    accelerate.utils.dataclasses.DistributedType,
    transformers.training_args.OptimizerNames,
    transformers.trainer_pt_utils.AcceleratorConfig,
    transformers.training_args.TrainingArguments,
    transformers.trainer_utils.IntervalStrategy, accelerate.state.PartialState,
    transformers.trainer_utils.SchedulerType, torch.device
    How to fix it? (see the loading sketch after this listing)
                      999 kB      Upload tokenizer
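
The two "Detected pickle imports" notes come from the Hub's pickle scanner: rng_state.pth and training_args.bin are written with torch.save, so unpickling them can execute the listed constructors. A minimal, defensive sketch of how such files might be handled, assuming PyTorch 1.13+ for the weights_only flag and a transformers version whose save_pretrained accepts safe_serialization (local paths are placeholders):

```python
import torch
from transformers import AutoModelForCausalLM

# Loading side: weights_only=True restricts unpickling to tensor data rather
# than arbitrary objects (available since PyTorch 1.13, the default in 2.6+).
# Files that pickle non-tensor objects -- rng_state.pth stores numpy arrays,
# training_args.bin a TrainingArguments object -- may still be refused and
# would then need an explicit allowlist via torch.serialization.add_safe_globals.
rng_state = torch.load("rng_state.pth", map_location="cpu", weights_only=True)

# Saving side: re-saving the model with safe serialization writes the weights
# as safetensors, which avoids pickle entirely for the large checkpoint files.
model = AutoModelForCausalLM.from_pretrained("./gpt2-wikitext2-checkpoint")
model.save_pretrained("./gpt2-wikitext2-checkpoint", safe_serialization=True)
```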
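Separately, the "Upload model" and "Upload tokenizer" commits indicate a standard transformers checkpoint, so loading the fine-tuned GPT-2 should follow the usual Auto-class pattern. A minimal sketch, with a placeholder repo id standing in for this repository:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id; substitute this repository's actual id on the Hub.
repo_id = "your-username/gpt2-finetuned-wikitext2"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

prompt = "Wikitext-2 is a collection of articles"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```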