Latest commit: upload tokenizer.json (7ff264c, verified)

Size       Last commit
1.52 kB    initial commit
9.35 kB    upload README.md
293 Bytes  upload added_tokens.json
3.91 kB    upload config.json
10.4 kB    upload configuration_phi3.py
8.87 kB    upload generation_utils.py
23.2 kB    upload llava_arch.py
4.97 GB    upload model-00001-of-00002.safetensors
3.3 GB     upload model-00002-of-00002.safetensors
65.5 kB    upload model.safetensors.index.json
12.2 kB    upload modeling_llava_phi3.py
73.9 kB    upload modeling_phi3.py
569 Bytes  upload special_tokens_map.json
1.84 MB    upload tokenizer.json
500 kB     upload tokenizer.model
3.15 kB    upload tokenizer_config.json
1.84 MB    upload trainer_state.json
6.78 kB    upload training_args.bin
97.5 kB    upload training_datasets_by_stage.jpg

Pickle scan of training_args.bin detected 11 imports:
- accelerate.utils.dataclasses.DistributedType
- torch.device
- transformers.integrations.deepspeed.HfTrainerDeepSpeedConfig
- torch.bfloat16
- transformers.training_args.OptimizerNames
- llava.train.train.TrainingArguments
- transformers.trainer_utils.IntervalStrategy
- transformers.trainer_utils.HubStrategy
- transformers.trainer_utils.SchedulerType
- accelerate.state.PartialState
- accelerate.utils.dataclasses.DeepSpeedPlugin
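The import list above comes from the Hub's automated pickle scan of training_args.bin. As a rough sketch of how a similar list can be reproduced locally without unpickling anything, assuming training_args.bin has been downloaded and is a standard torch.save zip archive, the pickle opcodes can be walked directly:

```python
import pickletools
import zipfile

path = "training_args.bin"  # assumed local copy of the file listed above

# torch.save (PyTorch >= 1.6) writes a zip archive whose pickle stream
# lives in <archive_name>/data.pkl.
with zipfile.ZipFile(path) as zf:
    pkl_name = next(n for n in zf.namelist() if n.endswith("data.pkl"))
    stream = zf.read(pkl_name)

imports = set()
recent_strings = []  # arguments of recently seen string opcodes
for opcode, arg, _pos in pickletools.genops(stream):
    if opcode.name in ("SHORT_BINUNICODE", "BINUNICODE", "UNICODE"):
        recent_strings.append(arg)
    elif opcode.name == "GLOBAL":
        # GLOBAL carries "module name" as a single space-joined argument.
        module, name = arg.split(" ", 1)
        imports.add(f"{module}.{name}")
    elif opcode.name == "STACK_GLOBAL" and len(recent_strings) >= 2:
        # Heuristic: module and qualified name are normally the last two
        # string constants pushed before STACK_GLOBAL (misses memoized refs).
        imports.add(f"{recent_strings[-2]}.{recent_strings[-1]}")

for item in sorted(imports):
    print(item)
```

Because the globals include project-specific classes such as llava.train.train.TrainingArguments, fully loading the file requires an environment where that code is importable and trusted; torch.load with weights_only=True would be expected to reject these imports.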
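For completeness, a minimal loading sketch for the checkpoint files listed above. The repository id is a placeholder, and the snippet assumes config.json registers the shipped configuration_phi3.py / modeling_llava_phi3.py classes under the Auto classes via auto_map, which is why trust_remote_code=True is required; the two safetensors shards are resolved automatically through model.safetensors.index.json:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: substitute the actual Hub repository hosting these files.
repo_id = "your-org/llava-phi-3"

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # torch.bfloat16 appears in the pickle imports above
    trust_remote_code=True,      # needed for the custom configuration/modeling files
    device_map="auto",           # requires accelerate; shards are placed automatically
)
```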