Latest commit: "Update README.md" (`c949f6b`, verified)

| File | Size | Last commit |
|------|------|-------------|
| - | 1.52 kB | initial commit |
| - | 123 Bytes | Update README.md |
| `best_model.pt` | 175 MB | Upload folder using huggingface_hub |
| `best_model_no_optim.pt` | 73.5 MB | Upload folder using huggingface_hub |
| `latest_model.pt` | 175 MB | Upload folder using huggingface_hub |
| `model_step_10k.pt` | 175 MB | Upload folder using huggingface_hub |
| `model_step_20k.pt` | 175 MB | Upload folder using huggingface_hub |
| `model_step_30k.pt` | 175 MB | Upload folder using huggingface_hub |
| `model_step_40k.pt` | 175 MB | Upload folder using huggingface_hub |

Every `.pt` checkpoint carries a "Detected Pickle imports (7)" warning, and the same seven imports are reported for each file (only the listing order differs):

- `torch._utils._rebuild_tensor_v2`
- `torch.FloatStorage`
- `collections.OrderedDict`
- `__builtin__.set`
- `dp.preprocessing.text.Preprocessor`
- `dp.preprocessing.text.LanguageTokenizer`
- `dp.preprocessing.text.SequenceTokenizer`
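The "Detected Pickle imports" entries above come from scanning the pickle opcode stream inside each checkpoint: any `GLOBAL` or `STACK_GLOBAL` opcode names a `module.attribute` that unpickling would import (and could execute code from). As a rough illustration of that idea, the standard library's `pickletools` can list such references without unpickling anything. This is a simplified sketch, not the scanner Hugging Face actually runs; the helper name `detect_pickle_imports` and the string-tracking shortcut for `STACK_GLOBAL` are assumptions for the example (a real scanner would simulate the full opcode stack).

```python
import collections
import pickle
import pickletools


def detect_pickle_imports(data: bytes) -> set[str]:
    """List the module.attribute references a pickle stream would import.

    Sketch only: resolves STACK_GLOBAL from the most recently pushed
    strings instead of simulating the whole pickle machine stack.
    """
    imports: set[str] = set()
    strings: list[str] = []  # unicode values pushed so far, in order
    for op, arg, _pos in pickletools.genops(data):
        if op.name in ("SHORT_BINUNICODE", "BINUNICODE", "UNICODE"):
            strings.append(arg)
        elif op.name == "GLOBAL":
            # Protocol <= 3: one argument of the form "module name".
            module, name = arg.split(" ", 1)
            imports.add(f"{module}.{name}")
        elif op.name == "STACK_GLOBAL":
            # Protocol >= 4: module and name were pushed as strings.
            imports.add(f"{strings[-2]}.{strings[-1]}")
    return imports


# An OrderedDict pickle references collections.OrderedDict, exactly the
# kind of entry reported for the checkpoints in the listing above.
blob = pickle.dumps(collections.OrderedDict(a=1), protocol=4)
print(sorted(detect_pickle_imports(blob)))  # ['collections.OrderedDict']
```

Because a pickle can name arbitrary importables (here `dp.preprocessing.text.*` alongside the `torch` internals), these checkpoints should only be loaded from a trusted source and with the `dp` package installed.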