Latest commit: Update README.md (742e54b, verified)
- 1.52 kB - initial commit
- 5.4 kB - Upload folder using huggingface_hub (#1)
- 1.13 kB - Upload folder using huggingface_hub (#1)
- 0 Bytes - Update README.md
- 456 kB - Upload folder using huggingface_hub (#1)
model.pt: Detected Pickle imports (30):

- `transformers.activations.NewGELUActivation`
- `transformers_modules.cognitivecomputations.dolphin-2_6-phi-2.8f9ec20eb43118517758278024f6ed911c79ee9f.modeling_phi.SelfAttention`
- `transformers_modules.cognitivecomputations.dolphin-2_6-phi-2.8f9ec20eb43118517758278024f6ed911c79ee9f.modeling_phi.MHA`
- `collections.OrderedDict`
- `torch.float16`
- `transformers_modules.cognitivecomputations.dolphin-2_6-phi-2.8f9ec20eb43118517758278024f6ed911c79ee9f.modeling_phi.Embedding`
- `torch._utils._rebuild_parameter`
- `__builtin__.set`
- `transformers_modules.cognitivecomputations.dolphin-2_6-phi-2.8f9ec20eb43118517758278024f6ed911c79ee9f.configuration_phi.PhiConfig`
- `transformers_modules.cognitivecomputations.dolphin-2_6-phi-2.8f9ec20eb43118517758278024f6ed911c79ee9f.modeling_phi.CausalLMLoss`
- `torch.device`
- `torch.int8`
- `quanto.nn.qlinear.QLinear`
- `torch.HalfStorage`
- `transformers_modules.cognitivecomputations.dolphin-2_6-phi-2.8f9ec20eb43118517758278024f6ed911c79ee9f.modeling_phi.PhiForCausalLM`
- `transformers_modules.cognitivecomputations.dolphin-2_6-phi-2.8f9ec20eb43118517758278024f6ed911c79ee9f.modeling_phi.CrossAttention`
- `transformers.generation.configuration_utils.GenerationConfig`
- `transformers_modules.cognitivecomputations.dolphin-2_6-phi-2.8f9ec20eb43118517758278024f6ed911c79ee9f.modeling_phi.ParallelBlock`
- `quanto.tensor.qtype.qtype`
- `transformers_modules.cognitivecomputations.dolphin-2_6-phi-2.8f9ec20eb43118517758278024f6ed911c79ee9f.modeling_phi.CausalLMHead`
- `transformers_modules.cognitivecomputations.dolphin-2_6-phi-2.8f9ec20eb43118517758278024f6ed911c79ee9f.modeling_phi.MLP`
- `transformers_modules.cognitivecomputations.dolphin-2_6-phi-2.8f9ec20eb43118517758278024f6ed911c79ee9f.modeling_phi.PhiModel`
- `torch._utils._rebuild_tensor_v2`
- `torch.nn.modules.dropout.Dropout`
- `torch.nn.modules.container.ModuleList`
- `torch.nn.modules.normalization.LayerNorm`
- `torch.nn.modules.sparse.Embedding`
- `torch.nn.modules.loss.CrossEntropyLoss`
- `transformers_modules.cognitivecomputations.dolphin-2_6-phi-2.8f9ec20eb43118517758278024f6ed911c79ee9f.modeling_phi.RotaryEmbedding`
- `torch.FloatStorage`
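Because unpickling `model.pt` would import and execute the entries listed above, only load it from a source you trust. The standard mitigation from the Python `pickle` documentation is an `Unpickler` whose `find_class` only admits an allowlist of globals (for model weights specifically, a safetensors export avoids pickle entirely). A stdlib-only sketch, with a hypothetical allowlist:

```python
import collections
import io
import pickle

# Hypothetical allowlist: (module, attribute) pairs you trust. A real list
# for this checkpoint would need every import the scanner reported.
ALLOWED = {("collections", "OrderedDict")}

class RestrictedUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        # Refuse any global not on the allowlist instead of importing it.
        if (module, name) in ALLOWED:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"blocked import: {module}.{name}")

# An allowlisted payload loads normally...
safe = pickle.dumps(collections.OrderedDict(a=1))
obj = RestrictedUnpickler(io.BytesIO(safe)).load()
print(obj)  # OrderedDict([('a', 1)])

# ...while a payload referencing anything else is rejected before it runs.
evil = pickle.dumps(getattr)  # pickled by reference as builtins.getattr
try:
    RestrictedUnpickler(io.BytesIO(evil)).load()
except pickle.UnpicklingError as exc:
    print(exc)  # blocked import: builtins.getattr
```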
- model.pt: 5.57 GB - Upload folder using huggingface_hub (#1)
- 1.04 kB - Upload folder using huggingface_hub (#1)
- 584 Bytes - Upload folder using huggingface_hub (#1)
- 2.12 MB - Upload folder using huggingface_hub (#1)
- 8.09 kB - Upload folder using huggingface_hub (#1)
- 798 kB - Upload folder using huggingface_hub (#1)