The `model.pt` file in this repo triggers Hugging Face's pickle scanner — Detected Pickle imports (26):
- "torch._C._nn.gelu",
- "torch.nn.modules.sparse.Embedding",
- "__builtin__.set",
- "transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTMLP",
- "torch.device",
- "transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.configuration_megatron_gpt.MegatronGPTConfig",
- "torch.float16",
- "torch._utils._rebuild_tensor_v2",
- "torch.nn.modules.dropout.Dropout",
- "transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTLayer",
- "transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTAttention",
- "transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTLayerNorm",
- "collections.OrderedDict",
- "torch.HalfStorage",
- "torch.nn.modules.container.ModuleList",
- "torch.BoolStorage",
- "torch.FloatStorage",
- "transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTForCausalLM",
- "transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTModel",
- "torch._utils._rebuild_parameter",
- "transformers_modules.dicta-il.dictalm-7b.c233431901e34e82235e058ff75053a292547e79.modeling_megatron_gpt.MegatronGPTRotaryEmbedding",
- "transformers.activations.GELUActivation",
- "quanto.nn.qlinear.QLinear",
- "torch.float8_e4m3fn",
- "transformers.generation.configuration_utils.GenerationConfig",
- "quanto.tensor.qtype.qtype"
How can I fix this pickle-import warning on `model.pt`?