model.pt (1.88 GB)

Detected Pickle imports (23):

- "torch.nn.modules.sparse.Embedding"
- "torch.nn.modules.container.ModuleList"
- "quanto.nn.qlinear.QLinear"
- "torch._utils._rebuild_tensor_v2"
- "transformers_modules.apple.OpenELM-450M.d4e07ecb810667a89bbdf9ca6c39028443b9822b.modeling_openelm.OpenELMRMSNorm"
- "torch.FloatStorage"
- "transformers_modules.apple.OpenELM-450M.d4e07ecb810667a89bbdf9ca6c39028443b9822b.modeling_openelm.OpenELMRotaryEmbedding"
- "collections.OrderedDict"
- "torch.BoolStorage"
- "__builtin__.set"
- "torch.device"
- "transformers_modules.apple.OpenELM-450M.d4e07ecb810667a89bbdf9ca6c39028443b9822b.modeling_openelm.OpenELMForCausalLM"
- "torch.int8"
- "torch.nn.modules.activation.SiLU"
- "transformers.generation.configuration_utils.GenerationConfig"
- "quanto.tensor.qtype.qtype"
- "transformers_modules.apple.OpenELM-450M.d4e07ecb810667a89bbdf9ca6c39028443b9822b.configuration_openelm.OpenELMConfig"
- "transformers_modules.apple.OpenELM-450M.d4e07ecb810667a89bbdf9ca6c39028443b9822b.modeling_openelm.OpenELMMultiHeadCausalAttention"
- "transformers_modules.apple.OpenELM-450M.d4e07ecb810667a89bbdf9ca6c39028443b9822b.modeling_openelm.OpenELMFeedForwardNetwork"
- "transformers_modules.apple.OpenELM-450M.d4e07ecb810667a89bbdf9ca6c39028443b9822b.modeling_openelm.OpenELMModel"
- "torch._utils._rebuild_parameter"
- "torch.float32"
- "transformers_modules.apple.OpenELM-450M.d4e07ecb810667a89bbdf9ca6c39028443b9822b.modeling_openelm.OpenELMDecoderLayer"
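A scan like the one above lists every global name the pickle stream would import when the checkpoint is loaded, which is why loading untrusted `.pt` files is risky. A minimal sketch of such a scan using only the standard library's `pickletools` is below. It is an illustration, not the scanner used above: it inspects the `GLOBAL` opcode emitted by pickle protocols 3 and below (the default for `torch.save`), and assumes the checkpoint is in torch's zip-archive format with a `data.pkl` member.

```python
import pickletools
import zipfile


def pickle_imports(data):
    """Collect 'module.attr' names referenced by GLOBAL opcodes in a
    pickle byte stream. Disassembles only; never executes the pickle."""
    found = set()
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # GLOBAL's argument is "module attr" separated by a space.
            module, name = arg.split(" ", 1)
            found.add(f"{module}.{name}")
    return found


def checkpoint_imports(path):
    """torch.save (zip format) stores the object graph in a data.pkl
    member; scan that member without loading the model."""
    with zipfile.ZipFile(path) as zf:
        pkl_name = next(n for n in zf.namelist() if n.endswith("data.pkl"))
        return pickle_imports(zf.read(pkl_name))
```

If any of the detected names is unexpected, avoid plain `torch.load` on the file; `torch.load(path, weights_only=True)` refuses to unpickle arbitrary classes, and tensor-only formats such as safetensors sidestep pickle entirely.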