`pytorch_model.bin` (2.26 GB) — Detected Pickle imports (30):
- "transformers.models.roberta.modeling_roberta.RobertaEmbeddings",
- "transformers.models.xlm_roberta.tokenization_xlm_roberta.XLMRobertaTokenizer",
- "transformers.models.roberta.modeling_roberta.RobertaIntermediate",
- "torch._utils._rebuild_tensor_v2",
- "torch.FloatStorage",
- "torch.nn.modules.activation.Tanh",
- "torch.LongStorage",
- "transformers.models.roberta.modeling_roberta.RobertaEncoder",
- "torch.nn.modules.activation.ReLU",
- "torch.nn.modules.sparse.Embedding",
- "transformers.models.roberta.modeling_roberta.RobertaAttention",
- "transformers.models.roberta.modeling_roberta.RobertaLayer",
- "torch.nn.modules.loss.CrossEntropyLoss",
- "DictMatching.moco.BackBone_Model",
- "torch.nn.modules.linear.Linear",
- "torch.nn.modules.dropout.Dropout",
- "torch.nn.modules.container.Sequential",
- "transformers.models.roberta.modeling_roberta.RobertaSelfAttention",
- "DictMatching.moco.MoCo",
- "transformers.models.roberta.modeling_roberta.RobertaPooler",
- "__builtin__.set",
- "collections.OrderedDict",
- "torch._C._nn.gelu",
- "transformers.models.roberta.modeling_roberta.RobertaOutput",
- "transformers.models.xlm_roberta.modeling_xlm_roberta.XLMRobertaModel",
- "torch.nn.modules.container.ModuleList",
- "torch.nn.modules.normalization.LayerNorm",
- "transformers.models.roberta.modeling_roberta.RobertaSelfOutput",
- "transformers.models.xlm_roberta.configuration_xlm_roberta.XLMRobertaConfig",
- "torch._utils._rebuild_parameter"
This warning is shown for `pytorch_model.bin`. How can I fix it?
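For context: the listed imports indicate the checkpoint was saved with `torch.save(model)` on a full model object (note the custom classes `DictMatching.moco.MoCo` and `DictMatching.moco.BackBone_Model`), so unpickling it must import, and therefore execute, code from those modules. A common fix is to load the checkpoint once in a trusted environment with the `DictMatching` code available and re-save only the weights, e.g. `torch.save(model.state_dict(), "pytorch_model.bin")`, or convert to safetensors. Here is a minimal, stdlib-only sketch of why saving a plain state dict removes the custom imports; the `MoCo` class below is a hypothetical stand-in, not the repo's real class:

```python
import io
import pickle

# Hypothetical stand-in for a custom model class such as
# DictMatching.moco.MoCo (the real one lives in the training repo).
class MoCo:
    def __init__(self):
        self.weight = [0.1, 0.2]

class ImportRecorder(pickle.Unpickler):
    """Records every module.name a pickle asks to import --
    essentially what the Hub's pickle scanner reports."""
    def __init__(self, file):
        super().__init__(file)
        self.imports = set()

    def find_class(self, module, name):
        self.imports.add(f"{module}.{name}")
        return super().find_class(module, name)

def scanned_imports(obj):
    """Pickle `obj`, then unpickle it while recording requested imports."""
    rec = ImportRecorder(io.BytesIO(pickle.dumps(obj)))
    rec.load()
    return rec.imports

# Pickling a whole model object drags its class into the pickle ...
print(scanned_imports(MoCo()))                  # e.g. {'__main__.MoCo'}
# ... while a plain dict of weights (a state_dict) needs no custom imports.
print(scanned_imports({"weight": [0.1, 0.2]}))  # set()
```

If you go the safetensors route, `safetensors.torch.save_file(model.state_dict(), "model.safetensors")` stores raw tensors with no pickle at all, so the scanner warning disappears entirely.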