Latest commit: Update training_args.bin (1499a11)
| File | Size | Last commit |
|---|---|---|
|  | 345 Bytes | initial commit |
| config.json | 832 Bytes | Update config.json |
| dataset-metadata.json | 154 Bytes | Update dataset-metadata.json |
| pytorch_model.bin | 712 MB | Update pytorch_model.bin |
| special_tokens_map.json | 112 Bytes | Update special_tokens_map.json |
| tf_model.h5 | 712 MB | Update tf_model.h5 |
| tokenizer_config.json | 48 Bytes | Update tokenizer_config.json |
| training_args.bin | 715 MB | Update training_args.bin |
| vocab.txt | 996 kB | Update vocab.txt |
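These files (config.json, pytorch_model.bin, vocab.txt, tokenizer_config.json, special_tokens_map.json) form the standard layout that transformers' from_pretrained method reads, and the pickle scan below confirms the model is a BertForSequenceClassification with a BertTokenizer. A minimal loading sketch, assuming a hypothetical local clone of this repository at ./repo:

```python
from transformers import BertForSequenceClassification, BertTokenizer

# "./repo" is a placeholder for a local clone of this repository; any
# directory containing config.json, pytorch_model.bin, vocab.txt and the
# tokenizer files listed above works the same way.
model = BertForSequenceClassification.from_pretrained("./repo")
tokenizer = BertTokenizer.from_pretrained("./repo")
```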
Detected Pickle imports in training_args.bin (30):
- "transformers.modeling_bert.BertLayer",
- "transformers.modeling_bert.BertIntermediate",
- "torch.nn.functional.gelu",
- "transformers.modeling_bert.BertForSequenceClassification",
- "torch.nn.modules.normalization.LayerNorm",
- "transformers.modeling_bert.BertEncoder",
- "collections.OrderedDict",
- "transformers.modeling_bert.BertPooler",
- "torch._utils._rebuild_tensor_v2",
- "__builtin__.set",
- "transformers.tokenization_bert.BasicTokenizer",
- "transformers.tokenization_bert.BertTokenizer",
- "torch.nn.modules.dropout.Dropout",
- "lingualytics.learner.Learner",
- "torch.FloatStorage",
- "torch.nn.modules.sparse.Embedding",
- "transformers.modeling_bert.BertSelfAttention",
- "torch.nn.modules.container.ModuleList",
- "transformers.modeling_bert.BertAttention",
- "torch.nn.modules.linear.Linear",
- "transformers.modeling_bert.BertOutput",
- "transformers.modeling_bert.BertModel",
- "transformers.configuration_bert.BertConfig",
- "transformers.modeling_bert.BertEmbeddings",
- "transformers.modeling_bert.BertSelfOutput",
- "torch._utils._rebuild_parameter",
- "pathlib.PosixPath",
- "torch.nn.modules.activation.Tanh",
- "torch.device",
- "transformers.tokenization_bert.WordpieceTokenizer"