Last commit: Update app.py (6647a0e)

Files:
- 1.48 kB, initial commit
- 272 Bytes, initial commit
- app.py: 2.4 kB, Update app.py
- learner.bin: 2.99 GB, Upload learner.bin
- requirements.txt: 52 Bytes, Create requirements.txt

Detected Pickle imports in learner.bin (41):
- "torch.nn.modules.loss.CrossEntropyLoss",
- "__main__.lr_schedule",
- "skorch.dataset.Dataset",
- "skorch.hf.HuggingfacePretrainedTokenizer",
- "tokenizers.models.Model",
- "skorch.utils._indexing_dict",
- "skorch.callbacks.scoring.PassthroughScoring",
- "modAL.models.learners.ActiveLearner",
- "builtins.set",
- "numpy.core.multiarray.scalar",
- "transformers.models.bert.tokenization_bert_fast.BertTokenizerFast",
- "torch.optim.lr_scheduler.LambdaLR",
- "__main__.BertModule",
- "skorch.callbacks.training.EarlyStopping",
- "torch.storage._load_from_bytes",
- "skorch.callbacks.scoring.EpochScoring",
- "modAL.uncertainty.uncertainty_sampling",
- "numpy.dtype",
- "collections.defaultdict",
- "skorch.callbacks.logging.ProgressBar",
- "builtins.dict",
- "numpy.core.multiarray._reconstruct",
- "tokenizers.Tokenizer",
- "torch._utils._rebuild_parameter",
- "skorch.callbacks.logging.EpochTimer",
- "skorch.callbacks.logging.PrintLog",
- "skorch.setter.optimizer_setter",
- "numpy.ndarray",
- "torch._utils._rebuild_tensor_v2",
- "collections.OrderedDict",
- "torch.utils.data.dataloader.DataLoader",
- "skorch.dataset.ValidSplit",
- "skorch.classifier.NeuralNetClassifier",
- "skorch.history.History",
- "sklearn.pipeline.Pipeline",
- "skorch.utils.to_numpy",
- "torch.optim.adamw.AdamW",
- "builtins.print",
- "skorch.callbacks.lr_scheduler.LRScheduler",
- "skorch.utils._indexing_other",
- "functools.partial"
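
Because the pickle records BertModule and lr_schedule under the qualified names __main__.BertModule and __main__.lr_schedule, learner.bin can only be unpickled by a script whose top level defines both names, for example the original training script or a script containing the definitions from the sketch above. A minimal loading sketch under that assumption:

```python
# Hedged loading sketch. Run this from a script that defines BertModule and
# lr_schedule at module level (see the construction sketch above); pickle
# resolves them by name as "__main__.BertModule" and "__main__.lr_schedule".
import pickle

# Pickle files can run arbitrary code while loading; only open trusted artifacts.
with open("learner.bin", "rb") as fh:
    learner = pickle.load(fh)  # a modAL ActiveLearner wrapping the sklearn Pipeline

# Typical active-learning calls once loaded (X_pool and y_new are hypothetical):
# query_idx, query_inst = learner.query(X_pool, n_instances=10)
# learner.teach(X_pool[query_idx], y_new)
# proba = learner.predict_proba(["a text to classify"])
```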