Latest commit: aaa1344 · "Upload 43 files"

File                 Size       Detected Pickle imports
PY3/                 -          -
czech.pickle         1.27 MB    9
danish.pickle        1.26 MB    9
dutch.pickle         743 kB     9
english.pickle       433 kB     9
estonian.pickle      1.6 MB     9
finnish.pickle       1.95 MB    9
french.pickle        583 kB     9
german.pickle        1.53 MB    9
greek.pickle         1.95 MB    9
italian.pickle       658 kB     9
malayalam.pickle     221 kB     7
norwegian.pickle     1.26 MB    9
polish.pickle        2.04 MB    9
portuguese.pickle    649 kB     9
russian.pickle       33 kB      7
slovene.pickle       833 kB     9
spanish.pickle       598 kB     9
swedish.pickle       1.03 MB    9
turkish.pickle       1.23 MB    9

Detected Pickle imports: every file flagged with 9 imports shares the same set:
- "__builtin__.int"
- "nltk.tokenize.punkt.PunktToken"
- "collections.defaultdict"
- "__builtin__.set"
- "nltk.tokenize.punkt.PunktLanguageVars"
- "nltk.tokenize.punkt.PunktSentenceTokenizer"
- "__builtin__.object"
- "copy_reg._reconstructor"
- "nltk.tokenize.punkt.PunktParameters"

Exceptions:
- malayalam.pickle (7 imports): the set above minus "__builtin__.object" and "copy_reg._reconstructor".
- russian.pickle (7 imports): the set above minus "__builtin__.int", "__builtin__.object" and "copy_reg._reconstructor", plus "__builtin__.long".
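
The __builtin__ and copy_reg names indicate that these top-level pickles were written under Python 2 (Python 3 renamed those modules to builtins and copyreg). The import lists reported by the pickle scanner can be reproduced locally with the standard-library pickletools module. The sketch below is a minimal example, not the scanner's actual implementation, and assumes a file such as english.pickle has already been downloaded to the working directory: it walks the pickle opcode stream and collects the GLOBAL opcodes, which name the classes a pickle will import when loaded.

```python
import pickletools

# Minimal sketch: list the modules/classes a pickle imports on load.
# Assumes english.pickle is present in the working directory.
with open("english.pickle", "rb") as f:
    data = f.read()

imports = set()
for opcode, arg, pos in pickletools.genops(data):
    # GLOBAL carries "module name" as one space-separated string
    # (protocol <= 3; protocol 4+ STACK_GLOBAL would need stack tracking).
    if opcode.name == "GLOBAL":
        imports.add(arg.replace(" ", "."))

print(sorted(imports))
```

Run against english.pickle, this should print the same nine entries listed above.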
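
These files are the pre-trained Punkt sentence-tokenizer models shipped in nltk_data, and NLTK loads them by unpickling, which is why the scanner flags classes such as nltk.tokenize.punkt.PunktSentenceTokenizer. A minimal usage sketch, assuming NLTK is installed, an NLTK version that still loads the pickled punkt models (newer releases switched to the punkt_tab format), and punkt data available in an nltk_data directory:

```python
import nltk
import nltk.data

# Fetch the punkt package into nltk_data if it is not already present.
nltk.download("punkt", quiet=True)

# Unpickles PunktSentenceTokenizer (and the classes listed above),
# so only load this data from a source you trust.
tokenizer = nltk.data.load("tokenizers/punkt/english.pickle")

print(tokenizer.tokenize("Dr. Smith went to Washington. He arrived at 5 p.m. sharp."))
```

Because unpickling can execute arbitrary code, the practical safeguard is to obtain these files only from the official nltk_data distribution rather than an untrusted mirror.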