Upload 517 files — commit 8d88d9b (verified, PY3)

The files in this commit are NLTK Punkt sentence-tokenizer models stored as Python pickles; for each file, the hosting site's pickle scanner lists the globals ("Detected Pickle imports") that unpickling it would resolve.
czech.pickle — Detected Pickle imports (9):
- "nltk.tokenize.punkt.PunktToken",
- "collections.defaultdict",
- "copy_reg._reconstructor",
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "__builtin__.int",
- "__builtin__.set",
- "nltk.tokenize.punkt.PunktParameters",
- "__builtin__.object"
Size: 1.27 MB

danish.pickle — Detected Pickle imports (9):
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "__builtin__.int",
- "__builtin__.object",
- "collections.defaultdict",
- "nltk.tokenize.punkt.PunktToken",
- "__builtin__.set",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "nltk.tokenize.punkt.PunktParameters",
- "copy_reg._reconstructor"
Size: 1.26 MB

dutch.pickle — Detected Pickle imports (9):
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "nltk.tokenize.punkt.PunktParameters",
- "__builtin__.object",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "collections.defaultdict",
- "__builtin__.int",
- "__builtin__.set",
- "nltk.tokenize.punkt.PunktToken",
- "copy_reg._reconstructor"
Size: 743 kB

english.pickle — Detected Pickle imports (9):
- "__builtin__.int",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "copy_reg._reconstructor",
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "__builtin__.set",
- "collections.defaultdict",
- "nltk.tokenize.punkt.PunktParameters",
- "__builtin__.object",
- "nltk.tokenize.punkt.PunktToken"
Size: 433 kB

estonian.pickle — Detected Pickle imports (9):
- "nltk.tokenize.punkt.PunktToken",
- "collections.defaultdict",
- "copy_reg._reconstructor",
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "__builtin__.object",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "__builtin__.set",
- "__builtin__.int",
- "nltk.tokenize.punkt.PunktParameters"
Size: 1.6 MB

finnish.pickle — Detected Pickle imports (9):
- "__builtin__.object",
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "nltk.tokenize.punkt.PunktParameters",
- "nltk.tokenize.punkt.PunktToken",
- "collections.defaultdict",
- "__builtin__.set",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "copy_reg._reconstructor",
- "__builtin__.int"
Size: 1.95 MB

french.pickle — Detected Pickle imports (9):
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "collections.defaultdict",
- "__builtin__.int",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "nltk.tokenize.punkt.PunktToken",
- "copy_reg._reconstructor",
- "__builtin__.set",
- "nltk.tokenize.punkt.PunktParameters",
- "__builtin__.object"
Size: 583 kB

german.pickle — Detected Pickle imports (9):
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "__builtin__.object",
- "copy_reg._reconstructor",
- "nltk.tokenize.punkt.PunktToken",
- "nltk.tokenize.punkt.PunktParameters",
- "__builtin__.set",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "collections.defaultdict",
- "__builtin__.int"
Size: 1.53 MB

greek.pickle — Detected Pickle imports (9):
- "__builtin__.object",
- "collections.defaultdict",
- "copy_reg._reconstructor",
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "nltk.tokenize.punkt.PunktParameters",
- "__builtin__.set",
- "nltk.tokenize.punkt.PunktToken",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "__builtin__.int"
Size: 1.95 MB

italian.pickle — Detected Pickle imports (9):
- "collections.defaultdict",
- "copy_reg._reconstructor",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "__builtin__.int",
- "nltk.tokenize.punkt.PunktParameters",
- "__builtin__.set",
- "nltk.tokenize.punkt.PunktToken",
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "__builtin__.object"
Size: 658 kB

malayalam.pickle — Detected Pickle imports (7):
- "nltk.tokenize.punkt.PunktParameters",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "nltk.tokenize.punkt.PunktToken",
- "__builtin__.int",
- "__builtin__.set",
- "collections.defaultdict"
Size: 221 kB

norwegian.pickle — Detected Pickle imports (9):
- "nltk.tokenize.punkt.PunktLanguageVars",
- "collections.defaultdict",
- "nltk.tokenize.punkt.PunktToken",
- "__builtin__.object",
- "__builtin__.int",
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "nltk.tokenize.punkt.PunktParameters",
- "__builtin__.set",
- "copy_reg._reconstructor"
Size: 1.26 MB

polish.pickle — Detected Pickle imports (9):
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "nltk.tokenize.punkt.PunktParameters",
- "__builtin__.object",
- "__builtin__.int",
- "nltk.tokenize.punkt.PunktToken",
- "__builtin__.set",
- "collections.defaultdict",
- "copy_reg._reconstructor"
Size: 2.04 MB

portuguese.pickle — Detected Pickle imports (9):
- "collections.defaultdict",
- "copy_reg._reconstructor",
- "__builtin__.object",
- "nltk.tokenize.punkt.PunktToken",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "__builtin__.set",
- "nltk.tokenize.punkt.PunktParameters",
- "__builtin__.int",
- "nltk.tokenize.punkt.PunktSentenceTokenizer"
Size: 649 kB

russian.pickle — Detected Pickle imports (7):
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "nltk.tokenize.punkt.PunktParameters",
- "__builtin__.set",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "collections.defaultdict",
- "__builtin__.long",
- "nltk.tokenize.punkt.PunktToken"
Size: 33 kB

slovene.pickle — Detected Pickle imports (9):
- "nltk.tokenize.punkt.PunktLanguageVars",
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "__builtin__.int",
- "nltk.tokenize.punkt.PunktParameters",
- "nltk.tokenize.punkt.PunktToken",
- "__builtin__.set",
- "collections.defaultdict",
- "copy_reg._reconstructor",
- "__builtin__.object"
Size: 833 kB

spanish.pickle — Detected Pickle imports (9):
- "copy_reg._reconstructor",
- "collections.defaultdict",
- "__builtin__.int",
- "__builtin__.object",
- "nltk.tokenize.punkt.PunktParameters",
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "nltk.tokenize.punkt.PunktToken",
- "__builtin__.set",
- "nltk.tokenize.punkt.PunktLanguageVars"
Size: 598 kB

swedish.pickle — Detected Pickle imports (9):
- "__builtin__.int",
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "copy_reg._reconstructor",
- "__builtin__.set",
- "nltk.tokenize.punkt.PunktParameters",
- "collections.defaultdict",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "__builtin__.object",
- "nltk.tokenize.punkt.PunktToken"
Size: 1.03 MB

turkish.pickle — Detected Pickle imports (9):
- "nltk.tokenize.punkt.PunktToken",
- "nltk.tokenize.punkt.PunktParameters",
- "copy_reg._reconstructor",
- "__builtin__.int",
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "collections.defaultdict",
- "__builtin__.object",
- "__builtin__.set",
- "nltk.tokenize.punkt.PunktLanguageVars"
Size: 1.23 MB
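The "Detected Pickle imports" lists above can be reproduced without executing anything: a pickle stream names every class it will construct through `GLOBAL`/`STACK_GLOBAL` opcodes, and `pickletools.genops` disassembles the stream safely. A rough sketch of such a scanner (a simplification: the `STACK_GLOBAL` branch assumes the module and name were pushed as literal strings, and will miss strings recalled from the pickle memo):

```python
import collections
import pickle
import pickletools

def pickle_imports(data: bytes) -> set[tuple[str, str]]:
    """Collect the (module, name) globals a pickle stream references,
    without executing it, by disassembling its opcode stream."""
    ops = list(pickletools.genops(data))
    found = set()
    for i, (op, arg, _pos) in enumerate(ops):
        if op.name in ("GLOBAL", "INST"):
            # Protocols 0-3 carry "module name" inside one opcode.
            module, name = arg.split(" ", 1)
            found.add((module, name))
        elif op.name == "STACK_GLOBAL":
            # Protocol 4+ pushes module and name as the two most
            # recent string constants before this opcode.
            strings = [a for _o, a, _p in ops[:i] if isinstance(a, str)]
            module, name = strings[-2], strings[-1]
            found.add((module, name))
    return found

# A defaultdict pickle references the same kinds of globals flagged above.
sample = pickle.dumps(collections.defaultdict(int))
print(sorted(pickle_imports(sample)))
# [('builtins', 'int'), ('collections', 'defaultdict')]
```

The `__builtin__` and `copy_reg` entries reported for these files are Python 2 spellings of `builtins` and `copyreg`; they appear in pickles written by Python 2 or written with `fix_imports` at protocol 2 or lower.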
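As for fixing it: unpickling runs arbitrary code, so the standard mitigation when you must load pickles you did not produce is to restrict which globals the unpickler may resolve, per the "Restricting Globals" recipe in the Python `pickle` documentation. A minimal sketch under that assumption, with the allowlist built from exactly the imports reported above:

```python
import collections
import importlib
import io
import pickle

# The globals the scanner reports for these Punkt models, normalized
# to their Python 3 names.
ALLOWED = {
    ("collections", "defaultdict"),
    ("builtins", "int"),
    ("builtins", "set"),
    ("builtins", "object"),
    ("copyreg", "_reconstructor"),
    ("nltk.tokenize.punkt", "PunktSentenceTokenizer"),
    ("nltk.tokenize.punkt", "PunktParameters"),
    ("nltk.tokenize.punkt", "PunktLanguageVars"),
    ("nltk.tokenize.punkt", "PunktToken"),
}

# __builtin__/copy_reg are Python 2 module names. The default
# find_class remaps them (fix_imports), but overriding find_class
# bypasses that, so remap explicitly before the allowlist check.
PY2_FIXUPS = {
    ("__builtin__", "int"): ("builtins", "int"),
    ("__builtin__", "long"): ("builtins", "int"),  # Py3 folded long into int
    ("__builtin__", "set"): ("builtins", "set"),
    ("__builtin__", "object"): ("builtins", "object"),
    ("copy_reg", "_reconstructor"): ("copyreg", "_reconstructor"),
}

class RestrictedUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        module, name = PY2_FIXUPS.get((module, name), (module, name))
        if (module, name) not in ALLOWED:
            raise pickle.UnpicklingError(
                f"pickle global {module}.{name} is not on the allowlist")
        return getattr(importlib.import_module(module), name)

def restricted_loads(data: bytes):
    return RestrictedUnpickler(io.BytesIO(data)).load()

# Round-trip something the allowlist covers...
freq = restricted_loads(pickle.dumps(collections.defaultdict(int, {"the": 3})))
print(freq["the"])  # 3

# ...and confirm a dangerous global is refused.
try:
    restricted_loads(pickle.dumps(eval))
except pickle.UnpicklingError as exc:
    print(exc)  # pickle global builtins.eval is not on the allowlist
```

For Python-2-era pickles you may also need `encoding="latin-1"` on the `Unpickler` so old byte strings decode. In practice, the simplest route is not to unpickle by hand at all but to install the official `nltk_data` package and let NLTK load its own models, e.g. `nltk.data.load('tokenizers/punkt/english.pickle')`.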