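Scan results like the ones below come from walking a pickle's opcode stream rather than loading it: every `GLOBAL` / `STACK_GLOBAL` opcode names a `module.attribute` the pickle would import. The stdlib `pickletools` module is enough for a simplified version of such a scanner. This is a sketch, not the scanner actually used here: `pickle_imports` is an illustrative name, and it does not track the pickle memo, so it can miss names that a production scanner would catch.

```python
import collections
import pickle
import pickletools

def pickle_imports(data: bytes) -> set[str]:
    """Collect every 'module.name' global a pickle would import,
    without ever executing the pickle itself."""
    found = set()
    strings = []  # running list of string arguments, used to resolve STACK_GLOBAL
    for op, arg, _pos in pickletools.genops(data):
        if op.name == "GLOBAL":
            # Protocol <= 3 spelling: a single "module name" argument.
            module, _, name = arg.partition(" ")
            found.add(f"{module}.{name}")
        elif op.name == "STACK_GLOBAL" and len(strings) >= 2:
            # Protocol 4+ spelling: module and name were pushed as two strings.
            found.add(f"{strings[-2]}.{strings[-1]}")
        if isinstance(arg, str):
            strings.append(arg)
    return found

if __name__ == "__main__":
    # A harmless pickle that references collections.defaultdict and builtins.int.
    blob = pickle.dumps(collections.defaultdict(int))
    print(sorted(pickle_imports(blob)))
```

Running it on a pickled `defaultdict(int)` reports `collections.defaultdict` and `builtins.int` — the same kind of output shown for each file below.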
Every punkt model pickle was flagged by the scan with the same seven detected imports:

- "nltk.tokenize.punkt.PunktSentenceTokenizer"
- "nltk.tokenize.punkt.PunktParameters"
- "nltk.tokenize.punkt.PunktLanguageVars"
- "nltk.tokenize.punkt.PunktToken"
- "collections.defaultdict"
- "builtins.int"
- "builtins.set"

The one deviation is malayalam.pickle, which was serialized under Python 2 and therefore lists "__builtin__.int" and "__builtin__.set" in place of the Python 3 "builtins" names.

| File | Size |
| --- | --- |
| czech.pickle | 1.12 MB |
| danish.pickle | 1.19 MB |
| dutch.pickle | 694 kB |
| english.pickle | 407 kB |
| estonian.pickle | 1.5 MB |
| finnish.pickle | 1.85 MB |
| french.pickle | 554 kB |
| german.pickle | 1.46 MB |
| greek.pickle | 876 kB |
| italian.pickle | 615 kB |
| malayalam.pickle | 221 kB |
| norwegian.pickle | 1.18 MB |
| polish.pickle | 1.74 MB |
| portuguese.pickle | 612 kB |
| russian.pickle | 33 kB |
| slovene.pickle | 734 kB |
| spanish.pickle | 562 kB |
| swedish.pickle | 980 kB |
| turkish.pickle | 1.02 MB |
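The standard mitigation for flagged pickles like these is to never call bare `pickle.load` on them, but instead to use an `Unpickler` whose `find_class` resolves only the globals the scan found, following the "Restricting Globals" recipe from the `pickle` module documentation. A minimal sketch, assuming the import lists above are complete; the names `PunktUnpickler` and `safe_load` are illustrative and not part of NLTK:

```python
import collections
import io
import pickle

# Exactly the globals the scan found in the punkt pickles, plus the
# Python 2 spellings of int and set that malayalam.pickle uses.
ALLOWED = {
    ("nltk.tokenize.punkt", "PunktSentenceTokenizer"),
    ("nltk.tokenize.punkt", "PunktParameters"),
    ("nltk.tokenize.punkt", "PunktLanguageVars"),
    ("nltk.tokenize.punkt", "PunktToken"),
    ("collections", "defaultdict"),
    ("builtins", "int"),
    ("builtins", "set"),
    ("__builtin__", "int"),
    ("__builtin__", "set"),
}

class PunktUnpickler(pickle.Unpickler):
    """Unpickler that refuses to resolve any global not on the allowlist."""

    def find_class(self, module, name):
        if (module, name) not in ALLOWED:
            raise pickle.UnpicklingError(f"blocked global: {module}.{name}")
        # Delegating to the base class keeps pickle's normal behaviour,
        # including the Python 2 -> 3 module-name mapping for old pickles.
        return super().find_class(module, name)

def safe_load(data: bytes):
    """Load a punkt-style pickle from bytes with the allowlist enforced."""
    return PunktUnpickler(io.BytesIO(data)).load()
```

Loading e.g. english.pickle would then be `safe_load(open("english.pickle", "rb").read())`; a pickle that smuggles in any other global, such as a reference to `os.system`, fails with `UnpicklingError` instead of resolving the attacker's global.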