Each punkt model below is a Python 2 pickle; Hugging Face's pickle scanner lists the imports a file would trigger when unpickled. Every model with nine detected imports references the same set:

- "collections.defaultdict",
- "copy_reg._reconstructor",
- "__builtin__.int",
- "__builtin__.object",
- "__builtin__.set",
- "nltk.tokenize.punkt.PunktLanguageVars",
- "nltk.tokenize.punkt.PunktParameters",
- "nltk.tokenize.punkt.PunktSentenceTokenizer",
- "nltk.tokenize.punkt.PunktToken"

Models and sizes (exceptions noted):

- czech.pickle, 1.27 MB (9 imports)
- danish.pickle, 1.26 MB (9 imports)
- dutch.pickle, 743 kB (9 imports)
- english.pickle, 433 kB (9 imports)
- estonian.pickle, 1.6 MB (9 imports)
- finnish.pickle, 1.95 MB (9 imports)
- french.pickle, 583 kB (9 imports)
- german.pickle, 1.53 MB (9 imports)
- greek.pickle, 1.95 MB (9 imports)
- italian.pickle, 658 kB (9 imports)
- malayalam.pickle, 221 kB (7 imports: no "copy_reg._reconstructor" or "__builtin__.object")
- norwegian.pickle, 1.26 MB (9 imports)
- polish.pickle, 2.04 MB (9 imports)
- portuguese.pickle, 649 kB (9 imports)
- russian.pickle, 33 kB (7 imports: "__builtin__.long" instead of "__builtin__.int"; no "copy_reg._reconstructor" or "__builtin__.object")
- slovene.pickle, 833 kB (9 imports)
- spanish.pickle, 598 kB (9 imports)
- swedish.pickle, 1.03 MB (9 imports)
- turkish.pickle, 1.23 MB (9 imports)

The "__builtin__" and "copy_reg" module names mark these files as Python 2 pickles; Python 3 renamed those modules to "builtins" and "copyreg".