Eleonora Bernasconi
def4987 - PY3
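The "Detected Pickle imports" annotations below are produced by statically scanning each pickle's opcode stream, without ever loading the file. A minimal sketch of that kind of scan, using only the standard library's `pickletools` (the memo handling is simplified relative to a real scanner, and the helper name `detect_imports` is ours, not Hugging Face's):

```python
import pickle
import pickletools

def detect_imports(data: bytes) -> set[str]:
    """Statically list the globals a pickle would import, without running it."""
    found = set()
    strings = []  # recently pushed string arguments, consulted for STACK_GLOBAL
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name in ("GLOBAL", "INST"):
            # The argument is "module name", space separated.
            module, name = arg.split(" ", 1)
            found.add(f"{module}.{name}")
        elif opcode.name == "STACK_GLOBAL" and len(strings) >= 2:
            # STACK_GLOBAL (protocol 4+) takes module and name from the two
            # most recently pushed strings. Real scanners also track the
            # memo; this sketch does not.
            found.add(f"{strings[-2]}.{strings[-1]}")
        if isinstance(arg, str):
            strings.append(arg)
    return found

# Protocol 2 pickles carry the Python 2 module names even when written from
# Python 3, which is why the listing shows "__builtin__.set", "copy_reg", etc.
print(sorted(detect_imports(pickle.dumps({"a", "b"}, protocol=2))))
```

Running the scan over a protocol 2 pickle of a plain set is enough to reproduce one of the entries seen in the listing.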
Every `.pickle` file in the listing is flagged by Hugging Face's pickle scanner. All files share the same set of detected imports:

- "nltk.tokenize.punkt.PunktSentenceTokenizer"
- "nltk.tokenize.punkt.PunktParameters"
- "nltk.tokenize.punkt.PunktLanguageVars"
- "nltk.tokenize.punkt.PunktToken"
- "collections.defaultdict"
- "copy_reg._reconstructor"
- "__builtin__.int"
- "__builtin__.set"
- "__builtin__.object"

| File | Size | Detected Pickle imports |
| --- | --- | --- |
| czech.pickle | 1.27 MB | 9 (full set above) |
| danish.pickle | 1.26 MB | 9 (full set above) |
| dutch.pickle | 743 kB | 9 (full set above) |
| english.pickle | 433 kB | 9 (full set above) |
| estonian.pickle | 1.6 MB | 9 (full set above) |
| finnish.pickle | 1.95 MB | 9 (full set above) |
| french.pickle | 583 kB | 9 (full set above) |
| german.pickle | 1.53 MB | 9 (full set above) |
| greek.pickle | 1.95 MB | 9 (full set above) |
| italian.pickle | 658 kB | 9 (full set above) |
| malayalam.pickle | 221 kB | 7 (no copy_reg._reconstructor or __builtin__.object) |
| norwegian.pickle | 1.26 MB | 9 (full set above) |
| polish.pickle | 2.04 MB | 9 (full set above) |
| portuguese.pickle | 649 kB | 9 (full set above) |
| russian.pickle | 33 kB | 7 (no copy_reg._reconstructor, __builtin__.object, or __builtin__.int; __builtin__.long instead) |
| slovene.pickle | 833 kB | 9 (full set above) |
| spanish.pickle | 598 kB | 9 (full set above) |
| swedish.pickle | 1.03 MB | 9 (full set above) |
| turkish.pickle | 1.23 MB | 9 (full set above) |

Each flagged entry links to Hugging Face's "How to fix it?" guidance on pickle warnings.
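The scanner flags these files because unpickling can execute arbitrary code, and the `copy_reg` / `__builtin__` module names show these pickles were written under Python 2. One common mitigation, sketched here as an assumption-laden example rather than Hugging Face's official fix, is a restricted unpickler whose allowlist is exactly the globals the scan reports. Note that an overridden `find_class` sees the raw module names stored in the pickle, so both the Python 2 spellings and their Python 3 equivalents are listed:

```python
import io
import pickle

# Globals we are willing to resolve while unpickling; everything else is
# rejected. The NLTK entries are taken from the scanner output above and are
# commented out so this sketch runs without NLTK installed; uncomment them
# to load the actual punkt pickles.
ALLOWED = {
    ("collections", "defaultdict"),
    ("copy_reg", "_reconstructor"), ("copyreg", "_reconstructor"),
    ("__builtin__", "int"), ("builtins", "int"),
    ("__builtin__", "long"),
    ("__builtin__", "set"), ("builtins", "set"),
    ("__builtin__", "object"), ("builtins", "object"),
    # ("nltk.tokenize.punkt", "PunktSentenceTokenizer"),
    # ("nltk.tokenize.punkt", "PunktParameters"),
    # ("nltk.tokenize.punkt", "PunktLanguageVars"),
    # ("nltk.tokenize.punkt", "PunktToken"),
}

class RestrictedUnpickler(pickle.Unpickler):
    """Unpickler that refuses to import any global not on the allowlist."""
    def find_class(self, module, name):
        if (module, name) in ALLOWED:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"blocked global: {module}.{name}")

def restricted_loads(data: bytes):
    # encoding="latin-1" lets Python 3 decode the 8-bit strings found in
    # Python 2 era pickles such as these.
    return RestrictedUnpickler(io.BytesIO(data), encoding="latin-1").load()

# A harmless payload round-trips untouched:
print(restricted_loads(pickle.dumps({"ok": {1, 2, 3}})))

# A malicious pickle that tries to reach os.system is refused:
evil = b"cos\nsystem\n(S'echo pwned'\ntR."
try:
    restricted_loads(evil)
except pickle.UnpicklingError as exc:
    print(exc)
```

Newer NLTK releases also ship a pickle-free `punkt_tab` data package, which sidesteps the warning entirely for the punkt tokenizer.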