Commit e2b1d98 · "Upload 88 files"

Every Punkt tokenizer model in this upload is a Python pickle, and the repository's pickle scanner reports the same seven imports for each one:

- "nltk.tokenize.punkt.PunktSentenceTokenizer"
- "nltk.tokenize.punkt.PunktParameters"
- "nltk.tokenize.punkt.PunktLanguageVars"
- "nltk.tokenize.punkt.PunktToken"
- "collections.defaultdict"
- "builtins.int"
- "builtins.set"

The one exception is malayalam.pickle, which was serialized under Python 2 and therefore references "__builtin__.int" and "__builtin__.set" in place of the Python 3 "builtins" equivalents.

Files in the upload:

czech.pickle       1.12 MB
danish.pickle      1.19 MB
dutch.pickle       694 kB
english.pickle     407 kB
estonian.pickle    1.5 MB
finnish.pickle     1.85 MB
french.pickle      554 kB
german.pickle      1.46 MB
greek.pickle       876 kB
italian.pickle     615 kB
malayalam.pickle   221 kB
norwegian.pickle   1.18 MB
polish.pickle      1.74 MB
portuguese.pickle  612 kB
russian.pickle     33 kB
slovene.pickle     734 kB
spanish.pickle     562 kB
swedish.pickle     980 kB
turkish.pickle     1.02 MB