Commit 107f987: "Upload 243 files"

PY3                 (subfolder)

Each of the punkt tokenizer pickles below triggers the Hugging Face pickle scanner. All of them except malayalam.pickle and russian.pickle declare the same three Detected Pickle imports (3):

- "__builtin__.object"
- "copy_reg._reconstructor"
- "nltk.tokenize.punkt.PunktSentenceTokenizer"

czech.pickle        1.42 MB
danish.pickle       1.43 MB
dutch.pickle        840 kB
english.pickle      495 kB
estonian.pickle     1.8 MB
finnish.pickle      2.19 MB
french.pickle       664 kB
german.pickle       1.71 MB
greek.pickle        2.04 MB
italian.pickle      749 kB
norwegian.pickle    1.42 MB
polish.pickle       2.29 MB
portuguese.pickle   740 kB
slovene.pickle      940 kB
spanish.pickle      680 kB
swedish.pickle      1.17 MB
turkish.pickle      1.36 MB

malayalam.pickle    221 kB    Detected Pickle imports (7):
- "nltk.tokenize.punkt.PunktParameters"
- "nltk.tokenize.punkt.PunktLanguageVars"
- "nltk.tokenize.punkt.PunktSentenceTokenizer"
- "nltk.tokenize.punkt.PunktToken"
- "__builtin__.int"
- "__builtin__.set"
- "collections.defaultdict"

russian.pickle      33 kB     Detected Pickle imports (7):
- "nltk.tokenize.punkt.PunktSentenceTokenizer"
- "nltk.tokenize.punkt.PunktParameters"
- "__builtin__.set"
- "nltk.tokenize.punkt.PunktLanguageVars"
- "collections.defaultdict"
- "__builtin__.long"
- "nltk.tokenize.punkt.PunktToken"
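The "Detected Pickle imports" lists above are produced by scanning the pickle's opcode stream for the globals it would import on load (the `__builtin__`/`copy_reg` names show these files were written under Python 2). Below is a minimal stdlib sketch of both sides of that story: a scanner for GLOBAL opcodes, and a restricted `Unpickler` that refuses any global not on an allowlist. The function and class names (`detect_pickle_imports`, `AllowlistedUnpickler`) are my own illustrations, not the scanner Hugging Face actually runs, and the GLOBAL-only scan covers pickle protocols <= 3 such as these files; protocol 4+ uses STACK_GLOBAL and would need stack tracking as well.

```python
import collections
import io
import pickle
import pickletools


def detect_pickle_imports(payload: bytes) -> set:
    """Report the module.attribute names a pickle would import on load.

    Scans for GLOBAL opcodes; pickletools yields the GLOBAL argument as
    a space-separated "module name" pair.
    """
    found = set()
    for opcode, arg, _pos in pickletools.genops(payload):
        if opcode.name == "GLOBAL":
            module, _, name = arg.partition(" ")
            found.add(f"{module}.{name}")
    return found


class AllowlistedUnpickler(pickle.Unpickler):
    """Refuse every global except an explicit allowlist (here, only
    collections.OrderedDict for the demo below)."""

    ALLOWED = {("collections", "OrderedDict")}

    def find_class(self, module, name):
        if (module, name) not in self.ALLOWED:
            raise pickle.UnpicklingError(f"blocked import: {module}.{name}")
        return super().find_class(module, name)


# Protocol 2 matches the era of the files in this listing.
payload = pickle.dumps(collections.OrderedDict(a=1), protocol=2)
print(detect_pickle_imports(payload))  # {'collections.OrderedDict'}
print(AllowlistedUnpickler(io.BytesIO(payload)).load())
```

An unpickler restricted this way turns the "How to fix it?" question into policy: loading a punkt pickle would mean allowlisting exactly the classes the scanner reports (e.g. `nltk.tokenize.punkt.PunktSentenceTokenizer`) and nothing else.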