NLTK Punkt sentence tokenizer models: Hugging Face repository file listing (latest commit 324bf29).

    PY3/                  (directory)
    (name not captured)   6.15 kB
    (name not captured)   8.57 kB
    czech.pickle          1.42 MB
    danish.pickle         1.43 MB
    dutch.pickle          840 kB
    english.pickle        495 kB
    estonian.pickle       1.8 MB
    finnish.pickle        2.19 MB
    french.pickle         664 kB
    german.pickle         1.71 MB
    greek.pickle          2.04 MB
    italian.pickle        749 kB
    malayalam.pickle      221 kB
    norwegian.pickle      1.42 MB
    polish.pickle         2.29 MB
    portuguese.pickle     740 kB
    russian.pickle        33 kB
    slovene.pickle        940 kB
    spanish.pickle        680 kB
    swedish.pickle        1.17 MB
    turkish.pickle        1.36 MB

Hugging Face's pickle scanner reports, for each file, the imports that unpickling it would trigger. Every model except malayalam.pickle and russian.pickle reports the same three:

    __builtin__.object
    copy_reg._reconstructor
    nltk.tokenize.punkt.PunktSentenceTokenizer

(The __builtin__ and copy_reg module names indicate these top-level pickles were written by Python 2; the PY3/ directory holds the Python 3 builds.)

malayalam.pickle reports seven imports:

    nltk.tokenize.punkt.PunktParameters
    nltk.tokenize.punkt.PunktLanguageVars
    nltk.tokenize.punkt.PunktSentenceTokenizer
    nltk.tokenize.punkt.PunktToken
    __builtin__.int
    __builtin__.set
    collections.defaultdict

russian.pickle also reports seven:

    nltk.tokenize.punkt.PunktSentenceTokenizer
    nltk.tokenize.punkt.PunktParameters
    __builtin__.set
    nltk.tokenize.punkt.PunktLanguageVars
    collections.defaultdict
    __builtin__.long
    nltk.tokenize.punkt.PunktToken