add tokenizers
1569560 - 8.57 kB

NLTK Punkt sentence-tokenizer models. The Hub's pickle scanner detects the same seven imports in every file:

- "nltk.tokenize.punkt.PunktSentenceTokenizer"
- "nltk.tokenize.punkt.PunktParameters"
- "nltk.tokenize.punkt.PunktLanguageVars"
- "nltk.tokenize.punkt.PunktToken"
- "collections.defaultdict"
- "builtins.int"
- "builtins.set"

The one exception is malayalam.pickle, which references "__builtin__.int" and "__builtin__.set" (the Python 2 spellings of "builtins.int" and "builtins.set") instead, indicating it was pickled under Python 2.

File               Size
czech.pickle       1.12 MB
danish.pickle      1.19 MB
dutch.pickle       694 kB
english.pickle     407 kB
estonian.pickle    1.5 MB
finnish.pickle     1.85 MB
french.pickle      554 kB
german.pickle      1.46 MB
greek.pickle       876 kB
italian.pickle     615 kB
malayalam.pickle   221 kB
norwegian.pickle   1.18 MB
polish.pickle      1.74 MB
portuguese.pickle  612 kB
russian.pickle     33 kB
slovene.pickle     734 kB
spanish.pickle     562 kB
swedish.pickle     980 kB
turkish.pickle     1.02 MB
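Import lists like the ones above come from reading the pickle opcode stream without executing it. A minimal sketch of the same idea using the standard-library pickletools module (`pickle_imports` is an illustrative helper, not the Hub's actual scanner):

```python
import collections
import pickle
import pickletools

def pickle_imports(data: bytes) -> set[str]:
    """Collect the module.name globals a pickle stream would import,
    by walking its opcodes instead of unpickling it."""
    imports = set()
    strings = []  # recently pushed unicode strings, for STACK_GLOBAL
    for op, arg, _pos in pickletools.genops(data):
        if isinstance(arg, str) and "unicode" in op.name.lower():
            strings.append(arg)
        if op.name == "GLOBAL":
            # Protocols <= 3: the argument is "module name" in one string.
            module, name = arg.split(" ", 1)
            imports.add(f"{module}.{name}")
        elif op.name == "STACK_GLOBAL" and len(strings) >= 2:
            # Protocols >= 4: module and name are the two most recently
            # pushed strings.
            imports.add(f"{strings[-2]}.{strings[-1]}")
    return imports

# A defaultdict(int) pickles with references to its class and its
# default factory, two of the seven imports reported for every file above.
found = pickle_imports(pickle.dumps(collections.defaultdict(int, {"a": 1})))
```

Run against one of the punkt pickles, this should report the seven entries listed above. The key property is that the scan never calls `pickle.loads`, so untrusted files can be inspected without running their payload.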
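The "__builtin__" names in malayalam.pickle do not break loading under Python 3: the pickle module translates old Python 2 module names on load. A small sketch of that compatibility mapping, produced here with `fix_imports` rather than an actual Python 2 interpreter:

```python
import pickle

# Writing with protocol 2 and fix_imports=True emits the Python 2
# module name for builtins, just as a real Python 2 pickler would.
data = pickle.dumps({1, 2}, protocol=2, fix_imports=True)

# Python 3's loader (fix_imports=True is the default) maps
# __builtin__ back to builtins, so the set round-trips intact.
restored = pickle.loads(data)
```

The same mapping is what lets a Python 2 era file like malayalam.pickle deserialize alongside its Python 3 siblings.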