Upload untitled29.py · af042ac
1.7 kB · Upload libbitsandbytes_cuda116.dll
45.1 kB · Upload Analytics_API_Documentation_v0.1 (2).docx
39.7 kB · Upload Analytics_API_Documentation_v0.1 (4).docx
4.61 MB · Upload Files_Online2PDF.zip
25.1 kB · Upload Hunflair_API_Documentation_v0.1.docx
72.8 kB · Upload 2 files
34.6 kB · Upload Law_Insider_honda-auto-receivables-2006-2-owner-trust_honda-auto-receivables-2006-2-owner-trust-as_Filed_25-08-2006_Contract_removed.pdf
29.2 kB · Upload 2 files
8.08 kB · Upload META_DATA_INFO.txt
107 MB · Upload Py_tesseract.zip
23.7 MB · Upload 5 files
392 kB · Upload 5 files
3.61 kB · Upload colab_script.py
228 kB · Upload 2 files
276 kB · Upload 2 files
1.1 GB · Upload hunflair-celline-v1.0.pt
1.1 GB · Upload hunflair-chemical-full-v1.0.pt
1.1 GB · Upload hunflair-disease-full-v1.0.pt
1.1 GB · Upload hunflair-gene-full-v1.0.pt
1.1 GB · Upload hunflair-species-full-v1.1.pt
4.72 MB · Upload libbitsandbytes_cuda116.dll
punkt.zip
Detected Pickle imports (300 total; 12 unique)
- "__builtin__.int"
- "__builtin__.long"
- "__builtin__.object"
- "__builtin__.set"
- "builtins.int"
- "builtins.set"
- "collections.defaultdict"
- "copy_reg._reconstructor"
- "nltk.tokenize.punkt.PunktLanguageVars"
- "nltk.tokenize.punkt.PunktParameters"
- "nltk.tokenize.punkt.PunktSentenceTokenizer"
- "nltk.tokenize.punkt.PunktToken"
How to fix it?
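The flags on punkt.zip come from NLTK's pre-trained Punkt tokenizer pickles bundled inside the archive. Before trusting (or replacing) such a file, you can audit which globals its pickles would import without ever executing them, using only the standard library. This is a minimal sketch under stated assumptions: `referenced_globals` is a hypothetical helper name, and the `GLOBAL`/`STACK_GLOBAL` handling follows the pickle opcode conventions (protocol ≤ 3 encodes "module name" in one string; protocol 4+ pushes module and name as separate strings first):

```python
import collections
import pickle
import pickletools

def referenced_globals(data: bytes) -> list[str]:
    """List the module.name globals a pickle would import, WITHOUT loading it."""
    found = []
    strings = []  # string constants seen so far; STACK_GLOBAL consumes the last two
    for op, arg, _pos in pickletools.genops(data):
        if op.name == "GLOBAL":           # protocol <= 3: arg is "module name"
            found.append(arg.replace(" ", "."))
        elif op.name == "STACK_GLOBAL":   # protocol 4+: module and name pushed as strings
            found.append(".".join(strings[-2:]))
        elif isinstance(arg, str):
            strings.append(arg)
    return found

# Example: a defaultdict pickle references the same kind of globals the
# scanner flags above (collections.defaultdict plus builtins).
payload = pickle.dumps(collections.defaultdict(list))
print(referenced_globals(payload))
```

If every referenced global is a known NLTK/stdlib class, as in the list above, the archive is almost certainly just serialized tokenizer state. An alternative that sidesteps bundled pickles entirely is to let consumers fetch the official data themselves via `nltk.download("punkt")`.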
13.9 MB · Upload punkt.zip
9.25 kB · Upload request.txt
1.68 kB · Upload requirements_vicuna.txt
727 kB · Upload scansmpl_merged.pdf
129 kB · Upload similarity_samples.zip
87.1 kB · Upload similaritydocsSAMPLE.7z
34.3 kB · Upload stopwords.zip
1.86 kB · Upload test_model.py
826 kB · Upload textclusteringDBSCAN (2).7z
6.86 MB · Upload transformers-4.28.0.dev0-py3-none-any.whl
2.93 kB · Upload untitled29.py
vicuna_LLM.zip
Detected Pickle imports (30 total; 12 unique)
- "__builtin__.slice"
- "numpy.core.multiarray._reconstruct"
- "numpy.dtype"
- "numpy.ndarray"
- "pandas.core.frame.DataFrame"
- "pandas.core.indexes.base.Index"
- "pandas.core.indexes.base._new_Index"
- "pandas.core.indexes.frozen.FrozenNDArray"
- "pandas.core.indexes.multi.MultiIndex"
- "pandas.core.indexes.range.RangeIndex"
- "pandas.core.internals.BlockManager"
- "pandas.core.internals.managers.BlockManager"
How to fix it?
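For vicuna_LLM.zip the flagged imports are ordinary numpy/pandas classes, which suggests the archive holds pickled DataFrames rather than executable payloads. One way to make such data pass the scanner cleanly, assuming you can regenerate it, is to re-export each DataFrame to a value-only format such as CSV (or Parquet) instead of pickle. A minimal sketch, with a hypothetical stand-in `df` for whatever was stored via `df.to_pickle(...)`:

```python
import io

import pandas as pd

# Hypothetical stand-in for a DataFrame that was previously saved with
# df.to_pickle(...) inside the archive.
df = pd.DataFrame({"doc_id": [1, 2, 3], "score": [0.91, 0.42, 0.77]})

# CSV stores values only, never code, so readers bypass pickle entirely.
buf = io.StringIO()
df.to_csv(buf, index=False)
buf.seek(0)
restored = pd.read_csv(buf)
print(restored.equals(df))
```

`pd.read_csv` never invokes the pickle machinery, so the "Detected Pickle imports" scanner has nothing to flag; `to_parquet`/`read_parquet` behave the same while preserving dtypes exactly.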
328 MB · Upload vicuna_LLM.zip