Latest commit: Updating doencas.json (1eaac0d)
-  191 Bytes  Upload scripts, docker settings, requirements
-  1.52 kB    initial commit
-  973 Bytes  Trying to segment again.
-  415 Bytes  Update README.md
-  6.31 kB    Updating doencas.json
-  219 Bytes  trying without segmentation
encoder.pkl  1.65 kB  Trying new binaries
Detected Pickle imports (4):
- numpy.ndarray
- numpy.dtype
- numpy.core.multiarray._reconstruct
- sklearn.preprocessing._label.LabelEncoder
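The "Detected Pickle imports" lists shown for each .pkl file can be reproduced locally without ever loading (and thus executing) the pickle: the opcode stream names every global the file would import. A minimal stdlib-only sketch, using a stand-in payload since the real artifacts are not at hand; in practice you would scan `open("encoder.pkl", "rb").read()`:

```python
import pickle
import pickletools
from collections import OrderedDict

def detect_imports(data: bytes) -> set[str]:
    """List every module.name a pickle would import, without loading it.

    Heuristic: STACK_GLOBAL (protocol >= 4) takes its module/name from
    the two most recently pushed strings; adequate for typical model pickles.
    """
    imports: set[str] = set()
    strings: list[str] = []
    for op, arg, pos in pickletools.genops(data):
        if op.name == "GLOBAL":          # protocols <= 3: arg is "module name"
            imports.add(arg.replace(" ", "."))
        elif op.name == "STACK_GLOBAL":  # protocols >= 4: module/name on stack
            imports.add(f"{strings[-2]}.{strings[-1]}")
        if isinstance(arg, str):
            strings.append(arg)
    return imports

# Stand-in payload; any pickled object with a class reference will do.
payload = pickle.dumps(OrderedDict(a=1))
print(detect_imports(payload))
```

For `encoder.pkl` this scan would report the four entries listed above, ending in `sklearn.preprocessing._label.LabelEncoder`.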
-  2.43 kB    Add debug endpoint to view the segmented image.
-  7.85 kB    Add debug endpoint to view the segmented image.
-  434 Bytes  Trying to segment again.
scaler.pkl  37.3 kB  Trying new binaries
Detected Pickle imports (5):
- numpy.ndarray
- sklearn.preprocessing._data.StandardScaler
- numpy.core.multiarray.scalar
- numpy.core.multiarray._reconstruct
- numpy.dtype
svm_model.pkl  3.4 MB  Trying to segment again.
Detected Pickle imports (4):
- numpy.ndarray
- sklearn.svm._classes.SVC
- numpy.core.multiarray._reconstruct
- numpy.dtype
umap_reducer.pkl  256 MB  Trying to segment again.
Detected Pickle imports (13):
- numpy.core.multiarray._reconstruct
- numpy.float32
- numba.core.serialize.custom_rebuild
- numpy.core._multiarray_umath.sqrt
- umap.umap_.UMAP
- pynndescent.pynndescent_.NNDescent
- numpy.ndarray
- numpy.dtype
- numpy.random._pickle.__randomstate_ctor
- scipy.sparse._csr.csr_matrix
- numpy.random._pickle.__bit_generator_ctor
- numba.core.serialize._unpickle__CustomPickled
- numpy.core.multiarray.scalar
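These warnings exist because `pickle.load` will import and call arbitrary globals, so a malicious .pkl can execute code. One stdlib mitigation, following the restricting-globals recipe in the Python `pickle` documentation, is an `Unpickler` whose `find_class` only admits an allow-list built from the imports detected above. The allow-list here holds a single stdlib entry so the sketch runs without numpy/sklearn/umap installed; for the real artifacts it would contain the names listed for each file, e.g. `("sklearn.svm._classes", "SVC")`:

```python
import io
import pickle
from collections import OrderedDict

# Allow-list of (module, name) pairs. In practice, populate this from the
# "Detected Pickle imports" entries above; a stdlib pair is used for the demo.
ALLOWED = {("collections", "OrderedDict")}

class RestrictedUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        # Refuse any global not explicitly allow-listed.
        if (module, name) in ALLOWED:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"blocked global: {module}.{name}")

def restricted_load(data: bytes):
    """Load a pickle, rejecting any global outside ALLOWED."""
    return RestrictedUnpickler(io.BytesIO(data)).load()

ok = restricted_load(pickle.dumps(OrderedDict(a=1)))  # allowed, loads fine
try:
    restricted_load(pickle.dumps(pickle.Unpickler))   # class not on the list
except pickle.UnpicklingError:
    print("blocked as expected")
```

A stronger fix, and what the Hub warning nudges toward, is avoiding pickle entirely for distribution, e.g. exporting the arrays to a non-executable format; the restricted unpickler is the stopgap when the .pkl files must stay.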