Files in the repository (latest commit 9eb7a80, verified):

app.py - 5.34 kB - "Update app.py"
Dockerfile - 261 Bytes - "Create Dockerfile"
requirements.txt - 62 Bytes - "Update requirements.txt"
RF.joblib - 69.6 MB - "Upload RF.joblib"
SVM.joblib - 5.55 MB - "Upload SVM.joblib"
norm (1).joblib - 9.76 kB - "Upload norm (1).joblib"
norm (2).joblib - 9.82 kB - "Upload norm (2).joblib"
norm (4).joblib - 87.9 kB - "Upload norm (4).joblib"
norm.joblib - 22.1 kB - "Upload norm.joblib"

Two further files (1.52 kB and 233 Bytes) carry the message "initial commit".

The Hub's pickle scanner flags each .joblib file:

RF.joblib: Detected Pickle imports (7)
- "sklearn.ensemble._forest.RandomForestClassifier"
- "_codecs.encode"
- "numpy.dtype"
- "numpy.ndarray"
- "joblib.numpy_pickle.NumpyArrayWrapper"
- "sklearn.tree._classes.DecisionTreeClassifier"
- "numpy.core.multiarray._reconstruct"

SVM.joblib: Detected Pickle imports (6)
- "numpy.dtype"
- "joblib.numpy_pickle.NumpyArrayWrapper"
- "numpy.core.multiarray._reconstruct"
- "sklearn.svm._classes.SVC"
- "_codecs.encode"
- "numpy.ndarray"

norm (1).joblib, norm (2).joblib, norm (4).joblib, and norm.joblib: Detected Pickle imports (6) each, all with the same set
- "sklearn.preprocessing._data.MinMaxScaler"
- "numpy.ndarray"
- "_codecs.encode"
- "joblib.numpy_pickle.NumpyArrayWrapper"
- "numpy.dtype"
- "numpy.core.multiarray._reconstruct"

How to fix it?