Latest commit: Update app.py (84656de, verified)

- 275 Bytes - Create .env
- 1.52 kB - initial commit
- 236 Bytes - initial commit
- 4.98 kB - Update app.py

appliance_model.pkl - 1.27 MB - Upload 2 files
Detected Pickle imports (7):
- "joblib.numpy_pickle.NumpyArrayWrapper"
- "sklearn.ensemble._forest.RandomForestClassifier"
- "numpy.ndarray"
- "numpy._core.multiarray._reconstruct"
- "None.dtype"
- "sklearn.tree._classes.DecisionTreeClassifier"
- "numpy.dtype"

label_encoder.pkl - 630 Bytes - Upload 3 files
Detected Pickle imports (6):
- "numpy.ndarray"
- "sklearn.preprocessing._label.LabelEncoder"
- "numpy.dtype"
- "None.dtype"
- "joblib.numpy_pickle.NumpyArrayWrapper"
- "numpy._core.multiarray._reconstruct"

- 65 Bytes - Update requirements.txt
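The imports flagged above (NumpyArrayWrapper, RandomForestClassifier, LabelEncoder, ndarray, dtype) are exactly what joblib emits when pickling scikit-learn objects, so the warnings are expected for this kind of artifact. A minimal sketch of how files like these are typically produced and reloaded — the toy data, feature layout, and class names are assumptions for illustration, not taken from the Space:

```python
# Sketch: produce and reload joblib artifacts like the ones flagged above.
# Toy data; bootstrap=False keeps the tiny forest deterministic.
import joblib
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.preprocessing import LabelEncoder, StandardScaler

X = np.array([[120.0, 1.0], [90.0, 0.0], [1500.0, 1.0], [2000.0, 0.0]])
y = ["fridge", "fridge", "heater", "heater"]

le = LabelEncoder()
scaler = StandardScaler()
model = RandomForestClassifier(n_estimators=10, bootstrap=False, random_state=0)
model.fit(scaler.fit_transform(X), le.fit_transform(y))

# joblib.dump is what leaves the pickle imports the scanner reports.
joblib.dump(model, "appliance_model.pkl")
joblib.dump(le, "label_encoder.pkl")
joblib.dump(scaler, "scaler.pkl")

# joblib.load unpickles, which can execute arbitrary code:
# only open pickle files from sources you trust.
model = joblib.load("appliance_model.pkl")
le = joblib.load("label_encoder.pkl")
scaler = joblib.load("scaler.pkl")
pred = le.inverse_transform(model.predict(scaler.transform([[100.0, 1.0]])))
print(pred[0])  # -> fridge
```

If the goal is to clear the scanner warnings rather than just load the files, one commonly suggested route is re-exporting the fitted estimators in a non-pickle format, e.g. with the skops library for scikit-learn objects.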
scaler.pkl - 1.14 kB - Upload 2 files
Detected Pickle imports (7):
- "numpy._core.multiarray.scalar"
- "None.dtype"
- "numpy.ndarray"
- "numpy.dtype"
- "sklearn.preprocessing._data.StandardScaler"
- "numpy._core.multiarray._reconstruct"
- "joblib.numpy_pickle.NumpyArrayWrapper"

xgb_model.pkl - 3.65 MB - Upload 3 files
Detected Pickle imports (3):
- "builtins.bytearray"
- "xgboost.sklearn.XGBClassifier"
- "xgboost.core.Booster"

xgb_scaler.pkl - 1.28 kB - Upload 3 files
Detected Pickle imports (7):
- "numpy.ndarray"
- "numpy.dtype"
- "None.dtype"
- "numpy._core.multiarray.scalar"
- "sklearn.preprocessing._data.StandardScaler"
- "joblib.numpy_pickle.NumpyArrayWrapper"
- "numpy._core.multiarray._reconstruct"