car_price_preprocessor.pkl — Detected Pickle imports (14):
- "numpy.float64",
- "sklearn.preprocessing._data.StandardScaler",
- "joblib.numpy_pickle.NumpyArrayWrapper",
- "numpy.dtype",
- "numpy._core.multiarray._reconstruct",
- "sklearn.preprocessing._label.MultiLabelBinarizer",
- "__main__.MultiLabelBinarizerTransformer",
- "sklearn.compose._column_transformer.ColumnTransformer",
- "sklearn.preprocessing._encoders.OneHotEncoder",
- "sklearn.compose._column_transformer._RemainderColsList",
- "None.dtype",
- "numpy.ndarray",
- "sklearn.pipeline.Pipeline",
- "sklearn.impute._base.SimpleImputer"
How to fix it?
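The import list above includes "__main__.MultiLabelBinarizerTransformer", a custom class that was defined in the training script's top-level module. Unpickling fails with an AttributeError unless a class with that exact name exists in the `__main__` module of the loading process. Below is a minimal sketch of that fix; the class body (a wrapper making MultiLabelBinarizer usable inside a Pipeline/ColumnTransformer) is an assumption, since the original implementation is not shown — unpickling only needs the name to resolve, but the methods should match how the object is later used.

```python
import os

from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.preprocessing import MultiLabelBinarizer


class MultiLabelBinarizerTransformer(BaseEstimator, TransformerMixin):
    """Hypothetical stand-in for the class the pickle expects: a thin
    wrapper so MultiLabelBinarizer can sit inside a Pipeline or
    ColumnTransformer (whose fit passes an extra y argument).
    The real training-time implementation may differ."""

    def fit(self, X, y=None):
        self.mlb_ = MultiLabelBinarizer()
        self.mlb_.fit(X)
        return self

    def transform(self, X):
        return self.mlb_.transform(X)


# With the class defined in the script you actually run (i.e. in
# __main__), joblib can resolve "__main__.MultiLabelBinarizerTransformer".
if os.path.exists("car_price_preprocessor.pkl"):
    import joblib

    preprocessor = joblib.load("car_price_preprocessor.pkl")
```

If the loading code lives in an imported module instead of the main script, the pickle will still look for the class under `__main__`; defining it in the entry-point script (or aliasing it there) is the usual workaround.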
stacking_car_price_model.pkl (53.9 MB) — Detected Pickle imports (16):
- "None.dtype",
- "sklearn.tree._classes.DecisionTreeRegressor",
- "collections.OrderedDict",
- "numpy.ndarray",
- "lightgbm.sklearn.LGBMRegressor",
- "sklearn.ensemble._stacking.StackingRegressor",
- "lightgbm.basic.Booster",
- "numpy.dtype",
- "numpy._core.multiarray._reconstruct",
- "sklearn.tree._tree.Tree",
- "collections.defaultdict",
- "xgboost.sklearn.XGBRegressor",
- "builtins.bytearray",
- "xgboost.core.Booster",
- "joblib.numpy_pickle.NumpyArrayWrapper",
- "sklearn.ensemble._forest.RandomForestRegressor"
How to fix it?
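For this file the import list points at a different failure mode: the stacking model embeds fitted XGBoost, LightGBM, and scikit-learn estimators, so unpickling raises ModuleNotFoundError if any of those libraries is missing (and can misbehave if versions differ from training time). A minimal sketch of an environment check before loading, assuming only that all three packages must be importable — also note that unpickling runs arbitrary code, so only load pickles from sources you trust:

```python
import importlib.util
import os

# Packages referenced by the pickle's import list above.
required = ["sklearn", "xgboost", "lightgbm"]
missing = [name for name in required if importlib.util.find_spec(name) is None]

if missing:
    # Install (ideally pinning the training-time versions) before loading.
    print("Install these packages before loading the model:", missing)
elif os.path.exists("stacking_car_price_model.pkl"):
    import joblib

    model = joblib.load("stacking_car_price_model.pkl")
```

Pinning the same library versions used at training time in requirements.txt is the safer long-term fix, since tree-model internals (e.g. xgboost.core.Booster, lightgbm.basic.Booster) are not guaranteed to unpickle cleanly across major versions.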