Latest commit: Upload SuperKart.csv (3d22f43, verified)

- 1.52 kB initial commit
- 309 Bytes Update Dockerfile
- 276 Bytes initial commit
- 862 kB Upload SuperKart.csv
- 1.95 kB Update app.py
bagging_best_model.pkl: detected pickle imports (11)
- "sklearn.compose._column_transformer.ColumnTransformer",
- "numpy.ndarray",
- "sklearn.preprocessing._data.StandardScaler",
- "numpy.float64",
- "sklearn.pipeline.Pipeline",
- "_codecs.encode",
- "joblib.numpy_pickle.NumpyArrayWrapper",
- "numpy._core.multiarray._reconstruct",
- "numpy.dtype",
- "sklearn.preprocessing._encoders.OneHotEncoder",
- "numpy._core.multiarray.scalar"
- 91.2 MB Upload bagging_best_model.pkl

random_forest_best_model.pkl: detected pickle imports (11)
- "numpy._core.multiarray._reconstruct",
- "numpy.dtype",
- "numpy.ndarray",
- "_codecs.encode",
- "sklearn.preprocessing._data.StandardScaler",
- "sklearn.preprocessing._encoders.OneHotEncoder",
- "sklearn.compose._column_transformer.ColumnTransformer",
- "numpy._core.multiarray.scalar",
- "joblib.numpy_pickle.NumpyArrayWrapper",
- "sklearn.pipeline.Pipeline",
- "numpy.float64"
- 51.4 MB Upload random_forest_best_model.pkl
- 137 Bytes Update requirements.txt
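The import lists above are the Hub's standard pickle scan for scikit-learn models serialized with joblib: a `Pipeline` wrapping a `ColumnTransformer` with `StandardScaler` and `OneHotEncoder` steps, plus numpy/joblib array machinery. A minimal sketch of producing and loading a file in that format (column names and the toy data are assumptions, not taken from SuperKart.csv; the point is the serialization round-trip and the trust caveat):

```python
# Round-trip a scikit-learn Pipeline with joblib -- the serialization format
# implied by the scanned imports (Pipeline, ColumnTransformer, StandardScaler,
# OneHotEncoder, joblib.numpy_pickle.NumpyArrayWrapper).
# Caveat: loading a .pkl executes arbitrary pickle bytecode, so only load
# files you trust; that risk is exactly why the Hub surfaces this scan.
import joblib
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import BaggingRegressor
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Toy frame standing in for the real training data (hypothetical columns).
X = pd.DataFrame({"weight": [9.3, 5.9, 17.5, 8.9],
                  "outlet_type": ["mart", "grocery", "mart", "grocery"]})
y = [3735.1, 443.4, 2097.3, 732.4]

pre = ColumnTransformer([
    ("num", StandardScaler(), ["weight"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["outlet_type"]),
])
model = Pipeline([("pre", pre), ("reg", BaggingRegressor(random_state=0))])
model.fit(X, y)

joblib.dump(model, "bagging_best_model.pkl")    # writes the scanned format
loaded = joblib.load("bagging_best_model.pkl")  # trusted file only
print(loaded.predict(X.head(1)))
```

For untrusted weights, a safer route is a pickle-free format such as skops or ONNX rather than loading the .pkl directly.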