Latest commit 734fbbb: Update prediction.py

1.52 kB · initial commit
237 Bytes · initial commit

adaboost_logreg_10_features.pkl · Detected Pickle imports (11):
- "sklearn.compose._column_transformer.ColumnTransformer",
- "sklearn.preprocessing._encoders.OneHotEncoder",
- "numpy.float64",
- "sklearn.ensemble._weight_boosting.AdaBoostClassifier",
- "numpy.core.multiarray._reconstruct",
- "numpy.ndarray",
- "sklearn.preprocessing._data.MinMaxScaler",
- "sklearn.linear_model._logistic.LogisticRegression",
- "numpy.dtype",
- "builtins.slice",
- "sklearn.pipeline.Pipeline"
4 kB · Upload 14 files

adaboost_logreg_best.pkl · Detected Pickle imports (11):
- "sklearn.ensemble._weight_boosting.AdaBoostClassifier",
- "numpy.float64",
- "sklearn.pipeline.Pipeline",
- "builtins.slice",
- "sklearn.linear_model._logistic.LogisticRegression",
- "sklearn.compose._column_transformer.ColumnTransformer",
- "numpy.dtype",
- "sklearn.preprocessing._encoders.OneHotEncoder",
- "numpy.ndarray",
- "numpy.core.multiarray._reconstruct",
- "sklearn.preprocessing._data.MinMaxScaler"
4.59 kB · Upload 7 files

2.27 kB · Upload app.py
234 Bytes · Upload 11 files
4.4 kB · Upload eda.py
1.33 kB · Upload 14 files

kp.pkl · Detected Pickle imports (7):
- "kmodes.util.dissim.euclidean_dissim",
- "numpy.ndarray",
- "kmodes.util.dissim.matching_dissim",
- "numpy.core.multiarray.scalar",
- "kmodes.kprototypes.KPrototypes",
- "numpy.dtype",
- "numpy.core.multiarray._reconstruct"
5.82 kB · Upload 11 files

43 Bytes · Upload 11 files
15 kB · Update prediction.py
106 Bytes · Update requirements.txt

scaler.pkl · Detected Pickle imports (5):
- "numpy.dtype",
- "sklearn.preprocessing._data.StandardScaler",
- "numpy.core.multiarray.scalar",
- "numpy.ndarray",
- "numpy.core.multiarray._reconstruct"
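scaler.pkl's report is the smallest of the four: a fitted `sklearn.preprocessing.StandardScaler` plus the numpy machinery pickle needs to rebuild its fitted arrays. A stand-in sketch of such a file's contents; the fit data here is invented, not the project's real feature matrix:

```python
import pickle

import numpy as np
from sklearn.preprocessing import StandardScaler

# Fit on made-up one-column data; the real scaler.pkl was fitted on the
# project's features.
scaler = StandardScaler().fit(np.array([[1.0], [3.0], [5.0]]))

# Round-trip: pickling a fitted scaler stores numpy arrays (mean_, scale_),
# which is exactly why numpy.ndarray and _reconstruct appear in the scan.
restored = pickle.loads(pickle.dumps(scaler))
z = restored.transform(np.array([[3.0]]))  # the fit mean maps to z = 0
```

Because the file holds only arrays and a plain estimator object, this is the easiest artifact to migrate off pickle, for example by saving `mean_` and `scale_` as `.npy` files and reconstructing the scaler at load time.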
611 Bytes · Upload 11 files

1.02 MB · Upload 7 files
8.89 kB · Upload 14 files
14 kB · Upload 12 files