Pickle security scan results for the repository's serialized artifacts (commit 1b70843, "Upload 45 files", verified).

churn_model.pkl: Detected Pickle imports (6):
- "sklearn.linear_model._logistic.LogisticRegression",
- "joblib.numpy_pickle.NumpyArrayWrapper",
- "numpy.dtype",
- "numpy.core.multiarray._reconstruct",
- "numpy.ndarray",
- "_codecs.encode"
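The scanner's job is to list the `module.name` globals a pickle stream would import, without ever running it. A minimal sketch of that idea using only the standard library's `pickletools` (this is an illustration of the technique, not Hugging Face's actual scanner):

```python
import pickle
import pickletools

def pickled_imports(data: bytes) -> set[str]:
    """Collect the module.name globals a pickle stream references,
    without executing any of it."""
    found, strings = set(), []
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # Protocols <= 3: the argument is "module name".
            found.add(arg.replace(" ", "."))
        elif opcode.name == "STACK_GLOBAL" and len(strings) >= 2:
            # Protocols >= 4: module and name were pushed by the
            # two preceding string opcodes.
            found.add(f"{strings[-2]}.{strings[-1]}")
        if isinstance(arg, str):
            strings.append(arg)
    return found

# A function pickles as a reference to its module and qualified name:
data = pickle.dumps(pickletools.dis)
print(pickled_imports(data))  # includes 'pickletools.dis'
```

Note that joblib-produced `.pkl` files like these are pickle streams under the hood, so the same opcode walk applies, which is why the scanner can report the import lists above without loading the models.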
(churn_model.pkl: 3.92 kB)

churn_scaler.pkl: Detected Pickle imports (7):
- "joblib.numpy_pickle.NumpyArrayWrapper",
- "numpy.core.multiarray.scalar",
- "numpy.dtype",
- "numpy.core.multiarray._reconstruct",
- "numpy.ndarray",
- "_codecs.encode",
- "sklearn.preprocessing._data.StandardScaler"
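Since the scan shows exactly which globals a given file needs, one standard mitigation is an allow-list unpickler (the pattern recommended in the Python `pickle` documentation): refuse any global outside the expected set before it is imported. A sketch using the imports reported for churn_scaler.pkl; note that `joblib.load` is the usual entry point for these files and layers on top of `pickle`, so this covers only the plain-pickle case:

```python
import importlib
import io
import pickle

# The globals the scanner reported for churn_scaler.pkl. Anything
# outside this allow-list is refused before it is even imported.
ALLOWED = {
    ("joblib.numpy_pickle", "NumpyArrayWrapper"),
    ("numpy.core.multiarray", "scalar"),
    ("numpy", "dtype"),
    ("numpy.core.multiarray", "_reconstruct"),
    ("numpy", "ndarray"),
    ("_codecs", "encode"),
    ("sklearn.preprocessing._data", "StandardScaler"),
}

class AllowlistUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        if (module, name) in ALLOWED:
            return getattr(importlib.import_module(module), name)
        raise pickle.UnpicklingError(f"blocked pickle global: {module}.{name}")

def restricted_loads(data: bytes):
    return AllowlistUnpickler(io.BytesIO(data)).load()

# Plain containers need no globals and load fine; anything that pulls
# in an unlisted global raises before any code from it can run:
print(restricted_loads(pickle.dumps({"a": 1})))  # {'a': 1}
```

This narrows the attack surface but does not make the listed classes themselves safe to trust; it only guarantees nothing outside the allow-list gets imported during deserialization.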
(churn_scaler.pkl: 5.05 kB)

ltv_model.pkl: Detected Pickle imports (12):
- "sklearn.ensemble._gb.GradientBoostingRegressor",
- "sklearn._loss.loss.HalfSquaredError",
- "total_data_gb.__pyx_unpickle_CyHalfSquaredError",
- "numpy.core.multiarray._reconstruct",
- "total_data_gb.CyHalfSquaredError",
- "numpy.dtype",
- "sklearn._loss.link.IdentityLink",
- "numpy.ndarray",
- "avg_mos_score.Interval",
- "_codecs.encode",
- "joblib.numpy_pickle.NumpyArrayWrapper",
- "sklearn.dummy.DummyRegressor"
(ltv_model.pkl: 644 kB)

ltv_scaler.pkl: Detected Pickle imports (7):
- "joblib.numpy_pickle.NumpyArrayWrapper",
- "numpy.core.multiarray.scalar",
- "numpy.dtype",
- "numpy.core.multiarray._reconstruct",
- "numpy.ndarray",
- "_codecs.encode",
- "sklearn.preprocessing._data.StandardScaler"
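For the two scaler files, another way to answer "how to fix it" is to not ship a pickle at all: a fitted `StandardScaler` is fully described by its `mean_` and `scale_` arrays, which can be stored with NumPy's pickle-free `.npz` format and reapplied as an affine transform. A sketch with hypothetical values standing in for the real fitted statistics:

```python
import numpy as np

# Hypothetical stand-ins for the fitted scaler's learned statistics;
# with the real artifact these would be scaler.mean_ and scaler.scale_.
mean_ = np.array([10.0, 0.5])
scale_ = np.array([2.0, 0.1])

# savez writes plain arrays; no pickle is involved.
np.savez("ltv_scaler.npz", mean=mean_, scale=scale_)

# At serving time: load the arrays (np.load defaults to
# allow_pickle=False) and apply what StandardScaler.transform
# would compute: (x - mean) / scale.
stats = np.load("ltv_scaler.npz")
x = np.array([[12.0, 0.7]])
x_scaled = (x - stats["mean"]) / stats["scale"]
print(x_scaled)  # [[1. 2.]]
```

The same idea does not transfer to the gradient-boosting model, whose fitted state is far richer; for that, safer serialization formats such as ONNX or skops are the usual routes.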
(ltv_scaler.pkl: 4.99 kB)