- app.py, 9.75 kB, last commit "Update app.py" (79251cc, verified)
- 1.52 kB, initial commit (file name not captured)
- 323 Bytes, initial commit (file name not captured)
model.pkl: Detected Pickle imports (25):
- "numpy.core.multiarray._reconstruct",
- "lightgbm.basic.Booster",
- "sklearn.tree._classes.DecisionTreeRegressor",
- "sklearn.ensemble._forest.RandomForestClassifier",
- "sklearn._loss.link.LogitLink",
- "sklearn.ensemble._voting.VotingClassifier",
- "numpy.ndarray",
- "sklearn._loss.link.Interval",
- "numpy.random._pickle.__bit_generator_ctor",
- "numpy.core.multiarray.scalar",
- "_loss.CyHalfBinomialLoss",
- "lightgbm.sklearn.LGBMClassifier",
- "_loss.__pyx_unpickle_CyHalfBinomialLoss",
- "sklearn._loss.loss.HalfBinomialLoss",
- "sklearn.dummy.DummyClassifier",
- "numpy.random._pickle.__randomstate_ctor",
- "sklearn.tree._classes.DecisionTreeClassifier",
- "sklearn.preprocessing._label.LabelEncoder",
- "sklearn.tree._tree.Tree",
- "numpy.core.numeric._frombuffer",
- "numpy.dtype",
- "collections.defaultdict",
- "collections.OrderedDict",
- "sklearn.ensemble._gb.GradientBoostingClassifier",
- "sklearn.utils._bunch.Bunch"
- model.pkl, 34.1 MB, last commit "Upload 2 files"
- requirements.txt, 140 Bytes, last commit "Update requirements.txt"
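The import list above comes from the Hub's pickle scanner, which inspects the opcode stream without executing it. A rough local equivalent can be sketched with the standard library's pickletools; the helper name below is hypothetical:

```python
import pickletools

def list_pickle_imports(data: bytes) -> set[str]:
    """List the module.name globals a pickle would import, without loading it."""
    imports = set()
    recent_strings = []  # string pushes that may feed a later STACK_GLOBAL
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # Protocols <= 2: argument is "module name" in a single string.
            module, name = arg.split(" ", 1)
            imports.add(f"{module}.{name}")
        elif opcode.name == "STACK_GLOBAL" and len(recent_strings) >= 2:
            # Protocols >= 4: module and name were pushed as the two
            # most recent string opcodes.
            imports.add(".".join(recent_strings[-2:]))
        if isinstance(arg, str):
            recent_strings.append(arg)
    return imports
```

This is only a sketch: it does not resolve memo lookups (BINGET), so a module string reused via the memo can be missed; the Hub's scanner is more thorough.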
scaler.pkl: Detected Pickle imports (4):
- "sklearn.preprocessing._data.StandardScaler",
- "numpy.dtype",
- "numpy.core.numeric._frombuffer",
- "numpy.core.multiarray.scalar"
- scaler.pkl, 1.16 kB, last commit "Upload 2 files"
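If these pickles must be loaded at all, one common mitigation (a sketch under assumptions, not a definitive fix) is a restricted Unpickler whose find_class only resolves globals on an explicit allowlist built from the scans above. The ALLOWED set here is a deliberately truncated, assumed subset; a real allowlist would carry every entry reported for model.pkl and scaler.pkl:

```python
import io
import pickle

# Assumed subset of the globals the scanner reported; extend with the
# full model.pkl and scaler.pkl lists before loading those files.
ALLOWED = {
    "collections.OrderedDict",
    "numpy.dtype",
    "sklearn.preprocessing._data.StandardScaler",
}

class AllowlistUnpickler(pickle.Unpickler):
    """Unpickler that refuses any global not explicitly allowlisted."""
    def find_class(self, module, name):
        qualified = f"{module}.{name}"
        if qualified not in ALLOWED:
            raise pickle.UnpicklingError(f"blocked pickle global: {qualified}")
        return super().find_class(module, name)

def safe_loads(data: bytes):
    """Unpickle from bytes through the allowlist."""
    return AllowlistUnpickler(io.BytesIO(data)).load()

def safe_load(path: str):
    """Unpickle a file (e.g. model.pkl or scaler.pkl) through the allowlist."""
    with open(path, "rb") as f:
        return AllowlistUnpickler(f).load()
```

An allowlist only limits which callables the pickle can reach, not what their constructors do, so the longer-term fix is to stop shipping pickles entirely and re-export the model and scaler in a safer serialization format.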