Model files (commit "main stocks models") and the pickle imports detected in each:

scaler_1h.joblib (1.25 kB), Detected Pickle imports (7):
- numpy.core.multiarray.scalar
- numpy.ndarray
- None.dtype
- joblib.numpy_pickle.NumpyArrayWrapper
- numpy.dtype
- sklearn.preprocessing._data.StandardScaler
- numpy.core.multiarray._reconstruct

scaler_crypto_5m.joblib (1.18 kB), Detected Pickle imports (6):
- numpy.ndarray
- None.dtype
- joblib.numpy_pickle.NumpyArrayWrapper
- numpy.dtype
- sklearn.preprocessing._data.StandardScaler
- numpy.core.multiarray._reconstruct

scaler_daily.joblib (1.25 kB), Detected Pickle imports (7):
- numpy.core.multiarray.scalar
- numpy.ndarray
- None.dtype
- joblib.numpy_pickle.NumpyArrayWrapper
- numpy.dtype
- sklearn.preprocessing._data.StandardScaler
- numpy.core.multiarray._reconstruct

stock_model_1h.joblib (2.27 GB), Detected Pickle imports (5):
- numpy.ndarray
- sklearn.ensemble._forest.RandomForestClassifier
- joblib.numpy_pickle.NumpyArrayWrapper
- sklearn.tree._classes.DecisionTreeClassifier
- numpy.dtype

stock_model_crypto_5m.joblib (257 MB), Detected Pickle imports (5):
- numpy.ndarray
- sklearn.ensemble._forest.RandomForestClassifier
- joblib.numpy_pickle.NumpyArrayWrapper
- sklearn.tree._classes.DecisionTreeClassifier
- numpy.dtype

stock_model_daily.joblib (2.15 GB), Detected Pickle imports (5):
- numpy.ndarray
- sklearn.ensemble._forest.RandomForestClassifier
- joblib.numpy_pickle.NumpyArrayWrapper
- sklearn.tree._classes.DecisionTreeClassifier
- numpy.dtype
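The import lists above are characteristic of joblib-pickled scikit-learn objects: dumping a fitted StandardScaler writes a pickle that records references to numpy.ndarray, numpy.dtype, joblib's NumpyArrayWrapper, and StandardScaler itself, and loading the file imports and instantiates those recorded classes. That load-time code execution is why pickle files from untrusted sources get flagged. A minimal sketch of how such a file is presumably produced and consumed (the file name and training data here are illustrative, not taken from this repository):

```python
import os
import tempfile

import joblib
import numpy as np
from sklearn.preprocessing import StandardScaler

# Fit a scaler and dump it with joblib, the way the scaler_*.joblib
# artifacts were presumably created. The output is a pickle embedding
# references to StandardScaler, numpy.ndarray, numpy.dtype, etc.
scaler = StandardScaler().fit(np.array([[1.0], [2.0], [3.0]]))
path = os.path.join(tempfile.mkdtemp(), "scaler_demo.joblib")
joblib.dump(scaler, path)

# joblib.load unpickles the file, importing and constructing the
# recorded classes -- only do this with files you trust.
restored = joblib.load(path)
print(restored.mean_)  # -> [2.]
```

Loading is all-or-nothing: there is no way to inspect a pickle's payload safely from within pickle itself, which is what the scanner's per-file import report approximates.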
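The flags above concern pickle's ability to run arbitrary code at load time. One common mitigation, shown here as a sketch and not necessarily what this repository's maintainers chose, is to export only the learned parameters to a plain data format such as JSON and rebuild the estimator on load. This is practical for small artifacts like the kB-sized scalers; the multi-GB tree ensembles would instead need a dedicated safe format (for example ONNX or skops). The helper names `scaler_to_json`/`scaler_from_json` are hypothetical:

```python
import json

import numpy as np
from sklearn.preprocessing import StandardScaler

def scaler_to_json(scaler: StandardScaler) -> str:
    """Serialize only the learned parameters; no pickle involved."""
    return json.dumps({
        "mean": scaler.mean_.tolist(),
        "scale": scaler.scale_.tolist(),
    })

def scaler_from_json(payload: str) -> StandardScaler:
    """Rebuild an equivalent scaler from plain JSON."""
    params = json.loads(payload)
    scaler = StandardScaler()
    scaler.mean_ = np.asarray(params["mean"])
    scaler.scale_ = np.asarray(params["scale"])
    # transform() validates the expected feature count, so restore it too
    scaler.n_features_in_ = len(params["mean"])
    return scaler

fitted = StandardScaler().fit(np.array([[1.0], [2.0], [3.0]]))
clone = scaler_from_json(scaler_to_json(fitted))
print(clone.transform(np.array([[2.0]])))  # -> [[0.]]
```

Loading such a JSON file only parses data; no class listed in the scan reports above gets executed, so the "Detected Pickle imports" warning disappears entirely.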