parameters.pkl: Detected Pickle imports (8)
- "_codecs.encode"
- "sklearn.preprocessing._data.StandardScaler"
- "pytorch_forecasting.data.encoders.NaNLabelEncoder"
- "numpy.core.multiarray._reconstruct"
- "pytorch_forecasting.data.encoders.EncoderNormalizer"
- "numpy.core.multiarray.scalar"
- "numpy.ndarray"
- "numpy.dtype"
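Lists like the one above are produced by statically walking the pickle opcode stream and collecting every global (`module.name`) the stream asks the unpickler to resolve; no code is executed during the scan. A simplified sketch of that idea using only the standard library's `pickletools` (this is an illustration, not Hugging Face's actual scanner, and a production version would also have to follow memo `GET`/`PUT` opcodes):

```python
import collections
import pickle
import pickletools

def pickle_imports(data: bytes) -> set[str]:
    """Return the "module.name" globals a pickle stream references.

    Simplified sketch: memo (BINGET/BINPUT) opcodes are ignored here,
    so a stream that fetches a module name from the memo can slip past.
    """
    imports: set[str] = set()
    strings: list[str] = []  # recently pushed strings, for STACK_GLOBAL
    for op, arg, _pos in pickletools.genops(data):
        if op.name == "GLOBAL":
            # protocols <= 3: arg is "module name", space-separated
            module, _, name = arg.partition(" ")
            imports.add(f"{module}.{name}")
        elif op.name == "STACK_GLOBAL":
            # protocols >= 4: module and name are the two strings
            # pushed immediately before this opcode
            imports.add(f"{strings[-2]}.{strings[-1]}")
        elif op.name in ("SHORT_BINUNICODE", "BINUNICODE", "UNICODE"):
            strings.append(arg)
    return imports

payload = pickle.dumps(collections.OrderedDict())
print(sorted(pickle_imports(payload)))  # ['collections.OrderedDict']
```

The scan only reads opcodes, which is why it is safe to run on untrusted files, and why the warnings above can enumerate imports without loading the models.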
parameters_q.pkl: Detected Pickle imports (11)
- "sklearn.preprocessing._data.StandardScaler"
- "pytorch_forecasting.data.encoders.NaNLabelEncoder"
- "torch._utils._rebuild_tensor_v2"
- "numpy.core.multiarray._reconstruct"
- "pytorch_forecasting.data.encoders.EncoderNormalizer"
- "collections.OrderedDict"
- "numpy.core.multiarray.scalar"
- "numpy.ndarray"
- "numpy.core.numeric._frombuffer"
- "torch.storage._load_from_bytes"
- "numpy.dtype"
test_data.pkl: Detected Pickle imports (13)
- "pandas.core.indexes.numeric.Int64Index"
- "pandas._libs.arrays.__pyx_unpickle_NDArrayBacked"
- "numpy.dtype"
- "pandas.core.indexes.base.Index"
- "pandas.core.arrays.datetimes.DatetimeArray"
- "numpy.core.multiarray._reconstruct"
- "_codecs.encode"
- "pandas.core.indexes.base._new_Index"
- "pandas.core.frame.DataFrame"
- "pandas._libs.internals._unpickle_block"
- "pandas.core.internals.managers.BlockManager"
- "__builtin__.slice"
- "numpy.ndarray"

How to fix it?
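The preferred fix for model weights is to re-export them to a non-executable format such as safetensors, since unpickling can run arbitrary code. When a pickle file must be loaded anyway, one common mitigation is a restricted `Unpickler` whose `find_class` resolves only an explicit allowlist of globals. A stdlib-only sketch (the `ALLOWED` entries below are an illustrative subset drawn from the scanner output above, not a vetted security policy — even allowlisted callables can be abused):

```python
import io
import pickle

# Illustrative subset of the globals reported above; extend as needed,
# but only with entries you have reviewed.
ALLOWED = {
    ("collections", "OrderedDict"),
    ("numpy", "ndarray"),
    ("numpy", "dtype"),
    ("numpy.core.multiarray", "_reconstruct"),
    ("numpy.core.multiarray", "scalar"),
    ("_codecs", "encode"),
}

class AllowlistUnpickler(pickle.Unpickler):
    """Unpickler that resolves only pre-approved (module, name) pairs."""

    def find_class(self, module: str, name: str):
        if (module, name) in ALLOWED:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(
            f"blocked pickle global: {module}.{name}")

def safe_load(data: bytes):
    """Load a pickle byte string, rejecting any non-allowlisted global."""
    return AllowlistUnpickler(io.BytesIO(data)).load()
```

A pickle that references any global outside `ALLOWED` — say a smuggled `os.system` — raises `UnpicklingError` before that object is ever constructed. This reduces, but does not eliminate, the risk; converting the files to safetensors (for tensors) or a plain serialization like JSON/Parquet (for the DataFrame in test_data.pkl) removes the pickle attack surface entirely.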