Latest commit: 69698c3 (verified) - "Upload 6 files"

Model directories (latest commit message in parentheses):
- LSTMModel_cargo_horizon1_with_hour_input_batch256 (Upload 31 files)
- LSTMModel_cargo_horizon1_with_month_day_time_input_batch256_cleaned (Upload 6 files)
- LSTM_whole_atlantic_horizon1_with_time_decimal_input_batch256 (Upload 100 files)
- LSTM_whole_atlantic_horizon1_with_time_decimal_input_batch256_KD_Mid (Upload 100 files)
- LSTM_whole_atlantic_horizon1_with_time_decimal_input_batch256_KD_North (Upload 100 files)
- LSTM_whole_atlantic_horizon1_with_time_decimal_input_batch256_KD_South (Upload 100 files)

Other entries:
- 2.22 kB (Upload 100 files)
- 232 Bytes (initial commit)
- 33 kB (Update app.py)
atlantic_cargo_1h_dataset.joblib, 1.66 kB (Upload atlantic_cargo_1h_dataset.joblib)
Detected Pickle imports (6):
- "_codecs.encode"
- "numpy.core.multiarray._reconstruct"
- "numpy.ndarray"
- "sklearn.preprocessing._data.MinMaxScaler"
- "numpy.dtype"
- "joblib.numpy_pickle.NumpyArrayWrapper"

requirements.txt, 58 Bytes (Rename requirement.txt to requirements.txt)
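The "Detected Pickle imports" notice is the Hub's pickle scanner: joblib serializes objects through pickle, and the six imports listed are exactly what a fitted scikit-learn MinMaxScaler saved with joblib pulls in. A minimal sketch of how such a file is typically produced and reloaded; the training values and the output file name here are invented placeholders:

```python
# Sketch: produce and reload a joblib-serialized MinMaxScaler
# (placeholder data and file name, not the repo's actual training set).
import joblib
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Fit a MinMaxScaler on some placeholder training features.
train = np.array([[0.0, 10.0], [5.0, 20.0], [10.0, 30.0]])
scaler = MinMaxScaler()
scaler.fit(train)

# joblib.dump writes a pickle stream, which is why the scanner reports
# imports like MinMaxScaler, numpy.ndarray, and NumpyArrayWrapper.
joblib.dump(scaler, "scaler_demo.joblib")

# Loading executes pickle, so only load files from sources you trust.
restored = joblib.load("scaler_demo.joblib")
scaled = restored.transform(np.array([[5.0, 20.0]]))
print(scaled)  # midpoint of both training ranges -> [[0.5 0.5]]
```

The warning is informational: it flags that loading the file runs arbitrary pickle code, not that the file is malformed.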
The remaining scaler files are flagged by the same scanner, each with the same six pickle imports as atlantic_cargo_1h_dataset.joblib above:
- scaler_features_cargo_cleaned.joblib, 1.29 kB (Upload scaler_features_cargo_cleaned.joblib)
- scaler_train_Mid.joblib, 1.75 kB (Upload 100 files)
- scaler_train_Mid_up.joblib, 1.75 kB (Upload 3 files)
- scaler_train_North.joblib, 1.75 kB (Upload 100 files)
- scaler_train_North_up.joblib, 1.75 kB (Upload 3 files)
- scaler_train_South.joblib, 1.75 kB (Upload 100 files)
- scaler_train_South_up.joblib, 1.75 kB (Upload 3 files)
- scaler_train_wholedata.joblib, 1.75 kB (Upload 100 files)
- scaler_train_wholedata_up.joblib, 1.75 kB (Upload scaler_train_wholedata_up.joblib)

Large files:
- 96.7 MB (Upload 100 files)
- 130 MB (Upload 100 files)
- 647 MB (Upload 100 files)
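The pickle warnings above can be avoided entirely by storing only the scaler's learned parameters in a plain format such as JSON and rebuilding the MinMaxScaler on load. A sketch under the assumption that scikit-learn's documented fitted attributes (min_, scale_, data_min_, data_max_, data_range_, n_features_in_) are sufficient for transform; file name and data are placeholders:

```python
# Sketch: persist a fitted MinMaxScaler without pickle by saving its
# learned parameters as JSON and restoring them onto a fresh instance.
import json
import numpy as np
from sklearn.preprocessing import MinMaxScaler

def scaler_to_json(scaler, path):
    """Write a fitted MinMaxScaler's learned parameters as plain JSON."""
    params = {
        "feature_range": list(scaler.feature_range),
        "min_": scaler.min_.tolist(),
        "scale_": scaler.scale_.tolist(),
        "data_min_": scaler.data_min_.tolist(),
        "data_max_": scaler.data_max_.tolist(),
        "data_range_": scaler.data_range_.tolist(),
        "n_features_in_": int(scaler.n_features_in_),
    }
    with open(path, "w") as f:
        json.dump(params, f)

def scaler_from_json(path):
    """Rebuild a usable MinMaxScaler from JSON, never touching pickle."""
    with open(path) as f:
        p = json.load(f)
    scaler = MinMaxScaler(feature_range=tuple(p["feature_range"]))
    for name in ("min_", "scale_", "data_min_", "data_max_", "data_range_"):
        setattr(scaler, name, np.asarray(p[name]))
    scaler.n_features_in_ = p["n_features_in_"]
    return scaler

# Round-trip demo with placeholder training data.
train = np.array([[0.0, 10.0], [5.0, 20.0], [10.0, 30.0]])
scaler_to_json(MinMaxScaler().fit(train), "scaler_params.json")
restored = scaler_from_json("scaler_params.json")
print(restored.transform([[5.0, 20.0]]))  # midpoint of both ranges -> [[0.5 0.5]]
```

A JSON file of scaler parameters loads on any machine without running arbitrary code, which is what the scanner's warning is about; the trade-off is that the restore helper must be kept in sync with the attributes transform actually needs.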